
ACKNOWLEDGEMENT

First of all, I thank God for all His blessings, which helped me complete this project successfully.

I convey my thanks to Commander K. Velu, Chairman, Sri Venkateswara College of Computer Applications and Management, for providing all the facilities needed to pursue my course in this college.

I express my gratitude to Dr. A. Ashraf Ali, M.A., M.Phil., Ph.D., B.Ed., Director, Sri Venkateswara College of Computer Applications and Management, for permitting us to do this project.

I wish to thank Mrs. N. Subha, M.C.A., Head of the Department and Assistant Professor, Department of Computer Science, for her encouragement, help and valuable suggestions on my project.

I am also very grateful to my project guide, Mrs. N. Subha, M.C.A., Head of the Department and Assistant Professor, Department of Computer Science, who encouraged and guided me to complete this project in a better manner.
I thank all the other staff members of the Department of Computer Science, Sri Venkateswara College of Computer Applications and Management, for being cooperative and friendly throughout the course.
CONTENTS

SYNOPSIS

1. INTRODUCTION
1.1 OVERVIEW OF THE PROJECT

2. SYSTEM CONFIGURATION
2.1 HARDWARE CONFIGURATION

2.2 SOFTWARE CONFIGURATION

2.3 SOFTWARE FEATURES

3. SYSTEM STUDY

3.1 EXISTING SYSTEM

3.2 NEED FOR PROPOSED SYSTEM

3.3 PROPOSED SYSTEM

3.4 ADVANTAGES OF PROPOSED SYSTEM

4. SYSTEM DESIGN

4.1 DATA FLOW DIAGRAM

4.2 E-R DIAGRAM

4.3 DATABASE DESIGN

4.4 INPUT DESIGN

4.5 OUTPUT DESIGN

5. SYSTEM IMPLEMENTATION AND TESTING

5.1 SYSTEM IMPLEMENTATION

5.2 SYSTEM TESTING


6. CONCLUSION

7. SCOPE OF FUTURE ENHANCEMENT

8. BIBLIOGRAPHY

APPENDICES

i. TABLES
ii. FORM
iii. SAMPLE CODE
SYNOPSIS

The project entitled “Cargo Logistics” makes the manual process easier by computerizing the system for booking and for import and export.

Cargo Logistics receives bookings from its dealers by phone or in person, and delivers import and export goods to their destinations based on demand and on the previous delivery date. This process is now computerized: the dealer’s name, source, destination and goods details are stored in a database. Billing a dealer becomes simpler and easier, since a dealer’s container booking can be accepted only after a certain period has elapsed since the previous delivery; that period can be calculated and billed easily through this system.
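The booking-eligibility rule described above can be sketched as follows. This is an illustrative Python sketch, not the project’s actual VB.NET code, and the seven-day minimum gap is an assumed value.

```python
from datetime import date, timedelta

# Minimum days a dealer must wait after a delivery before a new
# container booking can be accepted (assumed value for illustration).
MIN_GAP_DAYS = 7

def can_accept_booking(previous_delivery: date, booking_date: date,
                       min_gap_days: int = MIN_GAP_DAYS) -> bool:
    """Return True if enough time has passed since the previous delivery."""
    return booking_date - previous_delivery >= timedelta(days=min_gap_days)
```

For example, a dealer whose last delivery was on 1 January may book again on 10 January, but not on 5 January.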
INTRODUCTION
OVERVIEW OF THE PROJECT:

“CARGO SYSTEMS” consists of five subsystems:

• Container Parks

• Booking

• Transport

• Imports and Exports

• Garbage Management

CONTAINER PARKS:

Container parks take care of all activities performed in the cargo system, including the container details and the empty containers available for the booking process.

This system stores the details of each container, such as container number, type of container, container name, dealer number, dealer name, dealer contact number, dealer address (street, city, pin code), customer number, customer contact number and date. The container number and container name are unique, and the container number is generated automatically.

The container details record stores each container and its type; the container number is unique. The date is calculated automatically when the information is stored in the database.

Empty containers are verified on return. If an empty container is not returned within the allowed number of days, or is returned damaged, the customer must pay a penalty. The date is also calculated automatically for empty containers.
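The penalty rule for empty containers can be sketched as below. The return limit and the penalty amounts are assumed values for illustration; the project would read them from its database.

```python
from datetime import date

RETURN_LIMIT_DAYS = 10       # allowed days to return an empty container (assumed)
LATE_PENALTY_PER_DAY = 50.0  # penalty per day overdue (assumed)
DAMAGE_PENALTY = 500.0       # flat penalty for a damaged container (assumed)

def empty_container_penalty(taken: date, returned: date, damaged: bool) -> float:
    """Penalty owed by the customer: late-return charge plus damage charge."""
    days_over = (returned - taken).days - RETURN_LIMIT_DAYS
    penalty = max(days_over, 0) * LATE_PENALTY_PER_DAY
    if damaged:
        penalty += DAMAGE_PENALTY
    return penalty
```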

These records are maintained in a separate database, and reports are generated on a given date for higher officials.

BOOKING:

The booking process covers data entry for booking containers and delivering them. It also maintains the details of reservations and cancellations on a given date.

This system maintains the details of booking forms, reservations and cancellations, such as container number, number of containers, driver address, driver contact number, driver name, goods name, email, initial amount, date and booking number. The booking number is unique and generated automatically.

Reservation forms are used to enter the details of containers for booking; a container is booked by its container number only. Reservations are used for importing and exporting goods.

Cancellation is used to cancel a container booking within a particular period. If the customer does not cancel within that period, the initial amount is not refunded; if the customer cancels in time, half the amount is refunded.
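The refund rule can be expressed compactly. This is an illustrative Python sketch of the rule stated above; the five-day cancellation deadline is an assumed value.

```python
from datetime import date

def refund_amount(initial_amount: float, booking_date: date,
                  cancel_date: date, deadline_days: int = 5) -> float:
    """Half the initial amount is refunded for a cancellation made within
    the allowed period; nothing is refunded after it."""
    days_elapsed = (cancel_date - booking_date).days
    if days_elapsed <= deadline_days:
        return initial_amount / 2
    return 0.0
```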
The information of booking is stored in the particular database
and maintained by the authorized person.

TRANSPORT:

Transport covers the driver details and the transport records for moving import and export goods by ship.

This system maintains the transport management details: driver number, container number, vehicle number, type of vehicle, driver name, driver address, driver contact number, goods name, source, destination, starting and ending kilometers, amount, weight and customer service charge.

Driver details support the transport management system; the driver number is unique. If any malpractice occurs during import or export, the driver details make it easy to identify the person responsible.

The transport record maintains the transport charges. The distance between source and destination is calculated, and the transport amount is computed from the weight and stored in the database.
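A charge of this kind, combining distance (from the starting and ending kilometer readings) and weight, might look like the sketch below. All three rates are assumed values for illustration only.

```python
RATE_PER_KM = 12.0          # assumed charge per kilometre travelled
RATE_PER_TONNE = 200.0      # assumed charge per tonne of goods
SERVICE_CHARGE = 150.0      # assumed flat customer service charge

def transport_amount(start_km: float, end_km: float, weight_tonnes: float) -> float:
    """Transport charge from odometer readings and goods weight."""
    distance = end_km - start_km  # distance between source and destination
    return distance * RATE_PER_KM + weight_tonnes * RATE_PER_TONNE + SERVICE_CHARGE
```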

IMPORT AND EXPORT:

Import and Export covers the invoice, cancellation and clearance involved in moving goods from one place to another.

This system supports importing and exporting goods. It includes the invoice, exporter and importer names, source, destination, ship name, export and import contact numbers, container number, goods name, quantity, gross pay, buyer, security bond number, insurance, ship number and goods clearance.

The invoice maintains the export and import details, including the buyer's details, and stores them in the database; the ship name is unique within the process.

Cancellation is used to cancel goods booked for import or export. In a cancellation the ship number is unique, the booking number acts as a reference number, and the date is generated automatically.

Goods clearance verifies the goods; verification is done by a customs officer. For clearance, the customer presents the security bond number and the insurance number. If any malpractice is found, the goods are not cleared and the customer must pay a penalty; otherwise a clearance certificate is issued.
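The clearance decision described above can be sketched as a simple check. This is an illustrative Python sketch; the penalty amount is assumed, and the real system would record the result in the database.

```python
def goods_clearance(security_bond_no: str, insurance_no: str,
                    malpractice: bool) -> tuple[bool, float]:
    """Return (cleared, penalty). Goods clear only when both documents are
    presented and no malpractice is found; malpractice draws a penalty."""
    MALPRACTICE_PENALTY = 1000.0  # assumed penalty amount
    if malpractice:
        return False, MALPRACTICE_PENALTY
    if not security_bond_no or not insurance_no:
        return False, 0.0        # missing documents: no clearance, no penalty
    return True, 0.0
```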

GARBAGE MANAGEMENT:

Garbage management handles clearing garbage, i.e. waste, from ships.

This system maintains the garbage details: customer name, customer contact number, ship name, ship number, license number, type of garbage, weight and amount. In garbage management the ship name and ship number are unique. The amount is calculated from the weight of the garbage and varies with the type of garbage.

The bill entry form stores the garbage data; the date is calculated automatically and the data are stored in the database.
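The weight-and-type billing rule can be sketched as follows. The garbage types and per-kilogram rates are assumed values for illustration; the actual rates would come from the project's database.

```python
# Assumed per-kilogram rates for each garbage type (illustrative only).
RATES_PER_KG = {"food waste": 2.0, "oily waste": 5.0, "plastic": 3.5}

def garbage_amount(garbage_type: str, weight_kg: float) -> float:
    """Bill amount: rate for the garbage type multiplied by its weight."""
    return RATES_PER_KG[garbage_type] * weight_kg
```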
SYSTEM CONFIGURATION

HARDWARE CONFIGURATION

The hardware used for the development of the project is:

PROCESSOR : PENTIUM III 866 MHz

RAM : 128 MB SDRAM

MONITOR : 15” COLOR

HARD DISK : 20 GB

FLOPPY DRIVE : 1.44 MB

CD DRIVE : LG 52X

KEYBOARD : STANDARD 102 KEYS

MOUSE : 3 BUTTONS
SOFTWARE CONFIGURATION

The software used for the development of the project is:

OPERATING SYSTEM : Windows XP Professional

ENVIRONMENT : Visual Studio .NET 2005

.NET FRAMEWORK : Version 2.0

LANGUAGE : Visual Basic.NET

BACKEND : SQL Server
SOFTWARE SPECIFICATION

FEATURES OF VISUAL BASIC .NET

Visual Basic .NET, the latest version of Visual Basic, includes many new features. Earlier versions of Visual Basic supported interfaces but not implementation inheritance.

Visual Basic .NET supports implementation inheritance, interfaces and overloading. In addition, Visual Basic .NET supports multithreading.

COMMON LANGUAGE SPECIFICATION (CLS):

Visual Basic .NET is also compliant with the CLS (Common Language Specification) and supports structured exception handling. The CLS is a set of rules and constructs supported by the CLR (Common Language Runtime). The CLR is the runtime environment provided by the .NET Framework; it manages the execution of code and makes development easier by providing services.

Because Visual Basic .NET is a CLS-compliant language, any objects, classes or components created in it can be used in any other CLS-compliant language, and vice versa. The CLS thus ensures complete interoperability among applications, regardless of the languages used to create them.

IMPLEMENTATION INHERITANCE:

Visual Basic .NET supports implementation inheritance. While creating applications we can derive from another class, known as the base class; the derived class inherits all the methods and properties of the base class. In the derived class we can either use the existing code of the base class or override it. With the help of implementation inheritance, code can therefore be reused.
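The base-class/derived-class relationship described above is language-general; the sketch below illustrates it in Python for compactness (the project's own code is in VB.NET), using a hypothetical container class.

```python
class Container:
    """Base class: behaviour shared by all container types."""
    def __init__(self, number: str):
        self.number = number

    def describe(self) -> str:
        return f"Container {self.number}"

class RefrigeratedContainer(Container):
    """Derived class: inherits the base behaviour and overrides describe()."""
    def describe(self) -> str:
        # Reuse the base-class code, then extend it.
        return super().describe() + " (refrigerated)"
```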

CONSTRUCTORS AND DESTRUCTORS:

Constructors are used to initialize objects, whereas destructors are used to destroy them, that is, to release the resources allocated to the object. In Visual Basic .NET the Sub Finalize procedure is available. It completes the tasks that must be performed when an object is destroyed, is called automatically at that point, and can be called only from the class it belongs to or from derived classes.

GARBAGE COLLECTION:
Garbage collection is another new feature in Visual Basic .NET. The .NET Framework monitors allocated resources, such as objects and variables, and automatically releases memory for reuse by destroying objects that are no longer in use. The garbage collector checks for objects not currently in use by applications and, when it comes across an object marked for garbage collection, releases the memory occupied by that object.

OVERLOADING:

Overloading is another feature of Visual Basic .NET. It enables us to define multiple procedures with the same name, where each procedure has a different set of arguments. Besides procedures, overloading can be used for constructors and properties in a class.

MULTITHREADING:

Visual Basic .NET also supports multithreading. An application that supports multithreading can handle multiple tasks simultaneously. We can use multithreading to decrease the time an application takes to respond to user interaction; to do so, we must ensure that a separate thread in the application handles user interaction.
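The pattern of moving a long-running job off the interaction thread can be sketched as follows (shown in Python's `threading` module for compactness; the hypothetical `generate_report` job stands in for any slow task).

```python
import threading

results = []

def generate_report():
    # Long-running job moved off the main thread, so the thread that
    # handles user interaction stays free to respond.
    results.append(sum(range(1000)))

worker = threading.Thread(target=generate_report)
worker.start()
# ... the main thread keeps handling user input here ...
worker.join()  # wait for the background job to finish
```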
STRUCTURED EXCEPTION HANDLING:

Visual Basic .NET supports structured exception handling, which enables us to detect and handle errors at runtime. In Visual Basic .NET we use Try…Catch…Finally statements to create exception handlers. Using these statements, we can create robust and effective exception handlers that improve the reliability of our application.
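The Try…Catch…Finally structure has a direct counterpart in most languages; the sketch below mirrors it with Python's try/except/finally, catching a runtime error instead of letting it crash the application.

```python
def safe_divide(a: float, b: float) -> float:
    """Structured handler: the runtime error is caught, not fatal."""
    try:
        result = a / b
    except ZeroDivisionError:   # the Catch block
        result = 0.0            # recover with a sensible default
    finally:                    # the Finally block always runs
        pass                    # cleanup (close files, connections, ...) goes here
    return result
```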

THE .NET FRAMEWORK

The .NET Framework is a new computing platform that


simplifies application development in the highly distributed
environment of the Internet.

OBJECTIVES OF THE .NET FRAMEWORK:

1. To provide a consistent object-oriented programming environment, whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.

2. To provide a code-execution environment that minimizes software deployment conflicts and guarantees safe execution of code.

3. To eliminate performance problems. The Framework supports different types of application, such as Windows-based applications and Web-based applications.

4. To base communication in a distributed environment on standards, ensuring that code built on the .NET Framework can integrate with any other code.
VISUAL STUDIO .NET

Visual Studio .NET is a complete set of development tools for building ASP Web applications, XML Web services, desktop applications and mobile applications. In addition to building high-performing desktop applications, you can use Visual Studio's powerful component-based development tools and other technologies to simplify team-based design, development and deployment of enterprise solutions.

Visual Basic .NET, Visual C++ .NET and Visual C# .NET all use the same integrated development environment (IDE), which allows them to share tools and facilitates the creation of mixed-language solutions. In addition, these languages leverage the functionality of the .NET Framework and simplify the development of ASP Web applications and XML Web services.

Visual Studio supports the .NET Framework, which provides a common language runtime and unified programming classes; ASP.NET uses these components to create ASP Web applications and XML Web services. Visual Studio also includes the MSDN Library, which contains all the documentation for these development tools.

FEATURES OF SQL DATABASE

INTRODUCTION TO SQL Server

SQL Server is a comprehensive operating environment that packs the power of a mainframe relational database management system into the user's microcomputer. It provides a set of functional programs that users can use as tools to build structures and perform tasks. Because applications developed on SQL Server are completely portable to other versions, a programmer can create a complex application in a single-user environment and then move it to a multi-user platform. Users do not have to be experts to appreciate SQL Server, but the better a user understands the program, the more productively and creatively he can use the tools it provides.

Relational Database Management System

 SQL Server, the right tool
 SQL Server gives you high capacity
 Database management tools
 Structure of SQL Server database

An SQL Server database can be described at two different levels:

 Physical Structure
 Logical Structure
Physical Structure:

a) One or more data files
b) Two or more log files
c) One control file

Logical Structure

a) Table spaces
b) Segments
c) Extents
d) Data Blocks

The data files contain all user data in the form of tables, indexes and views. The log files contain the information needed to recover the database, or to undo changes after a transaction is rolled back.

The control file records the physical data and media information needed to open and manage the data files. If the control file is damaged, the server will not be able to open or use the database, even if the database itself is undamaged.

DATABASE

The conventional data processing approach is to develop a program (or many programs) for each application. This results in one or more data files for each application. Some of the data may be common between files; however, one application may require a file to be organized on a particular field, while another application may require it to be organized on a different field. A major drawback of the conventional method is that the storage and access methods are built into the program. Therefore, though the same data may be required by two applications, the data will have to be stored in two different places, because each application depends on the way the data is stored.
There are various drawbacks of conventional data file processing environment.
Some of them are listed below:
Data Redundancy:
Some data elements, such as name, address and identification code, are used in various applications. Since the data is required by multiple applications, it is stored in multiple data files; in most cases there is repetition of data. This is referred to as data redundancy, and it leads to various other problems.

Data Integrity Problems:


Data redundancy is one reason for the problem of data integrity. Since the
same data is stored in different places, it is inevitable that some inconsistency will
creep in.

Data Availability Constraints:


When data is scattered in different files, the availability of information from a
combination of files is constrained to some extent.

Database Management System


A database management system (DBMS) consists of a collection of
interrelated data and a set of programs to access the data. The collection of data is
usually referred to as the database. A Database system is designed to maintain
large volumes of data. Management of data involves:

 Defining the structures for the storage of data
 Providing the mechanisms for the manipulation of the data
 Providing for the security of the data against unauthorized access

Users of the DBMS:


Broadly, there are three types of DBMS users:

 The application programmer
 The end user
 The database administrator (DBA)
The application programmer writes application programs that use the database.
These programs operate on the data in the database. These operations include
retrieving information, inserting data, deleting or changing data.

The end user interacts with the system either by invoking an application
program or by writing their queries in a database query language. The database
query language allows the end user to perform all the basic operations (retrieval,
deletion, insertion and updating) on the data.

The DBA has to coordinate the functions of collecting information about the
data to be stored, designing and maintaining the database and its security. The
database must be designed and maintained to provide the right information at the
right time to authorized people. These responsibilities belong to the DBA and his
staff.

ADVANTAGES OF A DBMS
The major advantage that the database approach has over the conventional
approach is that a database system provides centralized control of data. Most
benefits accrue from this notion of centralized control.

REDUNDANCY CAN BE CONTROLLED


Unlike the conventional approach, each application does not have to maintain
its own data files. Centralized control of data by the DBA avoids unnecessary
duplication of data and effectively reduces the total amount of data storage
required. It also eliminates the extra processing necessary to trace the required
data in a large mass of data present. Any redundancies that exist in the DBMS are
controlled and the system ensures that these multiple copies are consistent.

INCONSISTENCY CAN BE AVOIDED


Since redundancy is reduced, inconsistency can also be avoided to some extent. The DBMS can guarantee that the database is never inconsistent by ensuring that a change made to any entry automatically applies to the other entries as well. This process is known as propagating updates.

THE DATA CAN BE SHARED


A database allows the sharing of data under its control by any number of application programs or users. Sharing of data does not merely mean that existing applications can share the data in the database; it also means that new applications can be developed to operate using the same database.

STANDARDS CAN BE ENFORCED

Since there is centralized control of data, the database administrator can ensure that
standards are maintained in the representation of the stored data formats. This is particularly
useful for data interchange, or migration of data between two systems.
SECURITY RESTRICTIONS CAN BE APPLIED

The DBMS guarantees that only authorized persons can access the database.
The DBA defines the security checks to be carried out. Different checks can be
applied to different operations on the same data. For instance, a person may have
the access rights to query on a file, but may not have the right to delete or update
that file. The DBMS allows such security checks to be established for each piece of
data in the database.

INTEGRITY CAN BE MAINTAINED

Centralized control can also ensure that adequate checks are incorporated in the DBMS to provide data integrity. Data integrity means that the data contained in the database is both accurate and consistent. Inconsistency between two entries can lead to integrity problems; however, even if there is no redundancy, the data can still be inconsistent. For example, a student may have enrolled in 10 courses in a semester when the maximum one can enroll in is 7, or a student may enroll in a course that is not being offered that semester. Such problems can be avoided in a DBMS by establishing integrity checks that are carried out whenever an update operation is done. These checks can be specified at the database level, besides in the application programs.

DATA INDEPENDENCE

In non-database systems, the requirements of the application dictate the way in which the data is stored and the access techniques. Besides, knowledge of the organization of the data and of the access techniques is built into the logic and code of the application. Such systems are data dependent. Consider this example: suppose the university has an application that processes the student file. For performance reasons, the file is indexed on the roll number. The application would be aware of the existing index, and its internal structure would be built around this knowledge. Now suppose that, for some reason, the file is to be indexed on the registration date instead. It would then be impossible to change the structure of the stored data without affecting the application too. Such an application is data dependent.

It is desirable to have data-independent applications. Suppose two applications X and Y need to access the same file, but each requires a particular field to be stored in a different format: application X requires the field “customer balance” in decimal format, while application Y requires it in binary format. This would pose a problem in an old system. In a DBMS, differences may exist between the way data is actually stored and the way it is seen and used by a given application.

FEATURES OF RDBMS:

 The ability to create multiple relations and enter data into them
 An interactive query language
 Retrieval of information stored in more than one table

NORMALIZATION
Normalization is a process of simplifying the relationship between data
elements in a record. It is the transformation of complex data stores to a set of
smaller, stable data structures.

Normalized data structures are simpler, more stable and are easier to
maintain. Normalization can therefore be defined as a process of simplifying the
relationship between data elements in a record.

PURPOSE FOR NORMALIZATION:


Normalization is carried out for the following four reasons:

 To structure the data so that there is no repetition of data; this helps in saving space.
 To permit simple retrieval of data in response to query and report requests.
 To simplify the maintenance of the data through updates, insertions and
deletions.
 To reduce the need to restructure or reorganize data when new application
requirements arise.

STEPS OF NORMALIZATION:
Systems analysts should be familiar with the steps in normalization, since the process can improve the quality of design for an application. Starting with a data store developed for a data dictionary, the analyst normalizes a data structure in three steps, each involving an important procedure to simplify the data structure.

Normalization consists of three basic steps:

1. First Normal Form, which decomposes all data groups into two-dimensional records.
2. Second Normal Form, which eliminates any relationships in which data elements do not fully depend on the primary key of the record.
3. Third Normal Form, which eliminates any relationships that contain transitive dependencies.
EXISTING SYSTEM:

In earlier days, all the entries and calculations of the cargo system were done manually: the booking, import and export, container parks, transport and garbage management forms were calculated and the data stored in file systems. In this process, any mistake leads to confusion for the concerned office.

Drawbacks:

• Manual work

• Security of information is difficult

• Calculation is difficult

• Time consuming

• Requires more storage space

• Errors occur frequently

• Needs a lot of paper for manual calculations


NEED FOR PROPOSED SYSTEM:

For the higher officials it is necessary that the reports arrive on time from the corresponding office. The existing system is time consuming, and the reports are often found to contain errors.

When all the information is on paper, retrieval takes longer. To make retrieval of data easy, computerization becomes essential, because manual work is tedious and time consuming, needs a lot of paper, and may go wrong.

Accuracy, up-to-date information and quick retrieval are not possible in a manual system. Computerization increases efficiency and gives proper control over the administration. These are the factors that led to the computerization of the cargo system.
PROPOSED SYSTEM:

The proposed system is designed to eliminate all the disadvantages of the existing one. It is designed keeping in mind all the drawbacks of the present system, in order to provide a permanent solution to the existing problems.

This system provides various benefits for the cargo system by computerizing the activities of container parks, booking, transport, imports and exports, and garbage management.

This computerization greatly reduces the manpower and the time required to go through all the registers. It allows safe storage and immediate retrieval of information. An access hierarchy with respective rights applies throughout the system.
ADVANTAGE OF PROPOSED SYSTEM:

• Accuracy

• Time efficiency

• Space efficiency

• Security of information

• Calculations are performed automatically

• Quick retrieval of information

LINEAR SEQUENTIAL MODEL

The linear sequential model was used for the development of this project, as it is the easiest to implement for a small project of short duration. The linear sequential model, also known as the classic life cycle or waterfall model, suggests a systematic, sequential approach. It encompasses the following activities, as shown in the figure below.

Analysis → Design → Coding → Test

Linear Sequential Model

The analysis phase is concerned with how the software is analyzed for its successful development. It includes finding out which operating system, database and front-end software will best suit the software to work according to our requirements.

The design of the software deals with both input and output. The input screens should be designed to be user friendly, and the output screens should present the desired output in an attractive manner.

The development of the software mainly deals with coding. This is an important phase, as it decides how the developed software will work. The better the coding, the better the software will perform. It should not contain too many repetitions, or the code may become lengthy and complicated.

The most important phase in software development is testing. In this phase the programmer checks for any possible errors in the software after development; the fewer the errors, the better the software. The robustness of the software is judged by giving it all types of data and checking that it works correctly.

REQUIREMENT ANALYSIS
The requirement analysis phase is the key phase of every project; on its successful completion, one is set on a path along which the project runs smoothly, so care should be taken while doing the requirement analysis. Normally, for any project, the existing system and the proposed system are analyzed under various aspects such as environmental, economical, technical, end user, and the duration of the project.

ENVIRONMENTAL

This is concerned with the type of environment in which the project is to work, such as the operating system. This project was developed on the Windows platform, which is more user-friendly than other operating systems.

ECONOMICAL

Before development of the project, the financial aspects should first be discussed. Once the financial aspect is settled, development of the project can be carried out. This project was very economical in all aspects: it was designed to help the cargo company maintain data effectively at a low development cost.

TECHNICAL

This project was done in Microsoft Visual Studio, which is good software for the development of application projects; it also makes designing the forms easier during the development phase.

END USER

This software can be used by any user, even one without computer knowledge, as it is very user friendly.
DURATION OF PROJECT
SYSTEM DESIGN

System design is a solution, a “how to” approach to the creation of a new system. It provides the understanding and procedural details necessary for implementing the system recommended in the feasibility study. A design goes through logical and physical stages of development. Design is a creative process that involves working with an unknown new system, rather than analyzing an existing one; thus, in analysis it is possible to produce a correct model of the existing system.

DATA FLOW DIAGRAM

A Data Flow Diagram (DFD) is a modeling tool that allows picturing a system as a network of functional processes connected to one another by pipelines of data. DFDs are also widely used for representing external and top-level design specifications, and they show the interface between the system and external terminators. A Data Flow Diagram is also called a “bubble chart”: a bubble represents a process, a line represents a data flow, and a rectangle represents an entity.

Level 0:

[DFD: container information and garbage and transport data flow into the Cargo Management System, which produces goods for delivery.]
Level 1:

[DFD: container and product information feed the booking process; container capacity, area and volume are checked; goods are placed in containers and delivered to transport; in the final stage the goods are checked on delivery and garbage is handled.]
ENTITY RELATIONSHIP DIAGRAM

An Entity Relationship Diagram (E-R Diagram) is a model that describes the storage layout of a system at a high level of abstraction. An E-R Diagram makes it possible to examine and highlight the data structures and the relationships between the data stores in the DFDs; based on the information it provides, the database records can be accessed efficiently.

An E-R Diagram is given for each module:

Container Parks

Booking

Transport

Import and Export

Garbage Management
CONTAINER PARKS

[E-R diagram: the Container Details entity (container no, type of container, size of container, available container, date) is checked against the Empty Container entity (container no, dealer no, dealer name, available container, no of containers, taken date, return date, delay amount).]

BOOKING

[E-R diagram: the Reservation entity (booking no, container no, dealer no, dealer name, goods name, date, amount, no of containers, initial amount) is checked against the Cancellation entity (booking no, container no, dealer no, goods name, date, taken date, return date, refund amount).]
TRANSPORT

[E-R diagram: the Transport Details entity (vehicle no, type of
vehicle, driver id, goods name, container no) is linked by a
Checks relationship to the Record entity (container no, driver
id, driver name, goods name, starting kms, ending kms, weight
before, weight after, process payment).]
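The Record entity captures starting and ending kilometre readings and the weight before and after loading, from which the payment is processed. The report gives no tariff, so the rates in this sketch are hypothetical assumptions:

```python
# Hypothetical tariff: the report records kms and weights but gives no
# rates; both constants below are assumptions for illustration only.
RATE_PER_KM = 15.0      # assumed charge per kilometre travelled
RATE_PER_TONNE = 200.0  # assumed charge per tonne of goods carried

def transport_payment(start_km: float, end_km: float,
                      weight_before: float, weight_after: float) -> float:
    """Compute a payment from distance travelled and net goods weight."""
    distance = end_km - start_km
    load = weight_after - weight_before  # net weight loaded into the vehicle
    return distance * RATE_PER_KM + load * RATE_PER_TONNE

print(transport_payment(100, 250, 2.0, 10.0))  # 150*15 + 8*200 = 3850.0
```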
IMPORT AND EXPORT

[E-R diagram: the Export and Import Goods entity (export goods
name, source, destination, no of containers, export name, export
contact no, import name, import contact no, date of sending, date
of arrival, ship no, ship name) is linked by a Performs
relationship to the Insurance entity (name, ship no, ship name)
and to the Custom Clearance entity (name, penalty, sending date,
receiving date).]
GARBAGE MANAGEMENT

[E-R diagram: the Garbage Management entity with attributes
dealer name, custom officer name, type of garbage, ship no, ship
name, date of taken, weight of garbage, weight and amount.]
DATABASE DESIGN

A database is a collection of inter-related data stored with
minimum redundancy to serve the user quickly and efficiently. The
data are stored in tables. Data provide the basis of an
information system: without data there is no system, but the data
must be provided in the right form for input, and the information
produced must be in a format acceptable to the user. The tables
used are the USER TABLE, FEED TABLE and WEBPAGE TABLE.

Table Name: Web page Table.

Table Description: To maintain the Web page Details.

Primary Key: Webpage_id.

Field Name      Data Type   Size      Description
User_id         Varchar     15        User id
Webpage_id      Varchar     15        Web page id
Webpage_url     Varchar     maximum   Web page url
Webpage_data    Varchar     20        Web page data
Web_page        VarBinary   maximum
Web_des         Varchar     20        Web page description

Table Name: Feed Table.

Table Description: To maintain the Feed Details.

Primary Key: Feedname_id.

Field Name      Data Type   Size   Description
User_id         Varchar     15     User id
Feed_name       Varchar     50     Feed name
Feed_url        Varchar     50     Feed Url
Feed_title      Varchar     50     Feed Title
Feed_author     Varchar     50     Feed Author
Feed_data       XML                Feed data
Feed_keyword    Varchar     50     Feed Keyword
FUNDAMENTAL DESIGN CONCEPTS

System design sits at the technical kernel of software
engineering and applied science, regardless of the software
process model that is used. It begins once the software
requirements have been analyzed and specified, and includes the
tests required for building and verifying the software. Each
activity transforms information in a manner that ultimately
results in validated computer software.

Three characteristics serve as a guide for the evaluation of a
good design:

• The design must implement all of the explicit requirements
contained in the analysis model, and it must accommodate all
of the implicit requirements desired by the customer.
• The design must be a readable, understandable guide for
those who generate code and for those who test and
subsequently support the software.
• The design should provide a complete picture of the
software, addressing the data, functional and behavioral
domains from the implementation perspective.

System design is thus the process of planning a new system to
replace or complement an existing one. The design is based on the
limitations of the existing system and the requirements
specification gathered in the system analysis phase.

INPUT DESIGN

Input design is the process of converting the user-oriented
description of the computer-based business information into a
program-oriented specification. The goal of designing input data
is to make automation as easy and as free from errors as
possible.

Input design requirements such as user-friendliness, a consistent
format, and an interactive dialogue that gives the right message
and help to the user at the right time are also considered in the
development of the project.

The input design should make data entry easier. The objective of
input design is to create an input layout that is easy to follow
and that avoids operator errors. Effective input design minimizes
the errors made by data entry operators.
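The entry restrictions described above (and used again in the implementation chapter) can be sketched as simple field validators. The dd/mm/yyyy date format is an assumption for illustration; the report does not specify one:

```python
from datetime import datetime

# Sketch of input-design restrictions: number formatting and date
# formatting checks applied before data reaches the database.
def valid_number(text: str) -> bool:
    """Accept only non-negative integers, e.g. a container count."""
    return text.isdigit()

def valid_date(text: str) -> bool:
    """Accept only dates in dd/mm/yyyy form (assumed format)."""
    try:
        datetime.strptime(text, "%d/%m/%Y")
        return True
    except ValueError:
        return False

print(valid_number("12"), valid_number("12a"))             # True False
print(valid_date("05/03/2024"), valid_date("2024-03-05"))  # True False
```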
OUTPUT DESIGN

Output design is the most important and direct source of
information to the user, and it is an ongoing activity during the
study phase. The objective of output design is to define the
contents and format of all documents and reports in an attractive
and useful layout.

Outputs from the computer are required primarily to communicate
the results of processing to the users. They are also used to
preserve the results for later consultation.

The output may be defined in terms of type of output, content,
format, location, frequency, response, volume, sequence and
action required.
5.1 SYSTEM IMPLEMENTATION

A software application is, in general, implemented after
navigating the complete life cycle of a project. Life cycle
processes such as requirement analysis, the design phase,
verification and testing, finally followed by the implementation
phase, result in successful project management. The software
application, which is basically a web-based application, has been
successfully implemented after passing through the various life
cycle processes mentioned above.

As the software is to be implemented in a high-standard
industrial sector, factors such as the application environment,
user management, security, reliability and performance were taken
as key factors throughout the design phase. These factors were
analyzed step by step, and the positive as well as negative
outcomes were noted down before the final implementation.

Security and authentication are maintained at both the user level
and the management level. The data is stored in Access 2000 as
the RDBMS, which is highly reliable and simple to use; user-level
security is managed with the help of password options and
sessions, which ensures that all transactions are made securely.

The application's validations take into account the entry levels
available in the various modules. Restrictions such as number
formatting, date formatting, and confirmations for both save and
update options ensure that correct data is fed into the database.
Thus all the aspects were charted out and the complete project
study was practically implemented successfully for the end users.

5.2 SYSTEM TESTING

Software testing is a critical element of software quality
assurance and represents the ultimate review of specification,
design and code generation. Once the source code has been
generated, the software must be tested to uncover as many errors
as possible before delivery to the customer. In order to find the
highest possible number of errors, tests must be conducted
systematically and test cases must be designed using disciplined
techniques.

TYPES OF TESTING
White box Testing

White box testing, sometimes called glass box testing, is a test
case design method that uses the control structures of the
procedural design to derive test cases.

Using white box testing methods, the software engineer can derive
test cases that guarantee that all independent paths within a
module have been exercised at least once, exercise all logical
decisions on their true and false sides, execute all loops at
their boundaries and within their operational bounds, and
exercise internal data structures to ensure their validity.
"Logic errors and incorrect assumptions are inversely
proportional to the probability that a program path will be
executed."
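The criteria above can be sketched with a toy routine (hypothetical, not taken from the project code) whose test cases exercise both sides of the decision and the loop at its boundaries:

```python
# White-box test design sketch: the tests below exercise the decision on
# both its true and false sides, and run the loop zero, one and many times.
def total_delay_charge(days_late: int, per_day: float) -> float:
    """Sum a per-day charge over the number of days a container is late."""
    if days_late <= 0:           # true side: days_late = 0; false side: 1, 3
        return 0.0
    total = 0.0
    for _ in range(days_late):   # loop boundaries: skipped, once, many times
        total += per_day
    return total

assert total_delay_charge(0, 50.0) == 0.0    # decision true, loop skipped
assert total_delay_charge(1, 50.0) == 50.0   # loop runs exactly once
assert total_delay_charge(3, 50.0) == 150.0  # loop runs many times
```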

The logical flow of a program is sometimes counterintuitive,
meaning that unconscious assumptions about the flow of control
and data may lead to design errors that are uncovered only once
path testing commences.

"Typographical errors are random." When a program is translated
into programming language source code, it is likely that some
typing errors will occur. Many will be uncovered by syntax and
type-checking mechanisms, but others may go undetected until
testing begins. A typo is as likely to exist on an obscure
logical path as on a mainstream path.

Black box Testing

Black box testing, also called behavioral testing, focuses on the
functional requirements of the software. That is, black box
testing enables the software engineer to derive sets of input
conditions that will fully exercise all the functional
requirements for a program.

Black box testing attempts to find errors in the following
categories:

1. Incorrect or missing functions
2. Interface errors
3. Errors in data structures or external database access
4. Behavior or performance errors
5. Initialization and termination errors

By applying black box techniques, a set of test cases was created
that satisfies the following criteria: test cases that reduce, by
a count greater than one, the number of additional test cases
that must be designed to achieve reasonable testing, and test
cases that tell something about the presence or absence of
classes of errors, rather than an error associated only with the
specific test at hand.
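The approach above can be sketched by deriving test cases from the input conditions alone, one representative per class of input, with no reference to the routine's internal logic. The booking function itself is hypothetical:

```python
# Black-box test design sketch: test cases are chosen from the input
# conditions (invalid request, normal request, request exceeding stock),
# treating the function as an opaque box.
def book_containers(requested: int, available: int) -> str:
    """Hypothetical booking check, used only to illustrate the technique."""
    if requested <= 0:
        return "invalid request"
    if requested > available:
        return "not enough containers"
    return "booked"

# One representative test case per equivalence class of inputs:
assert book_containers(-1, 10) == "invalid request"        # invalid input
assert book_containers(5, 10) == "booked"                  # normal case
assert book_containers(15, 10) == "not enough containers"  # stock exceeded
```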
Black box testing is not an alternative to white box testing
techniques. Rather, it is a complementary approach that is likely
to uncover a different class of errors than white box methods.

Validation Testing

Validation testing provides the final assurance that the software
meets all functional, behavioral and performance requirements.
Validation testing can be defined in many ways, but a simple
definition is that validation succeeds when the software
functions in a manner that is expected by the user. The software,
once validated, must be combined with the other system elements.
System testing verifies that all elements combine properly and
that the overall system function and performance are achieved.
After the integration of the modules, the validation test was
carried out on the system. It was found that all the modules work
well together and meet the overall system function and
performance.

Integration Testing

Integration testing is a systematic technique for constructing
the program structure while at the same time conducting tests to
uncover errors associated with interfacing. The objective is to
take unit-tested modules and build the program structure that has
been dictated by the design. Careful test planning is required to
determine the extent and nature of the system testing to be
performed and to establish the criteria by which the results will
be evaluated.

All the modules were integrated after the completion of unit
testing. As top-down integration was followed, the modules were
integrated by moving downward through the control hierarchy,
beginning with the main module. Since the modules were
unit-tested with no errors, the integration of those modules was
found to work correctly. As a next step, the remaining modules
were integrated with the former ones.

After the successful integration of the modules, the system was
found to run with no uncovered errors, and all the modules worked
as per the design of the system, without any deviation from the
features of the proposed system design.

Acceptance Testing

Acceptance testing involves the planning and execution of
functional tests, performance tests and stress tests in order to
demonstrate that the implemented system satisfies its
requirements. When custom software is built for one customer, a
series of acceptance tests is conducted to enable the customer to
validate all the requirements.

In fact, acceptance testing incorporates the test cases developed
during integration testing, since cumulative errors might degrade
the system over time. Additional test cases are added to achieve
the desired level of functional, performance and stress testing
of the entire system.

Unit testing

Static analysis is used to investigate the structural properties
of the source code. Dynamic test cases are used to investigate
the behavior of the source code by executing the program on the
test data. This testing was carried out during the programming
stage itself.

After testing each and every field in the modules, the modules of
the project were tested separately. Unit testing focuses
verification efforts on the smallest unit of software design, the
field; this is known as field testing.
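The field testing described above can be sketched as a check on a single input field in isolation. The dealer-number format used here (two letters followed by four digits) is an assumed convention for illustration, not one stated in the report:

```python
import re

# Field-level unit testing sketch: one input field is validated in
# isolation. The "two letters + four digits" dealer-number format is a
# hypothetical assumption.
def valid_dealer_no(text: str) -> bool:
    """Check that a dealer number matches the assumed LLDDDD format."""
    return re.fullmatch(r"[A-Z]{2}\d{4}", text) is not None

assert valid_dealer_no("DL1234")          # well-formed dealer number
assert not valid_dealer_no("1234DL")      # digits and letters swapped
assert not valid_dealer_no("DL12")        # too few digits
```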
CONCLUSION

The implementation and testing have been done in a step-by-step
process. Each module has been developed and tested individually
to obtain the required output in the desired form. The project is
full-fledged and user-friendly. The system has greatly reduced
the clerical overhead and drastically reduced the time taken to
process the products. The system satisfies all the requirements
stated by the user, and I conclude that the software is complete
to the best of my knowledge.

The software developed has been designed and run to satisfy the
requirements and needs of the organization as well as the end
users. The system reduces the manual work of maintaining records.
It has also resulted in quick retrieval and reference of required
information, which is vital to the progress of the organization.

The entire system is documented and can be easily understood by
the end users. The forms are very user-friendly and easy to
handle, even by beginners, with very little effort and guidance.
7. SCOPE FOR FURTHER DEVELOPMENT

The project titled "CARGO LOGISTICS" has been developed
successfully with almost all of its modules. A further
development of this project would be to host it on the Internet.

Advantages

 Reduces time consumption.
 Fast and economical.
 Accurate calculation.
BIBLIOGRAPHY

Books Referred:

1. Alex Homer, "Professional VB.NET 1.1", 2004 Edition, Wrox Publications
2. Steven Holzner, "Visual Basic.NET Black Book", 2003 Edition, Dreamtech
Publications
3. Roger S Pressman, "Software Engineering", 2000 Edition, Dreamtech
Publications
4. Jim Hoffman, "Introduction To Structured Query Language", Version 3.53,
jhoffman@one.net

Websites:

1. www.msdn.microsoft.com
2. www.vbcity.com
3. www.vbdotnetheaven.com
4. www.codeguru.com