
A PROJECT REPORT ON

XPLORE- JOBS
SUBMITTED IN PARTIAL FULFILLMENT FOR THE AWARD OF
DEGREE OF MASTER OF COMPUTER APPLICATION

BY
AMIT KUMAR YADAV
ROLL NO-0715514006

Ewing Christian Institute of Management and


Technology Allahabad
JUNE 2010

Acknowledgement

First of all I would like to thank almighty God who gave me the
inspiration to take up this task.
With immense pleasure, I take this opportunity to express my sincere thanks and
deep gratitude to all those people who extended their wholehearted co-operation and
their help for me in completing this project successfully.
I express my humble gratitude to Mr. Belal Ahmad (Project Leader) for his valuable
guidance and inspiration in my taking up this project. Without him, I could not have
completed this project.
I am personally very thankful to Mr. Sandeep Karan for providing me with
information about .NET and for helping me.
I would like to express my gratitude to Mr. Anurag Sewak (HOD) ECIMT Allahabad for
all his information related to project & support.
I am also grateful to the lab assistant and our seniors who have helped us
in completing this project.

Last but not least, I give my hearty thanks to my family members and well-wishers
who supported me morally and offered encouragement and constructive criticism during
the project period.

Amit Kumar Yadav

Declaration

I, Amit Kumar Yadav, hereby declare that the report of the project entitled
“Xplore-Jobs” has not been presented as a part of any other academic work to
get any degree or certificate, except to Ewing Christian Institute of
Management & Technology, Allahabad, for the fulfillment of the requirements for
the award of the degree of MASTER OF COMPUTER APPLICATION.

Date: Amit Kumar Yadav


Place:

Table of Contents

1) Company Profile ..................................................................... 1
2) Abstract ................................................................................ 4
3) Introduction ........................................................................... 4
   (i) Objectives of the Project .................................................... 5
   (ii) Hardware and Software Tools Used ..................................... 5
        • Software Interfaces ........................................................ 5
        • Hardware Interfaces ....................................................... 5
4) Requirement Analysis ............................................................. 6
   • Purpose ............................................................................. 6
   • Existing System .................................................................. 6
   • Proposed System ................................................................ 6
   • Benefits ............................................................................. 7
5) User Requirement Specification ............................................... 8
   • Product Perspective ............................................................ 8
   • User Types ........................................................................ 8
6) System Analysis .................................................................... 9
   • Introduction of User ............................................................ 9
   • Definition of System ............................................................ 9
   • Identification of Need .......................................................... 10
   • Preliminary Investigation ..................................................... 11
   • Module Description ............................................................ 12
7) Feasibility Study ................................................................... 13
   • Technical Feasibility ........................................................... 14
   • Economical Feasibility ........................................................ 14
   • Operational Feasibility ........................................................ 14
8) Project Planning ................................................................... 15
   • Team Structure .................................................................. 15
   • Topic Understanding .......................................................... 15
   • Modular Break-up of the System .......................................... 15
9) Process Logic for Each Module .............................................. 17
10) Database Requirements ....................................................... 17
11) Programming Language & Development Tools ........................ 18
   • ENVIRONMENT: Introducing the .NET Platform ..................... 18
   • The Execution Flow in .NET ................................................ 22
   • TECHNOLOGY: ASP.NET .................................................... 25
   • PROGRAMMING LANGUAGE: C#.NET ................................. 26
   • SQL Server 2005 ............................................................... 28
12) System Design .................................................................... 30
13) Design Methodology ............................................................ 32
14) Data Flow Diagram .............................................................. 32
   • Zero Level DFD ................................................................. 32
   • 1st Level DFD ................................................................... 33
   • 2nd Level DFD .................................................................. 34
   • ER Diagram ...................................................................... 35
15) Database Design ................................................................. 38
   • Introduction to Data Dictionary ............................................ 38
   • List of Database Tables ...................................................... 39
   • Relationships in Tables ....................................................... 44
16) Optimization of Code ........................................................... 44
17) Screenshots ....................................................................... 46
18) Testing .............................................................................. 65
19) Implementation ................................................................... 67
20) Maintenance Criteria ........................................................... 69
21) Security ............................................................................. 71
   • Security Configuration ....................................................... 71
   • Code Access Security ........................................................ 73
   • Security Measures Taken ................................................... 73
22) Limitations ......................................................................... 74
23) Future Enhancements ......................................................... 74
24) Annexure ........................................................................... 76
   • DFD Notations .................................................................. 76
   • ER Diagram Notations ........................................................ 77
25) Bibliography ....................................................................... 78
26) Appendix ........................................................................... 79
27) Conclusion ......................................................................... 80

Company Profile
Newgen Infosoft Pvt. Ltd. is one of the leading IT vendors with proven capabilities in
Application Development Services. As preferred application development outsourcing
vendors, we have delivered significant benefits to our customers. Newgen Infosoft’s
matured application development process encompasses all the phases of software
development life cycle (SDLC), starting from translating business needs into project
requirements through implementation and post-production user support. A mature
delivery model supported by technical and domain excellence characterizes Newgen
Infosoft 's Application Development Services. This helps our clients achieve flexibility,
scalability, quality and a reduced time-to-market. We work closely with our customers to
ensure that we meet the service levels on the most critical parameters of 'on time',
'within budget' and 'defect free' application development. Our suite of Application
Development outsourcing service offerings includes:
• Customized application development
• New application development,
• Rapid application development.
Mission & Values

Our mission is to partner with the world’s emerging and established software leaders to help
them bring great products to market in less time and at less cost. To do this, Newgen
Infosoft is committed to providing:

• World-class software engineers who partner with and extend the capabilities of
our clients’ existing engineering teams to take on new areas of development.
• Rigorous product engineering method and platform based on distributed agile
software development and open source tools to ensure complete visibility and
accountability at each stage during the product development lifecycle.
• Structured partnerships that fully incorporate client objectives while guaranteeing
quality, cost savings and time-to-market satisfaction.

Furthermore we adhere to a strict business philosophy based on the following values:


Integrity
Newgen infosoft team believes in treating people with respect. We believe in always
doing what we say we will and when we say we will do it. We always hold ourselves
to the highest ethical standards and take personal responsibility for our words and
actions.
Openness
Newgen infosoft team greatly values our people and seeks to empower them. We
believe in the free flow of information, regardless of rank or power, so that everyone
has access to the most complete data in order to make the best decisions. We
maximize transparency to create an environment where every individual is accessible
and encouraged to contribute, and where each thought is valued and factored into
making decisions.
Teamwork

Newgen infosoft team believes that a good team is stronger than the sum of its parts.
True teamwork and true partnership assumes an intimate understanding and
alignment of each other’s goals and requires each party to actively care for and trust
the other. We always strive to treat our clients, investors and colleagues as partners.
Innovation
Newgen infosoft team believes in always learning and innovating. We encourage and
reward those who challenge conventional wisdom, take risks and speak out. We
believe that inspiring people to grow is good for the individual, good for our clients and
good for the business.

Solutions
Application Development

Application development outsourcing helps organizations adapt to the rapid
technology advancements and evolving business processes. Outsourcing software
development helps them fulfill their need for newer customized solutions or
transformation of existing systems to state-of-the-art environments, while retaining
the embedded business processes, rules and logic.
Web Technology Solutions

In this competitive world the Web has become a mainstream technology, and it has transformed
the way companies do business. From procurement to payment processing, Web-based
solutions are creating new opportunities, streamlining processes and integrating
operations. In the past few years, the Web has opened new opportunities beyond
eCommerce, such as social networking sites, e-Learning applications and more.
The right partner, one with proven expertise in extending business processes
to the Web, can help you maximize the process improvement and operational savings
that can be attained from implementing Web solutions.
Newgen Infosoft’s Web development services deliver measurable value,
incorporating a broad range of technologies. We provide high-performance
e-Business and eCommerce solutions to help our clients provide the high levels of
quality and service required for competitive differentiation.
Newgen infosoft team builds and implements end-to-end e-Business solutions that
seamlessly integrate with diverse business applications. For example, Newgen
Infosoft ’s portfolio of successful Web projects includes Web Portals, Social
Networking sites, Customer relationship management applications, content
management and Workflow solutions, e-Learning portals, storefronts, integration with
back-end applications, Web-enabling of legacy applications.
Newgen infosoft team’s expertise spans a wide range of industries including
insurance, financial services, manufacturing, telecommunications, retail, energy and
utilities. Newgen Infosoft’s comprehensive suite of Web technology solutions
includes:
• Architecture Services
• SOA Consulting Services
• Portal and Content Management
• Web 2.0
• Workflow and Business Process Management
• Security and User Management
Web solutions bring together expertise in middleware integration, application servers,
portal development frameworks, and content management solutions, on the latest
technology platforms such as Microsoft .NET and J2EE.
Application Development
Application development outsourcing helps organizations adapt to the rapid technology
advancements and evolving business processes
Why NewGen Infosoft is Special
NewGen Infosoft is a leading company of professionals specializing in Web
Design, Web Development, SEO, Flash Design, ERP Consulting, CRM Solutions,
Content Writing, Software Development and PHP Solutions, with supporting services
offering one-stop shopping under one roof; we can also optimize your website(s).
ERP Consulting
We build and implement end-to-end e-Business solutions that seamlessly integrate with
diverse business applications.
CRM Solutions
In this competitive world the Web has become a mainstream technology, and it has transformed the
way companies do business.

Address: D-55, A1 Block,
2nd Floor,
Sector-7,
Noida-201301
Website: www.newgeninfosoft.com

Abstract

Title-“Xplore-jobs”
The title of the project starts with the word “Xplore”, which is a new style of writing the
word “explore”. Explore means “search”. My project guide suggested that I give the project
a name in this new style, so I decided to write “Xplore” in place of “Explore” in the project title
“Xplore-Jobs”.
Training Place: D-55, A1 Block,
2nd Floor,
Sector-7,
Noida-201301
Website: www.newgeninfosoft.com
Project Guide: Mr. Belal Ahmad
Software & Language Tools
Platform: .NET
Language: C#
RDBMS: SQL Server 2005
Technology: ASP.NET
Data Access Tool: ADO.NET

INTRODUCTION
“Xplore-Jobs” is a web-portal made for a consultancy. The objective of this project is to
automate the recruitment process of the consultancy. The project entitled “Xplore-
Jobs” helps in finding solutions to the manual systems being used in the consultancy. This
web-portal will be most useful to consultants for finding jobs that match
jobseekers. The web-portal has a few sections, which are discussed below.
• An Administrator section is provided through which the user can enter recruiters’ records, types of
jobs and jobseekers’ descriptions. A registration form is provided through which the user can
enter company details such as location, types of jobs, status of each job and the
qualifications of jobseekers. This data, once entered, can be edited or deleted as
required, and when there are many entries the user can scroll through the data.
• There are sections provided for Recruiters, Requirements and Resources.

Recruiter
In the Recruiter section, the user can enter the details of companies through the
portal. The portal has forms that ask for details such as company
id, name, location, address, city, country, contact person, contact number and contact email-id.
After entering the details, the data is saved by clicking the Save button, and the
details are stored in the Recruiter database.

Requirements
In the Requirements section, the various vacancies of various companies are stored. All
the details such as job id, job title, job type, job period, job location, functional skills,
technical skills, company id, contact person, contact number and contact email-id are
stored in the database.

Resource
In the Resource section, details of various jobseekers are stored. All the personal details,
technical and educational background, work experience, nature of work and
position held in the last job are stored in the database.

Search
In the Search section, the jobseeker can search for jobs according to his profile or
his needs, and the recruiter can search for candidates according to the needs of his company.
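The report does not reproduce the search code itself, so the following is only a minimal sketch of how such a search could be issued from C# with ADO.NET; the Jobs table and its JobTitle, TechnicalSkills and JobLocation columns are assumed names used for illustration, not the project's actual schema.

using System;
using System.Data.SqlClient;

class JobSearch
{
    // Prints the titles of jobs matching a skill and a location (hypothetical schema).
    public static void SearchJobs(string connectionString, string skill, string location)
    {
        string sql = "SELECT JobTitle FROM Jobs " +
                     "WHERE TechnicalSkills LIKE @skill AND JobLocation = @location";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            // Parameters avoid SQL injection and quoting problems in the search box.
            cmd.Parameters.AddWithValue("@skill", "%" + skill + "%");
            cmd.Parameters.AddWithValue("@location", location);

            con.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}

A parameterized query of this kind keeps the search safe even when the jobseeker types arbitrary text into the search fields.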

OBJECTIVES OF THE PROJECT


The objective of this project is to automate the recruitment process of the
consultancy. The project entitled “Xplore-Jobs” helps in finding solutions to the
manual systems being used in the consultancy. The system addresses the problem of
dealing with a great deal of manual work; manual maintenance of jobseeker
and recruiter details can create a lot of problems.

Hardware and Software Tools used


The Web-portal Requirements Specification is produced at the culmination of the
analysis task. The function and performance allocated to Web-portal as part of system
engineering are refined by establishing a complete information description, a detailed
functional description, a representation of system behavior, an indication of performance
requirements and design constraints, appropriate validation criteria, and other
information pertinent to requirements.
The Introduction of the Web-portal requirement specification states the goals and
objectives of the Web-portal, describing it in the context of the computer-based system.
Actually the introduction may be nothing more than the software scope of the planning
document. The information description provides a detailed description of the problem
that the Web-portal must solve. Information content, flow, and structure are documented.
Hardware, Software and human interfaces are described for external system elements
and internal software functions. For this project we need some special type of
environment for setup. This is as follows:

Software Interfaces

The following software is required for developing the Web-based application:

I.    Operating System:    Windows NT, 2000, XP Professional, etc.
II.   Environment:         Visual Studio .NET
III.  Front-end Tools:     C# with .NET, CSS, JavaScript, Ajax
IV.   Technology:          ASP.NET
V.    Web Server Tools:    IIS 5.0
VI.   Back-end Tool:       SQL Server 2005
VII.  Data Access Tool:    ADO.NET
VIII. Reports:             Crystal Reports

Hardware Interfaces

It is a web-based project, so a robust hardware configuration is required. The hardware
requirements are:

I.    Processor:      PIII 700 MHz and above
II.   Motherboard:    Intel 845 and above
III.  RAM:            256 to 768 MB
IV.   Hard Disk:      2.5 GB for Visual Studio .NET and 1 GB for Microsoft SQL Server 2005
V.    Network Card:   Standard Ethernet card for networking
VI.   I/O Devices:    Keyboard, mouse and color monitor
VII.  Wires:          Twisted pair for networking

Requirement Analysis

Purpose

“Xplore-Jobs” is a web-portal which automates the recruitment process for a consultancy. It
is a self-contained, new web-based project. It is developed on the .NET platform with
the C# language and uses ASP.NET technology. The web pages of this web-portal run
smoothly on various browsers, for example IE 7.0 and above, Firefox, Google Chrome,
Opera, etc.

Existing System

In the existing manual system, huge expenditure and a lot of time are spent on
communication, so there is a need for an integrated, automated system which has
centralized control over the entire process.
Manual maintenance of jobseeker details and recruiter details can create a lot of
problems, such as:
• Slow process
• Time-consuming
• Difficulty in retrieving information
• Difficulty in analyzing the related data
• More staff
• More paper work

Proposed System
In today's world of computing one has to cope with the fast trend of a changing
world and globalization. A consultancy has to interact with recruiters (various
companies) and keep records and knowledge of the various vacancies in these companies. It
also keeps resumes of several kinds of jobseekers, maintains these resources
(matching the jobseekers' bio-data against the job requirements) and then informs
the jobseekers about the jobs and recruiters. The computerized, web-based “Xplore-
Jobs” will help a user, i.e. a consultant, in many ways. The key features of the project
are:

• To facilitate easy maintenance of records of various Recruiters
(companies), Jobs and Jobseekers.
• To check details of prospective Jobseekers through the quick search
provided in the portal.
• To check for Jobs matching Jobseekers.
• To facilitate preparation of records in the mechanized process, thereby
producing accurate documents/data for recording details.
• Quick access to all records.
• To match the suitable candidate to the appropriate job.
• To reduce manual work.
• Generation of quick reports.
• To prevent and reduce human error.

BENEFITS

In order to quantify the benefits of your job placement software, you first have to know
what to look for. Job tracking solutions can help you in different ways. Here are some of
the areas to look for benefits in implementing a software system:
Paper Reduction
“Xplore-Jobs” can reduce the cost associated with creating and distributing paper
requirements and resumes. Cost savings result from:
• Direct savings from reduced paper, printing, and distribution costs
• Indirect savings from time spent handling paper documents
Paperwork Transfer
A big expense with a manually maintained job records and requirements system is that of
transferring records from one person to another. For example, an employee needs to
learn about the requirements of an organization from HR consultants or from
some other source. He has to forward his resume, which then has to move over to the
consultant manager of a consultancy. The manager checks and verifies the resumes,
selects the ones appropriate to the requirements of the organization
and then forwards them to the concerned HR department of that particular organization.
Even a very conservative estimate of the direct time spent on simply transferring the records
around shows that a significant amount of time is wasted per employee each time.

Improved Data Quality


Manual resumes must then be entered into a computer in order to be processed
further, or they may be maintained in a database for future requirements. Whenever
data is keyed into a computer from a hand-written document, errors will be made. These
errors can cost you in many direct and indirect ways:
• An incorrect selection may take place. When employees are appointed, a
wrong selection may occur because of the consultant, and the resulting
adjustment is costly for the organization.
• Without accurate information, you lose the ability to perform useful
decision support. You should have accurate information about the
background of the employee as well as the employer, the departments worked in,
the areas the organization is working in, etc. for your decision support.

Fewer Inquiries to the Human Resources Department


A good automated Human Resource Management System will provide jobseekers online with
useful information that they would traditionally request from HR
departments. By making this information available online, your HR departments will
spend less time answering questions from jobseekers and recruiters. A good system will
be able to track the following information for each jobseeker and make it available:
• The professional record of the jobseeker
• History of the jobseeker’s experience
• The major skills of the jobseeker
• Personal information about the jobseeker
• The areas the organization works in
The following information can be tracked for each recruiter:
• The contact persons responsible for recruitment
• Assigned projects of the organization
• Training sessions, if it provides any
• The position title and the experience it is seeking
• Performance reviews of the organization

Human Resource Management features make creating and maintaining complete
employee files simple and efficient. Its powerful record keeping, monitoring and reporting
capabilities will save you time, while reducing your exposure to employment related
lawsuits by ensuring you have proper documentation. In addition, HRM reduces the
potential for employee grievances by assuring fairness across your workforce. With
HRM, you will identify trends early and make adjustments as needed.

User Requirement Specification

1. Product Perspective
Xplore-Jobs is a new, self-contained web-portal built on the .NET
platform. All components follow the Model-View-Controller pattern. The user can retrieve
information related to jobs, the consultancy and companies.
All pages of the system follow a consistent theme and clear structure.
The occurrence of errors should be minimized through the use of checkboxes and drop-down
lists in order to reduce the amount of text input from the user. An error message should be
located beside the erroneous input, clearly highlighting it and telling the user how to resolve it. If a
system error occurs, contact methods should be provided. The pages should display the project
process in different colors to clearly reflect the various states. Each level of user will have
its own interface and the privilege to manage and modify the project information.
User interface elements are easy to understand. Each part of the user interface is well-
organized on screen, and the parts are connected correctly. When users look at the
interface, they understand which pane is used for which purpose. Each task of the
interface is specified clearly and users can use it correctly; for example, when users
press any button on the interface, they know which operation is performed by pressing
that button.
The user interface is easy to learn. When users use the interface, they
know which element is used for which operation. The interface actions and
elements are consistent, and when users press any button the required action is performed
by the system.
The screen layout and colors of the user interface are appealing. When users look
at the screen, it has a pleasant appearance. Colors are chosen carefully, so that users'
eyes do not get tired.
Since the application must run on a web server, a client-server architecture
is needed. An Internet connection is a must, and all the hardware required to connect the
PC forms the hardware interface for the system. The main interfaces are the monitor,
keyboard and mouse.

2. User Types
• Administrator

Administrator power is provided to the consultancy. The administrator can create
moderators and can block any company or jobseeker. The administrator can send new job
information to jobseekers by e-mail and can generate bills for the
services provided to companies.

• Job Seeker

A jobseeker can search for jobs according to his profile and educational background
and can then apply online for a job. He can also block a company so that his
resume is not sent to it. The user can change his password and edit his profile. He can
register with “Xplore-Jobs” for free to take the services of the consultancy online.

• Recruiter

A registered recruiter can post jobs. He can also see the updated resumes of job-
seekers, edit his own profile, and delete any job posted by him.

System Analysis
System Analysis refers to the process of examining a business situation with the
intent of improving it through better procedures and methods. Requirement analysis is
the first technical step in the web-portal process. It is a process of discovery, refinement,
modeling and specification. It is the systematic use of proven principles, techniques,
languages and tools for the cost-effective analysis, documentation and on-going
evolution of user needs and the specification of the external behavior of a system to satisfy
those user needs.
The very first thing is problem recognition. After the need for the project is
identified, the implementation of the project is identified: who is going to use the web-
portal, and what points are needed in modeling the design of the web-portal.

Introduction of User
The term user is widely used in system analysis and design. The term end-user is
widely used by analysts to refer to people who are not professional information
systems specialists but who use computers to perform their jobs. We can group
end-users into four categories.

• Hands-on Users actually interact with the system. They feed in data or receive
output, perhaps using a terminal.
• Indirect Users benefit from the results or reports produced by these systems but
do not directly interact with the hardware or software. These users may be managers
of business functions using the system.
• End-users are not all alike. Some are intermittent users, and an end-user can also be a
competitor rather than an employee of the firm.
• User Managers have management responsibilities for application systems.
• Senior Manager Users are the fourth type of user and are taking increased
responsibility for the development of information systems.

Definition of System

In the broad sense, a system is simply a set of components that interact to
accomplish some purpose. Systems are all around us. As computers are used more
and more by persons who are not computer professionals, the face of systems
development is taking on an additional dimension. Users themselves are undertaking
development of some of the systems they use, as the executive in the vignette
emphasized. These different situations are represented by three distinct approaches to
the development of computer information systems:
• Systems Development Life Cycle.
• Structured Analysis Development Method.
• Systems Prototype Method.
Systems development, a process consisting of the two major steps of systems analysis and
design, starts when management or, sometimes, systems development personnel
realize that a particular business system needs improvement. Systems development is
classically thought of as the set of activities that analysts, designers and users carry out
to develop and implement an information system. Different parts of the project can be in
various phases at the same time, with some components undergoing analysis while
others are in advanced stages.

Systems development consists of the following activities:


• Preliminary investigation.
• Determination of system requirements.
• Design of system.
• Development of Web-portal.
• System testing.
• Implementation and evaluation.

System analysis is conducted with the following objectives in mind:


• Identify the user’s need.
• Evaluate the system concept for feasibility.
• Perform economic and technical analysis.
• Allocate functions to hardware, software, people, database and other system
elements.
• Establish cost and schedule constraints.
• Create a system definition that forms the foundation for all subsequent engineering
work.
Both hardware and software expertise are required to successfully attain the
objectives listed above.
As our web-portal “Xplore-Jobs” is going to be used by people who may or
may not be computer literate, we have tried to make it user-friendly.

Identification of Need

This step is the initiation of system analysis. An overview of the customer's
requirements has been done, and the basic need of the user to opt for this kind of
project has been analyzed. Manual maintenance can create a lot of problems, such as:
1. Slow process
2. Time consuming
3. Lacks accuracy
4. Difficulty in hiding information from unauthenticated staff
5. Difficulty in retrieving information
6. Difficulty in analyzing the related data
7. More staff
8. More paper work
Computerizing the Human Resource Management System will help a user, i.e.
a consultant, to quickly access all records and match the suitable candidate to the appropriate
job. It also maintains all the files in databases to provide quick access and save time.
The objective of the project is to develop a web-portal for handling the records of the
consultancy that is easy to update according to the user's requirements. The main aim of the
project is to increase the efficiency of the management process and to better maintain
the records of both jobseekers and recruiters, so that records are easily available when
needed. The user is taken to the main menu, from which he or she can select the appropriate
jobseeker for the job requirements of several companies.
Information is needed in organizations for planning, staffing and controlling
purposes. Regardless of the nature of the information required, the information should
possess the characteristics of accuracy, timeliness, completeness and relevancy. In
recent years, the need for information improvement has been highlighted by reports lacking
one or more of these characteristics and by increased paperwork volume, rising costs,
and pressures from outside changes.
Fortunately, computers thrive on repetitive, large-volume processing tasks and are
fast and accurate. The processing capability in many organizations has been strained by:
1. Growth in the size and complexity of the organizations,
2. The increased requirements for data from external sources, and
3. The demand of administrators for more information.
More than a million new pages of data are generated each minute of the day in
offices. Compared to other processing methods, the use of computers may make it
possible for certain administrative costs to be reduced while the level of processing
activity remains stable. The increased cost of clerical labor, materials and other
expenses associated with the data processing operation requires eventual managerial
attention.
We all agree that meaningful information is timely information, but with an increase
in the volume and size of an organization there is a reduction in the speed of
processing. Rapid changes are taking place in the world socially, economically and
technically. Such changes have a significant impact on the environment in which
organizations must operate, on the planning that managers must do, and on the
information that they must have. But if a data processing operation is strained to or
beyond the capacity for which it was originally planned, inaccuracies will begin to
appear, and inadequate control will permit inadequate performance. It is due to these
pressures (increased paperwork volume, costs, pressure from outside changes, and the
demand for timeliness and quality) that most organizations today are
opting for computers to do their data processing. This project is mainly an
information-processing sub-package. It aims at providing information about the jobs
available with the recruiters: each time a requirement is posted or a jobseeker applies,
the matching details are checked and the personal details are stored. If handled
manually, this information might lose the characteristics of either timeliness or accuracy.
Hence the need for this project was realized.
All the data has to be first fed into the computer. Once it is stored in the files, any query
regarding this data can be answered satisfactorily. The retrieval process takes much
less time and the information is accurate. Any updating is easily accommodated.

Preliminary Investigation

The user is a consultant, who has to keep records of his recruiters, their details and the
details of their requirements. The consultant also maintains the resource pool of
jobseekers and then matches the suitable candidates against the requirements.

Module Description

This project consists of different interfaces. The different modules which make up this system
are briefly described below:

Login Module

In this module, there are three types of login: admin login, job-seeker login and recruiter
login. Here the authentication and authorization of the user are checked. If the password and user-ID
are valid, the user is allowed to enter; otherwise an “Invalid User/Password” message is
displayed.
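A minimal sketch of such a check in C# with ADO.NET is shown below; the Users table and its UserId, Password and Role columns are assumed names for illustration only, not the project's actual schema.

using System.Data.SqlClient;

class LoginModule
{
    // Returns the user's role ("admin", "jobseeker" or "recruiter") when the
    // credentials are valid, otherwise null so the caller can show
    // "Invalid User/Password".
    public static string Authenticate(string connectionString, string userId, string password)
    {
        string sql = "SELECT Role FROM Users WHERE UserId = @uid AND Password = @pwd";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@uid", userId);
            cmd.Parameters.AddWithValue("@pwd", password);   // in practice a password hash would be stored

            con.Open();
            object role = cmd.ExecuteScalar();               // null when no matching row exists
            return role == null ? null : role.ToString();
        }
    }
}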

Job-Seeker Registration
In this module a job-seeker can register easily by filling in the form. He can submit all his details,
both educational and personal. He chooses a unique id as his user name and enters a
password as a security feature. He can also submit his resume.

Recruiter Registration
In this module the recruiter can register by filling in the form. He has to give some
necessary data such as company name, website, email-ID, address, contact number, etc.

Place Requirement
A recruiter can post a job according to his requirements; in this module the recruiter can
easily post the job, and the consultancy sends this information to job-seekers. The consultancy also
contacts several companies, gathers information regarding their vacancies and
stores it in its database.
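As an illustration only, posting a job can be reduced to a single parameterized INSERT from C#; the Requirement table and its columns below are assumed names used for this sketch and are not taken from the project's database design.

using System.Data.SqlClient;

class JobPosting
{
    // Saves one vacancy posted by a recruiter (hypothetical Requirement table).
    public static void PostJob(string connectionString, int companyId,
                               string jobTitle, string jobLocation, string technicalSkills)
    {
        string sql = "INSERT INTO Requirement (CompanyId, JobTitle, JobLocation, TechnicalSkills) " +
                     "VALUES (@companyId, @title, @location, @skills)";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@companyId", companyId);
            cmd.Parameters.AddWithValue("@title", jobTitle);
            cmd.Parameters.AddWithValue("@location", jobLocation);
            cmd.Parameters.AddWithValue("@skills", technicalSkills);

            con.Open();
            cmd.ExecuteNonQuery();
        }
    }
}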

Place Resumes
Jobseekers gather information about the several consultancies, prepare their
resumes according to the job requirements, mentioning their technical and educational skills,
and send the resumes to the consultancies.

Jobseeker Account
In this module the jobseeker can preview his profile and edit details such as contact details,
educational details, password, etc. He can block any company to which he does not want his
resume to be sent. He can also see recommended jobs.

Recruiter account
In this module, the recruiter can see his profile and edit it. He can see the jobs posted by him,
contact the consultancy, and see the jobseeker information sent by the consultancy.

Contacts of Consultancies to Jobseeker

Consultancies match the job with resumes and inform the Jobseeker through phone or
email.

Contacts of Consultancies to Companies

After selecting suitable candidates, the consultancies give their information to the companies.

Selection

The company selects the appropriate jobseekers sent by the consultancies and fixes times
for interviews. After the final selection, the information is sent to the jobseeker.

Prepare Invoice

Companies send information about the selection of jobseekers to the consultancies, and the
consultancies make invoices for the selected jobseekers and send them to the companies.

Payment
The company checks the invoices against the requirements sent by the consultancies and
makes payment accordingly, either by cash or cheque.

Validation of Data Entered by the User and Error Handling

In this module, the validity of data entered by the user during the various processes is
checked through various validation checks. For example, no
characters should be entered in numeric fields; likewise, if any error occurs, the system
should handle that particular error and give the required message.
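A small sketch of the kind of server-side check described above (illustrative only, not code from the project); the field name and error messages are assumptions made for the example.

using System;

class ValidationHelper
{
    // Validates that a contact-number field contains only digits of a sensible length.
    public static bool TryReadContactNumber(string input, out long number, out string error)
    {
        error = null;
        if (!long.TryParse(input, out number))
        {
            error = "Contact number must contain digits only.";
            return false;
        }
        if (input.Length < 10 || input.Length > 12)
        {
            error = "Contact number must be 10 to 12 digits long.";
            return false;
        }
        return true;
    }

    public static void Example()
    {
        long contact;
        string message;
        try
        {
            if (!TryReadContactNumber("98x9912345", out contact, out message))
                Console.WriteLine(message);          // shown beside the offending input
        }
        catch (Exception ex)
        {
            // Any unexpected error is caught and reported instead of crashing the page.
            Console.WriteLine("An error occurred: " + ex.Message);
        }
    }
}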

Feasibility Study
Depending on the results of the initial investigation, the survey is expanded to a
more detailed feasibility study. A feasibility study is a test of the system proposal according to
its workability, impact on the organization, ability to meet user needs, and effective use
of resources. The objective of this phase is not to solve the problem but to acquire a
sense of scope. During the study, the problem definition is crystallized and the aspects of
the problem to be included in the system are determined.

Information processing systems are capital investments because resources are
being spent currently in order to achieve benefits to be received over a period of time
following completion. There should be a careful assessment of each project before it is
begun in terms of economic justification, technical feasibility, operational impact and
adherence to the master development plan.

We started the project by listing the possible queries that the user might want
satisfied, and along these lines we guided the project further. The three main points
kept in mind at the time of the project are:
• Possible (to build it with the given technology and resources)
• Affordable (given the time and cost constraints of the organization)
• Acceptable (for use by the eventual users of the system)

The three major areas to consider while determining the feasibility of a project are:

TECHNICAL FEASIBILITY
This involves financial considerations to accommodate technical enhancements; if the
budget is a serious constraint, then the project is judged not feasible. The analyst considers
the technical feasibility of the system. The proposed application has been built
in such a way that it is technically feasible for the consultancy. Certain
types of reports are used in the system to make the application more user-
friendly, i.e. the Data Report is used. The system has been made in such a way that it can be
used in a single-user as well as a multi-user environment. The supported operating systems are
Windows NT 4.0/Windows 2000/Windows XP/2003, so the system is
technically feasible with regard to the operating system. We are using Microsoft SQL
Server 2005 as the backend for maintaining the database. SQL Server is a relational
database management system formed by the database and an
instance of the SQL Server. SQL Server 2005 is a Windows-based RDBMS and is one of
the most powerful RDBMSs due to its menu-driven facilities. It provides a better service to the
user in the sense of taking backups of data and then restoring them. We can also easily mirror
the database in SQL Server 2005.

ECONOMICAL FEASIBILITY
An evaluation of development cost weighed against the ultimate income or benefit
derived from the developed system. Today, software is the most expensive element of
virtually all computer-based systems. A large cost estimation error can make the
difference between profit and loss. Estimation of resources, cost, and schedule for a
software engineering effort requires experience, access to good historical information,
and the courage to commit to quantitative predictions. Estimation carries inherent risk
and this risk leads to uncertainty. Project complexity has a strong effect on the
uncertainty inherent in planning. Complexity, however, is a relative measure that is
affected by familiarity with past effort. Project size is another important factor that can
affect the accuracy and efficiency of estimates. As size increases, the interdependency
among the various elements of the software grows rapidly. Software cost and effort
estimation will never be an exact science: too many variables (human, technical,
environmental, political) can affect the ultimate cost of software and the effort applied to
develop it. However, software project estimation can be transformed from a black art to a
series of systematic steps that provide estimates with acceptable risk. The system is not
too costly for the features of the application; the cost of the project is
balanced, and it might increase or decrease according to the
requirements of the customer. The system has been developed systematically.

OPERATIONAL FEASIBILITY
This application is very easy to operate as it is made user-friendly. The main consideration is
the user's easy access to all the functionality of the application.

Documentation of the Feasibility Study
The findings of a feasibility study are generally documented in what is called a feasibility
report. The degree of detail in such a report depends greatly on the nature of
the project. The contents of this report are as given below:
• Introduction
• Statement of the problem
• Implementation environment
• Constraints
• Management summary and recommendations
• Important findings
• Comments
• Recommendations
• Impact
• Alternatives
• Alternative system configurations
• Criteria used in selecting the final approach
• System description
• Abbreviated statement of scope
• Feasibility of allocated elements
• Cost-benefit analysis
• Evaluation of technical risk
• Legal ramifications
• Other project-specific topics

PROJECT PLANNING
I took up the assignment of developing a computerized system, “Xplore-Jobs”.
Planning of this project includes the following things:
• Team structure
• Topic understanding
• Modular break-up of the system
• Process logic for each module
• Database requirements

TEAM STRUCTURE
The project team comprised five members who worked as developers and a project
leader who assigned the whole task and provided the finest details of the problem. The
project coordinator supervised the whole project work and sorted out the problems that
occurred during the development phase.

TOPIC UNDERSTANDING
The field of application introduced in the project may be a totally new
field, so as soon as I took up this project, I carefully went through it to identify its
requirements.

MODULAR BREAK-UP OF THE SYSTEM
It consists of the following phases:
• Identify the various modules in the system
• List them in the right hierarchy
• Identify their priority of development

Description of the Modules


This project consists of different interfaces, which are accessed through the
pages of the web-portal. The different modules which make up this system
are briefly described below:
Login Module

In this module, there are three types of login: admin login, job-seeker login and recruiter
login. Here the authentication and authorization of the user are checked. If the password and user-ID
are valid, the user is allowed to enter; otherwise an “Invalid User/Password” message is
displayed.

Job-Seeker Registration
In this module a job-seeker can register easily by filling in the form. He can submit all his details,
both educational and personal. He chooses a unique id as his user name and enters a
password as a security feature. He can also submit his resume.

Recruiter Registration
In this module the recruiter can register by filling in the form. He has to give some
necessary data such as company name, website, email-ID, address, contact number, etc.

Place Requirement
A recruiter can post a job according to his requirements; in this module the recruiter can
easily post the job, and the consultancy sends this information to job-seekers. The consultancy also
contacts several companies, gathers information regarding their vacancies and
stores it in its database.

Place Resumes
Jobseekers gather information about the several consultancies, prepare their
resumes according to the job requirements, mentioning their technical and educational skills,
and send the resumes to the consultancies.
Jobseeker Account
In this module the jobseeker can preview his profile and edit details such as contact details,
educational details, password, etc. He can block any company to which he does not want his
resume to be sent. He can also see recommended jobs.

Recruiter account
In this module, the recruiter can see his profile and edit it. He can see the jobs posted by him,
contact the consultancy, and see the jobseeker information sent by the consultancy.

Contacts of Consultancies to Jobseeker
Consultancies match the job with resumes and inform the Jobseeker through phone or
email.

Contacts of Consultancies to Companies


After selecting suitable candidates, the consultancies give their information to the companies.

Selection
The company selects the appropriate jobseekers sent by the consultancies and fixes times for
interviews. After the final selection, the information is sent to the jobseeker.

Prepare Invoice
Companies send information about the selection of jobseekers to the consultancies, and the
consultancies make invoices for the selected jobseekers and send them to the companies.

Payment
The company checks the invoices against the requirements sent by the consultancies and
makes payment accordingly, either by cash or cheque.

Validation of Data Entered by the User and Error Handling


In this module, the validity of data entered by the user during the various processes is
checked through various validation checks. For example, no
characters should be entered in numeric fields; likewise, if any error occurs, the system
should handle that particular error and give the required message.

PROCESS LOGIC FOR EACH MODULE


In the first module, the validity of the password is checked against a particular user.
In the second module, whenever a new entity is entered, it is checked for
duplicate data. The third and fourth modules, just like the first module,
have proper checks for every entity being modified or updated. In the last module, the
validation checks are again applied, and the different reports are generated to ease the
business processes and decision-making.
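One possible way to implement the duplicate check mentioned for the second module is a simple COUNT query before the insert; the JobSeeker table and UserId column below are assumed names used only for this sketch.

using System.Data.SqlClient;

class RegistrationChecks
{
    // True when the chosen user id is already present, so the registration should be refused.
    public static bool UserIdExists(string connectionString, string userId)
    {
        string sql = "SELECT COUNT(*) FROM JobSeeker WHERE UserId = @uid";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@uid", userId);
            con.Open();
            return (int)cmd.ExecuteScalar() > 0;
        }
    }
}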

DATABASE REQUIREMENTS
• Identify the various tables required.
• Fields for these tables.
• The various key fields (for example Primary key and foreign key).
• Identify the various constraints like NOT NULL, UNIQUE, etc. (a sketch covering these points follows).
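To make the points above concrete, the sketch below creates one such table from C# through ADO.NET, showing a primary key, a foreign key and NOT NULL/UNIQUE constraints; the Requirement and Recruiter table names and columns are illustrative assumptions, not the project's actual data dictionary.

using System.Data.SqlClient;

class SchemaSetup
{
    public static void CreateRequirementTable(string connectionString)
    {
        string sql =
            "CREATE TABLE Requirement (" +
            "  JobId        INT IDENTITY(1,1) NOT NULL PRIMARY KEY, " +   // primary key
            "  JobTitle     VARCHAR(100) NOT NULL, " +                    // NOT NULL constraint
            "  CompanyId    INT NOT NULL " +
            "      REFERENCES Recruiter(CompanyId), " +                   // foreign key
            "  ContactEmail VARCHAR(100) UNIQUE " +                       // UNIQUE constraint
            ")";

        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, con))
        {
            con.Open();
            cmd.ExecuteNonQuery();
        }
    }
}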

PROGRAMMING LANGUAGE &
DEVELOPMENT TOOLS
ENVIRONMENT: Introducing the .NET Platform
The .NET Framework is a managed type-safe environment for application development
and execution. The .NET Framework manages all aspects of your program’s execution.
It allocates memory for the storage of data and instructions, grants or denies the
appropriate permissions to your application, initiates and manages application execution,
and manages the reallocation of memory from resources that are no longer needed. The
.NET Framework consists of two main components: the common language runtime and
the .NET Framework class library.
The common language runtime can be thought of as the environment that
manages code execution. It provides core services, such as code compilation, memory
allocation, thread management, and garbage collection. Through the common type
system (CTS), it enforces strict type-safety and ensures that code is executed in a safe
environment by also enforcing code access security.
The .NET Framework class library provides a collection of useful and reusable
types that are designed to integrate with the common language runtime. The types
provided by the .NET Framework are object-oriented and fully extensible, and they allow
you to seamlessly integrate your applications with the .NET Framework. The .NET
Framework is designed for cross-language compatibility, which means, simply, that .NET
components can interact with each other no matter what supported language they were
written in originally. So, an application written in Microsoft Visual Basic .NET might
reference a dynamic-link library (DLL) file written in Microsoft Visual C#, which in turn
might access a resource written in managed Microsoft Visual C++ or any other .NET
language. This language interoperability extends to full object-oriented inheritance. A
Visual Basic .NET class might be derived from a C# class, for example, or vice versa.
This level of cross-language compatibility is possible because of the common
language runtime. When a .NET application is compiled, it is converted from the
language in which it was written (Visual Basic .NET, C#, or any other .NET-compliant
language) to Microsoft Intermediate Language (MSIL or IL). MSIL is a low-level
language that the common language runtime can read and understand. Because all
.NET executables and DLLs exist as MSIL, they can freely interoperate. The Common
Language Specification (CLS) defines the minimum standards to which .NET language
compilers must conform. Thus, the CLS ensures that any source code successfully
compiled by a .NET compiler can interoperate with the .NET Framework.
The CTS ensures type compatibility between .NET components. Because .NET
applications are converted to IL prior to deployment and execution, all primitive data
types are represented as .NET types. Thus, a Visual Basic Integer and a C# int are both
represented in IL code as a System.Int32. Because both languages use a common type
system, it is possible to transfer data between components and avoid time-consuming
conversions or hard-to-find errors.
Visual Studio .NET ships with languages such as Visual Basic .NET, Visual C#,
and Visual C++ with managed extensions, as well as the JScript scripting language. You
can also write managed code for the .NET Framework in other languages. Third-party
tools and compilers exist for Fortran, Cobol, Perl, and a host of other languages. All of
these languages share the same cross-language compatibility and inheritability. Thus,
you can write code for the .NET Framework in the language of your choice, and it will be
able to interact with code written for the .NET Framework in any other language.

Visual Studio .NET spans the entire layered stack shown below:

C# | VB.NET | JScript .NET | ...
Common Language Specification (CLS)
Windows Forms | Web Forms | Web Services
ADO.NET and XML
.NET Framework Classes
Common Language Runtime
Windows | COM+ Services

Microsoft .NET Architecture Hierarchy

Features of the .NET Platform


• Multilanguage Development
• Platform and Processor Independence
• Versioning Support
• Security

Components of the .NET Architecture

As we mentioned earlier, there is a lot to the .NET Framework. In this section,
we identify the individual components and describe their features and how they
fit into the overall picture.

.NET Runtime

The heart of the .NET Framework is the CLR. Similar in concept to the Java Virtual
Machine, it is a runtime environment that executes MSIL code. Unlike the Java
environment, which is built around the concept of one language for all purposes, the .NET platform
supports multiple programming languages through the use of the Common Language
Specification, which defines the output required of compilers that want to target the CLR.

Managed/Unmanaged Code

Because all code targeted at the .NET platform runs with the CLR environment, it is
referred to as managed code. This simply means that the execution of the code and its
behavior is managed by the CLR. The metadata available with managed code contains
the information required to allow the CLR to manage its safe execution. By safe
execution we mean memory and security management, type safety, and inter-language
interoperability. Unmanaged code can write to areas of memory it does not own, execute
instructions at arbitrary locations in memory, and exhibit any number of other bad
behaviors that cannot be managed or prevented by the CLR. Most of the applications
running on Windows today are unmanaged.

Intermediate Language

The .NET intermediate language, MSIL, is defined in the Common Language
Specification. It is an amalgam of a low-level language similar in many ways to a
machine language and a higher object language. You can write applications directly
in MSIL, much as you can write directly in assembly language. Thankfully, this is
not necessary for most purposes.

Compiling

Running your C# code through the C# compiler produces two important pieces of
information: code and metadata. The following sections describe these two items and
then finish up by examining the binary building block of .NET code: the assembly.

Microsoft Intermediate Language (MSIL)

The code that is output by the C# compiler is written in a language called Microsoft
Intermediate Language, or MSIL. MSIL is made up of a specific set of instructions that
specify how your code should be executed. It contains instructions for operations such
as variable initialization, calling object methods, and error handling, just to name a few.
C# is not the only language in which source code changes into MSIL during the
compilation process. All .NET-compatible languages, including Visual Basic .NET and
Managed C++, produce MSIL when their source code is compiled. Because all of the
.NET languages compile to the same MSIL instruction set, and because all of the .NET
languages use the same runtime, code from different languages and different compilers
can work together easily.

MSIL is not a specific instruction set for a physical CPU. It knows nothing about
the CPU in your machine, and your machine knows nothing about MSIL. How, then,
does your .NET code run at all, if your CPU can't read MSIL? The answer is that the
MSIL code is turned into CPU-specific code when the code is run for the first time. This
process is called "just-in-time" compilation, or JIT. The job of a JIT compiler is to
translate your generic MSIL code into machine code that can be executed by your CPU.
You may be wondering about what seems like an extra step in the process. Why
generate MSIL when a compiler could generate CPU-specific code directly? After all,
compilers have always done this in the past. There are a couple of reasons for this.

First, MSIL enables your compiled code to be easily moved to different hardware.
Suppose you've written some C# code and you'd like it to run on both your desktop and
a handheld device. It's very likely that those two devices have different types of CPUs. If
you only had a C# compiler that targeted a specific CPU, then you'd need two C#
compilers: one that targeted your desktop CPU and another that targeted your handheld
CPU. You'd have to compile your code twice, ensuring that you put the right code on the
right device. With MSIL, you compile once. Installing the .NET Framework on your
desktop machine includes a JIT compiler that translates your MSIL into CPU-specific
code for your desktop.

Installing the .NET Framework on your handheld includes a JIT compiler that
translates that same MSIL into CPU-specific code for your handheld. You now have a
single MSIL code base that can run on any device that has a .NET JIT compiler. The JIT
compiler on that device takes care of making your code run on the device. Another
reason for the compiler's use of MSIL is that the instruction set can be easily read by a
verification process. Part of the job of the JIT compiler is to verify your code to ensure
that it is as clean as possible. The verification process ensures that your code is
accessing memory properly and that it is using the correct variable types when calling
methods that expect a specific type. These checks ensure that your code doesn't
execute any instructions that could make the code crash.

The MSIL instruction set was designed to make this verification process relatively
straightforward. CPU-specific instruction sets are optimized for quick execution of the
code, but they produce code that can be hard to read and, therefore, hard to verify.
Having a C# compiler that directly outputs CPU-specific code can make code verification
difficult or even impossible. Allowing the .NET Framework JIT compiler to verify your
code ensures that your code accesses memory in a bug-free way and that variable types
are properly used.

The Execution Flow in .NET (figure)

Common Type System
.NET applications, regardless of their source languages, all share a common type
system. What this means is that you no longer have to worry, when doing development in
multiple languages, about how a data type declared in one language needs to be
declared in another. Any .NET type has the same attributes regardless of the language it
is used in. Furthermore, all .NET data types are objects, derived from System.Object.
Because all data types derive from a common base class, they all share some basic
functionality, for example the ability to be converted to a string, serialized, or stored in a
collection.
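A small illustration of the last point: in C# the int keyword is simply an alias for the CTS type System.Int32, so both declarations below produce exactly the same type and inherit members such as ToString() from System.Object.

using System;

class CtsDemo
{
    public static void Main()
    {
        int a = 42;              // C# keyword
        System.Int32 b = 42;     // the underlying CTS type; identical to 'int'

        Console.WriteLine(a.GetType() == b.GetType());   // True
        Console.WriteLine(a.ToString());                 // member inherited via System.Object
        object boxed = a;                                // value types derive from Object, so they can be boxed
        Console.WriteLine(boxed is Int32);               // True
    }
}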

.NET Base Class Library (BCL)


If I could have bought a library that offered everything the .NET Base Class Library
offers when I started programming, a year’s salary would have seemed reasonable—
there really is that much to it. Almost everything in the .NET environment is contained
within the BCL. Let’s look at a “Hello World” example:
using System;
class Hello
{
    public static void Main()
    {
        Console.WriteLine("Hello World");
    }
}
The only function contained in this simple program is a call to the WriteLine
method of the Console class. What is really unique about the .NET environment is
that .NET languages don’t have to implement even the most basic functions; they are
available in the BCL. Because all .NET languages share the same common set of
libraries, the code being executed by your C# program is the same code being executed
by a program written in another language. This means that all languages that target
the .NET environment essentially share the same capabilities, except they have different
syntax.

Assemblies
Assemblies are the means of packaging and deploying applications and components
in .NET. Just like a compiled application or component today, assemblies can be made
up of either single or multiple files. An assembly contains metadata information (covered
in the next section), which is used by the CLR for everything from type checking and
security to actually invoking the component's methods. All of this means that you don't
need to register .NET components, unlike COM objects.
Sometimes, you will use C# to build an end-user application. These applications are
packaged as executable files with an extension of .EXE. Windows has always worked
with .EXE files as application programs, and C# fully supports building .EXE files.
However, there may be times when you don't want to build an entire application. Instead,
you may want to build a code library that can be used by others. You may also want to
build some utility classes in C#, for example, and then hand the code off to a Visual
Basic .NET developer, who will use your classes in a Visual Basic .NET application. In
cases like this, you won't be building an application. Instead, you'll be building an
assembly. An assembly is a package of code and metadata. When you deploy a set of
classes in an assembly, you are deploying the classes as a unit; and those classes
share the same level of version control, security information, and activation
requirements. Think of an assembly as a "logical DLL." If you're familiar with Microsoft

29
Transaction Server or COM+, you can think of an assembly as the .NET equivalent of a
package. There are two types of assemblies: private assemblies and global assemblies.
When you build your assembly, you don't need to specify whether you want to build a
private or a global assembly. The difference is apparent when you deploy your
assembly. With a private assembly, you make your code available to a single
application. Your assembly is packaged as a DLL, and is installed into the same
directory as the application using it. With a deployment of a private assembly, the only
application that can use your code is the executable that lives in the same directory as
your assembly. If you want to share your code among many applications, you might
want to consider deploying your code as a global assembly. Global assemblies can be
used by any .NET application on the system, regardless of the directory in which it is
installed. Microsoft ships assemblies as a part of the .NET Framework, and each of the
Microsoft assemblies is installed as a global assembly. The .NET Framework contains a
list of global assemblies in a facility called the global assembly cache, and the .NET
Microsoft Framework SDK includes utilities to both install and remove assemblies from
the global assembly cache.
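
As a hedged sketch of that difference (assuming the command-line C# compiler csc.exe and a hypothetical file name), the comments below show how the same kind of source file could be built either as a library assembly or as an application:

// JobUtils.cs - a small utility class intended to be packaged as an assembly.
//
// Built as a class library (private assembly):
//     csc /target:library JobUtils.cs      -> produces JobUtils.dll
// Built as an application:
//     csc /target:exe JobUtils.cs          -> produces JobUtils.exe (requires a Main method)

using System;

public class JobUtils
{
    // A utility method that another application, for example a VB.NET client,
    // could call once it references this assembly.
    public static string FormatJobTitle(string title)
    {
        return title == null ? string.Empty : title.Trim().ToUpper();
    }
}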

Metadata
The compilation process also outputs metadata, which is an important piece of the .NET
code sharing story. Whether you use C# to build an end-user application or you use C#
to build a class library to be used by someone else's application, you're going to want to
make use of some already-compiled .NET code. That code may be supplied by
Microsoft as a part of the .NET Framework, or it may be supplied by a user over the
Internet. The key to using this external code is letting the C# compiler know what classes
and variables are in the other code base so that it can match up the source code you
write with the code found in the precompiled code base that you're working with. Think of
metadata as a "table of contents" for your compiled code. The C# compiler places
metadata in the compiled code along with the generated MSIL. This metadata accurately
describes all the classes you wrote and how they are structured. All of the classes'
methods and variable information is fully described in the metadata, ready to be read by
other applications. Visual Basic .NET, for example, may read the metadata for a .NET
library to provide the IntelliSense capability of listing all of the methods available for a
particular class. If you've ever worked with COM (Component Object Model), you may
be familiar with type libraries. Type libraries aimed to provide similar "table of contents"
functionality for COM objects. However, type libraries suffered from some limitations, not
the least of which was the fact that not all of the data relevant to the object was put into
the type library. Metadata in .NET does not have this shortcoming. All of the information
needed to describe a class in code is placed into the metadata. You can think of
metadata as having all of the benefits of COM type libraries without the limitations.
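
A short illustrative sketch of reading such metadata at run time with reflection, here inspecting the BCL's own String class:

using System;
using System.Reflection;

class MetadataDemo
{
    static void Main()
    {
        // Ask the metadata of a type for its public methods,
        // much as IntelliSense or another tool would.
        Type t = typeof(string);
        Console.WriteLine("Methods of " + t.FullName + ":");
        foreach (MethodInfo method in t.GetMethods())
            Console.WriteLine("  " + method.Name);
    }
}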

Just In Time Compilation


The .NET CLR utilizes Just In Time (JIT) compilation technology to convert the IL code
back to a platform/device–specific code. In .NET, you currently have three types of JIT
compilers:
Pre-JIT: This JIT compiles an assembly’s entire code into native code at one stretch.
You would normally use this at installation time.
Econo-JIT: You would use this JIT on devices with limited resources. It compiles the IL
code bit-by-bit, freeing resources used by the cached native code when required.
Normal JIT: The default JIT compiles code only as it is called and places the resulting
native code in the cache. In essence, the purpose of a JIT compiler is to bring higher
performance to interpreted code by placing the compiled native code in a cache, so that

30
when the next call is made to the same method/procedure, the cached code is executed,
resulting in an increase in application speed.

Garbage Collection
Memory management is one of those housekeeping duties that take a lot of
programming time away from developing new code while you track down memory leaks.
A day spent hunting for an elusive memory problem usually isn’t a productive day.
The .NET Framework hopes to do away with all of that within the managed environment with
the garbage collection system. Garbage collection runs when your application is
apparently out of free memory, or when it is implicitly called but its exact time of
execution cannot be determined. Let’s examine how the system works. When your
application requests more memory, and the memory allocator reports that there is no
more memory on the managed heap, garbage collection is called. The garbage collector
starts by assuming everything in memory is trash that can be freed. It then walks through
your application’s memory, building a graph of all memory that is currently referenced by
the application. Once it has a complete graph, it compacts the heap by moving all the
memory that is genuinely in use together at the start of the free memory heap. After this
is complete, it moves the pointer that the memory allocator uses to determine where to
start allocating memory from the top of this new heap. It also updates all of your
application’s references to point to their new locations in memory. This approach is
commonly called a mark and sweep implementation. The exception to this is with
individual objects over 20,000 bytes. Very large objects are allocated from a different
heap, and when this heap is garbage collected, they are not moved, because moving
memory in chunks of this size would have an adverse effect on application performance.
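
A small illustrative sketch using the BCL's GC class (forcing a collection is normally unnecessary, since the runtime decides when to run; this is only to make the behaviour visible):

using System;

class GcDemo
{
    static void Main()
    {
        Console.WriteLine("Memory in use: {0} bytes", GC.GetTotalMemory(false));

        // Allocate some short-lived objects that become garbage immediately.
        for (int i = 0; i < 10000; i++)
        {
            byte[] buffer = new byte[1024];
        }

        GC.Collect();                      // explicitly request a collection
        GC.WaitForPendingFinalizers();
        Console.WriteLine("Memory after collection: {0} bytes", GC.GetTotalMemory(true));
    }
}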

TECHNOLOGY: ASP.NET


ASP.NET adds many features to and enhances many of the capabilities in classic ASP.
ASP.NET isn’t merely an incremental improvement to ASP; it’s really a completely new
product, albeit a new product designed to allow the same development experience that
ASP developers have enjoyed. ASP.NET is a set of components that provide developers
with a framework with which to implement complex functionality. Two of the major
improvements of ASP.NET over traditional ASP are scalability and availability. ASP.NET
is scalable in that it provides state services that can be utilized to manage session
variables across multiple Web servers in a server farm. Additionally, ASP.NET
possesses a high performance process model that can detect application failures and
recover from them. Along with improved availability and scalability, ASP.NET provides
the following additional benefits:

Simplified development: ASP.NET offers a very rich object model that developers can
use to reduce the amount of code they need to write.

Language independence: ASP pages must be written with scripting. In other words,
ASP pages must be written in a language that is interpreted rather than compiled.
ASP.NET allows compiled languages to be used, providing better performance and
cross-language compatibility.

31
Simplified deployment: With .NET components, deployment is as easy as copying a
component assembly to its desired location.

Cross-client capability: One of the foremost problems facing developers today is


writing code that can be rendered correctly on multiple client types. For example, writing
one script that will render correctly in Internet Explorer 5.5 and Netscape Navigator 4.7,
and on a PDA and a mobile phone is very difficult, if not impossible, and time
consuming. ASP.NET provides rich server-side components that can automatically
produce output specifically targeted at each type of client.

Web services: ASP.NET provides features that allow ASP.NET developers to


effortlessly create Web services that can be consumed by any client that understands
HTTP and XML, the de facto language for inter-device communication.
Performance: ASP.NET pages are compiled whereas ASP pages are interpreted.
When an ASP.NET page is first requested, it is compiled and cached, or saved in
memory, by the .NET Common Language Runtime (CLR). This cached copy can then
be re-used for each subsequent request for the page. Performance is thereby improved
because after the first request, the code can run from a much faster compiled version.
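
As a minimal sketch of an ASP.NET page written in C# (a hypothetical code-behind for a page such as the job search screen of this portal; the class and control names are assumptions, not the project's actual code):

using System;
using System.Web.UI;

// Hypothetical code-behind class for a page like SearchJob.aspx.
public partial class SearchJob : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Runs only on the first request for the page; later postbacks skip this block.
            Response.Write("Welcome to Xplore-Jobs job search");
        }
    }
}

On the first request the page is compiled by the CLR and cached; subsequent requests run the cached, compiled version, which is where the performance benefit described above comes from.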

PROGRAMMING LANGUAGE: C#.NET


Introducing C#
C#, the new language introduced in the .NET Framework, is derived from C++.
However, C# is a modern, object-oriented (from the ground up), type-safe language.

Language features
The following sections take a quick look at some of the features of the C# language
Classes
All code and data in C# must be enclosed in a class. You can't define a variable outside
of a class, and you can't write any code that's not in a class. Classes can have
constructors, which execute when an object of the class is created, and a destructor,
which executes when an object of the class is destroyed. Classes support single
inheritance, and all classes ultimately derive from a base class called object. C#
supports versioning techniques to help your classes evolve over time while maintaining
compatibility with code that uses earlier versions of your classes. As an example, take a
look at a class called Family. This class contains two fields that hold the first
and last name of a family member as well as a method that returns the full name of the
family member.
class Family
{
    public string FirstName;
    public string LastName;
    public string FullName()
    {
        return FirstName + " " + LastName;
    }
}
Note Single inheritance means that a C# class can inherit from only one base
class.
C# enables you to group your classes into a collection of classes called a namespace.

32
Namespaces have names, and can help organize collections of classes into logical
groupings. The .NET Framework's own classes, for example, are organized into namespaces such as System and Microsoft.
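
As an illustrative sketch (not taken from the project's source), the Family class above could be placed in a hypothetical XploreJobs.Domain namespace and used as follows:

using System;

namespace XploreJobs.Domain    // hypothetical namespace, for illustration only
{
    class Family
    {
        public string FirstName;
        public string LastName;
        public string FullName()
        {
            return FirstName + " " + LastName;
        }
    }

    class Program
    {
        static void Main()
        {
            // Create an instance of the class and call its method.
            Family member = new Family();
            member.FirstName = "Amit";
            member.LastName = "Yadav";
            Console.WriteLine(member.FullName());   // prints "Amit Yadav"
        }
    }
}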

Data types
C# lets you work with two types of data: value types and reference types. Value types
hold actual values. Reference types hold references to values stored elsewhere in
memory. Primitive types such as char, int and float, as well as enumerated values and
structures, are value types. Reference types hold variables that deal with objects and
arrays. C# comes with predefined reference types (object and string), as well as
predefined value types (sbyte, short, int, long, byte, ushort, uint, ulong, float, double,
bool, char, and decimal). You can also define your own value and reference types in
your code. All value and reference types ultimately derive from a base type called object.
C# allows you to convert a value of one type into a value of another type. You
can work with both implicit conversions and explicit conversions. Implicit conversions
always succeed and don't lose any information (for example, you can convert an int to a
long without losing any data because a long is larger than an int). Explicit conversions
may cause you to lose data (for example, converting a long into an int may result in a
loss of data because a long can hold larger values than an int). You must write a cast
operator into your code to make an explicit conversion happen.
You can work with both one-dimensional and multidimensional arrays in C#.
Multidimensional arrays can be rectangular, in which each of the arrays has the same
dimensions, or jagged, in which each of the arrays has different dimensions. Classes
and structures can have data members called properties and fields. Fields are variables
that are associated with the enclosing class or structure. Properties are like fields, but
enable you to write code to specify what should happen when code accesses the value.
If the employee's name must be read from a database, for example, you can write code
that says, "When someone asks for the value of the Name property, read the name from
the database and return the name as a string."
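
The following is a small illustrative sketch (not code from the report) of value and reference types, implicit and explicit conversions, and a property whose accessor runs code; the Employee class and its members are hypothetical:

using System;

class Employee
{
    private string name = "Unknown";

    // Property: code runs whenever the value is read or written.
    public string Name
    {
        get { return name; }          // could instead read the name from a database
        set { name = value; }
    }
}

class TypeDemo
{
    static void Main()
    {
        int i = 123;
        long l = i;                   // implicit conversion: no data can be lost
        long big = 5000000000;
        int truncated = (int)big;     // explicit conversion: a cast is required, data may be lost

        Employee e = new Employee();  // reference type
        e.Name = "Jobseeker";
        Console.WriteLine("{0} {1} {2} {3}", i, l, truncated, e.Name);
    }
}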

Functions
A function is a callable piece of code that may or may not return a value to the code that
originally called it. An example of a function would be the FullName function shown
earlier in the Family class. A function is generally associated with pieces of
code that return information, whereas a method generally does not return information.
For our purposes, however, we generalize and refer to them both as functions.
C# and the CLR work together to provide automatic memory management. You
don't need to write code that says "allocate enough memory for an integer" or "free the
memory that this object was using." The CLR monitors your memory usage and
automatically retrieves more when you need it. It also frees memory automatically when
it detects that it is no longer being used (this is also known as garbage collection). C#
provides a variety of operators that enable you to write mathematical and bitwise
expressions. Many (but not all) of these operators can be redefined, enabling you to
change how the operators work. Classes can contain code and data. Each class
member has something called an accessibility scope, which defines the member's
visibility to other objects.

Variables
Variables can be defined as constants. Constants have values that cannot change
during the execution of your code. The value of pi, for instance, is a good example of a
constant, because its value won't be changing as your code runs. Enum type

33
declarations specify a type name for a related group of constants. C# provides a built-in
mechanism for defining and handling events. If you write a class that performs a lengthy
operation, you may want to invoke an event when the operation is completed. Clients
can subscribe to that event and catch the event in their code, which enables them to be
notified when you have completed your lengthy operation. The event handling
mechanism in C# uses delegates, which are variables that reference a function. Note an
event handler is a procedure in your code that determines the actions to be performed
when an event occurs, such as the user clicking a button. If your class holds a set of
values, clients may want to access the values as if your class were an array. You can
write a piece of code called an indexer to enable your class to be accessed as if it were
an array. Suppose you write a class called Rainbow, for example, that contains a set of
the colors in the rainbow. Callers may want to write MyRainbow[0] to retrieve the first
color in the rainbow. You can write an indexer into your Rainbow class to define what
should be returned when the caller accesses your class, as if it were an array of values.
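
The following sketch (illustrative only; the class and member names are not from the project) pulls these features together: a constant, an enum, a delegate-based event, and the Rainbow indexer described above:

using System;

class Rainbow
{
    public const int ColorCount = 7;                     // constant: its value never changes

    private string[] colors = { "Red", "Orange", "Yellow",
                                "Green", "Blue", "Indigo", "Violet" };

    // Indexer: lets callers write myRainbow[0] as if the class were an array.
    public string this[int index]
    {
        get { return colors[index]; }
    }

    // Delegate and event: clients can subscribe and be notified.
    public delegate void WorkDoneHandler(string message);
    public event WorkDoneHandler WorkDone;

    public void DoLengthyWork()
    {
        // ... lengthy operation ...
        if (WorkDone != null)
            WorkDone("Finished listing colors");         // raise the event
    }
}

enum Weekday { Monday, Tuesday, Wednesday, Thursday, Friday }

class RainbowDemo
{
    static void Main()
    {
        Rainbow myRainbow = new Rainbow();
        myRainbow.WorkDone += delegate(string msg) { Console.WriteLine(msg); };
        Console.WriteLine(myRainbow[0]);                  // prints "Red"
        myRainbow.DoLengthyWork();
        Console.WriteLine(Weekday.Friday);
    }
}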

Interfaces
C# supports interfaces, which are groups of properties, methods, and events that specify
a set of functionality. C# classes can implement interfaces, which tell users that the class
supports the set of functionality documented by the interface. You can develop
implementations of interfaces without interfering with any existing code, which minimizes
compatibility problems. Once an interface has been published, it cannot be changed, but
it can evolve through inheritance. C# classes can implement many interfaces, although
the classes can only inherit from a single base class.
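
A brief sketch of an interface and an implementing class (the names are illustrative, not taken from the project):

using System;

interface IResumeSource
{
    string GetResumeSummary();    // any implementing class must provide this method
}

class Jobseeker : IResumeSource
{
    public string GetResumeSummary()
    {
        return "3 years of ASP.NET experience";
    }
}

class InterfaceDemo
{
    static void Main()
    {
        // Use the class through its interface.
        IResumeSource source = new Jobseeker();
        Console.WriteLine(source.GetResumeSummary());
    }
}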

Attributes
Attributes declare additional information about your class to the CLR. In the past, if you
wanted to make your class self-describing, you had to take a disconnected approach in
which the documentation was stored in external files such as IDL or even HTML files.
Attributes solve this problem by enabling you, the developer, to bind information to
classes — any kind of information. For example, you can use an attribute to embed
documentation information into a class. Attributes can also be used to bind runtime
information to a class, defining how it should act when used. The possibilities are
endless, which is why Microsoft includes many predefined attributes within the .NET
Framework.
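
For example, the predefined Obsolete attribute binds usage information to a member, and a custom attribute can embed documentation information; in the sketch below the Author attribute is hypothetical, not part of the .NET Framework:

using System;

// A custom attribute used to embed documentation information in a class.
[AttributeUsage(AttributeTargets.Class)]
class AuthorAttribute : Attribute
{
    public string Name;
    public AuthorAttribute(string name) { Name = name; }
}

[Author("Amit Kumar Yadav")]
class ResumeParser
{
    [Obsolete("Use ParseResume instead.")]     // predefined attribute from the BCL
    public void OldParse() { }

    public void ParseResume() { }
}

class AttributeDemo
{
    static void Main()
    {
        // Attributes can be read back at run time through reflection.
        object[] attrs = typeof(ResumeParser).GetCustomAttributes(typeof(AuthorAttribute), false);
        Console.WriteLine(((AuthorAttribute)attrs[0]).Name);
    }
}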

SQL SERVER 2005


A database is stored in a very structured manner. Each database requires some way for
a user to interact with the information within. Such interaction is performed by a
database management system (DBMS). SQL Server is a member of a large category of
products known as database management systems (DBMS). The general purpose of a
DBMS is to provide for the definition, storage, and management of data in a centralized
area that can be shared by many users. SQL Server’s database management system is
patterned on the relational model. Relational databases allow us to store vast amounts
of data with far simpler maintenance and smaller storage requirements than the
equivalent flat database. Relations among tables in a relational database are established
using keys. A primary key is a field that uniquely identifies a record so it can be
referenced from a related table. A foreign key is a field that holds identification values to
relate records stored on other tables.
Querying the database

34
With each query of the database, we form a virtual table that contains the results of our
query. Database queries are made with a specific language named SQL (structured
query language).
SQL Server 2000 has many performance improvements and features which allow us to build and manage large databases, query them fast, insert data into them at high rates, partition them for fast loading and backup, and store very large objects or whole files. The database is kept central, shared, accessible, backed up, and versioned. SQL, as a relational data language, supports certain basic functions to control, define, and manipulate data. SQL uses the term row to refer to a database record and the term column to refer to a database field.
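
As a hedged sketch of how the Web-portal's C# code might query such a database with ADO.NET (the connection string and the Jobdetails table and column names are assumptions, loosely based on the table list later in this report):

using System;
using System.Data.SqlClient;

class QueryDemo
{
    static void Main()
    {
        // The connection string is an assumption; adjust the server, database, and credentials.
        string connectionString =
            "Data Source=localhost;Initial Catalog=XploreJobs;Integrated Security=True";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            // Parameterized query against the (assumed) Jobdetails table.
            SqlCommand command = new SqlCommand(
                "SELECT JobTitle FROM Jobdetails WHERE City = @city", connection);
            command.Parameters.AddWithValue("@city", "Allahabad");

            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader["JobTitle"]);
            }
        }
    }
}

The using blocks ensure that the connection and the reader are closed even if an exception occurs.
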
The Data Types that a Cell Can Hold

CHAR (size): This data type is used to store character string values of fixed length. The size in brackets determines the number of characters the cell can hold; the maximum is 255 characters.

VARCHAR (size): This data type is used to store variable-length alphanumeric data. The maximum this data type can hold is 2000 characters.

int (P, S): This data type is used to store numbers (fixed or floating point). Numbers of virtually any magnitude may be stored, up to 10 digits of precision.

DATE: This data type is used to represent date and time. The standard format is DD-MON-YY, as in 24-JAN-03. To enter dates in other than the standard format, use the appropriate functions. DateTime stores dates in the 24-hour format.

LONG: This data type is used to store variable-length character strings containing up to 2 GB. LONG data can store arrays of binary data in ASCII format.

SYSTEM DESIGN
System development can generally be thought of as having two major
components: analysis and design. Systems development consists of the following
activities:
• Preliminary investigation.
• Determination of system requirements.
• Design of system.
• Development of software.
• System testing.
• Implementation and evaluation.

35
Analysis
System analysis is conducted with the following objectives:
• Identify the user’s need.
• Evaluate the system concept for feasibility.
• Perform economic and technical analysis.
• Allocate functions to hardware, software, people, database and other system
elements.
• Establish cost and schedule constraints.
Both hardware and software expertise are required to successfully attain the objectives
listed above.

System Design
Software design is a multi step process, which focuses on distinct attributes of program:
data structure, software architecture, interface representation, and procedural detail. The
design process translates requirements into a representation of the software that can be
assessed for quality before coding begins. According to requirement, the design is
documented. The design must be translated into a machine-readable form; the code
generation step performs this task. If the design is performed in a detailed manner,
code generation can be accomplished mechanically.

The design phase is the first step while moving from the problem domain to
solution domain. This phase begins when the requirement specification document for the
newly developing system is available. The aim of this phase is to produce a model or a
representation of the system, which is useful to have an overall look of the system
without even developing the entire system. This model of the system is a plan for
developing the system. There are three characteristics for the evaluation of a good
design.
• The design should implement all the requirements of the user which are specified in
the SRS, and it must accommodate any requirements that may not have been specified by
the Recruiter in the analysis phase.
• The design should be readable and understandable by the people involved in
developing and testing the system.
• The design should provide a complete picture of the system, addressing the data
and functional domains from the implementation perspective.

36
DESIGN METHODOLOGY
The following approach, called the "iterative waterfall model", is used to design this
system. It is the same as the waterfall model, but the difference is that we can go back
to any earlier phase from any phase:

1. System / information engineering and modeling.


2. Software requirement analysis.
3. Design.
4. Code generation.
5. Testing.
6. Maintenance.

Figure: Iterative waterfall model (System Engineering, Analysis, Design, Code, Testing, Maintenance).

Data Flow Diagram

Zero Level DFD:

Figure: Zero-level DFD. The Jobseeker and the Recruiter both interact with the Xplore-Jobs system.
37
1st Level DFD

Figure: First-level DFD. The Jobseeker places resume and applicant information into Xplore-Jobs and the Recruiter places requirements; the system stores this information in its database and supports contacting applicants, interviewing, invoicing, and payment to the consultant.

2nd Level DFD

Figure: Second-level DFD. The Recruiter places a requirement with Xplore-Jobs (HRMS), which generates a category-wise job specification, records the job entry, and sends it to the consultant. The Jobseeker gathers information about the consultant, prepares a resume according to the job profile, enters personal and technical information, and places the resume with Xplore-Jobs. Xplore-Jobs matches the job requirements with resumes, contacts suitable jobseekers through phone or e-mail, selects applicants according to the job profile, and sends information about the selected applicants to the Recruiter. The Recruiter selects the applicants sent by the consultant, fixes and takes interviews, makes the final selection, and informs the applicants. Xplore-Jobs sends information about the selection of applicants and prepares an invoice; the Recruiter checks the invoice against the requirement and makes the payment to the consultant accordingly.
DATABASE DESIGN
A database management system (DBMS) consists of a collection of interrelated data
and a set of programs to access those data. The collection of data, usually referred to as
the database, contains information about one particular enterprise. The primary goal of a
DBMS is to provide an environment that is both convenient and efficient to use in
retrieving and storing database information.

Database systems are designed to manage large bodies of information. The


management of data involves both the definition of storage of information and the
provision of mechanisms for the manipulation of information. We used relational
database management system (RDBMS) for developing this system. The goal of a
relational- database management system (RDBMS) design is to generate a set of

42
relation schemas that allows us to store information without unnecessary redundancy. It
also allows us to retrieve information easily.
Redundancy
Redundancy means repetition of information i.e., same information may be written or
stored in many places (files). This redundancy may lead to data inconsistency, i.e., the
various copies of the same data may no longer agree with each other. When we access such
inconsistent data, the system may give wrong information. To reduce data redundancy we use the concept of
normalization.
Normalization
Normalization of data is a process in which unsatisfactory relation schemas are
decomposed by breaking up their attribute into smaller relation schemas that possess
desirable properties. Normal forms provide a formal framework for analyzing relation
schemas based on their keys and the functional dependencies among attributes to
database designers. The concurrent process model is often used as the paradigm for
the development of Recruiter server system that is composed of a set of functional
component. When applied to Recruiter /server, the concurrent process model defines
activity in two dimensions--a system dimension and a component dimension. System
level issues are addressed using three activities: design, assembly, and use. The
component dimension is addressed with two activities: design and realization.
Concurrency is achieved in two ways:
• System and activities occur simultaneously and can be modeled using the state-
oriented approach described previously.
• A typical Recruiter server application is implemented with many components, each

Of which can be designed and realized concurrently.

Introduction to data dictionary


Data dictionaries are an integral component of structured analysis, since data flow
diagrams by themselves do not fully describe the subject of the investigation. The data
dictionary provides these additional details about the project/system.

Data Dictionary (Definition)


A data dictionary is a catalog, a repository, of the elements in a system. These elements
center on the data and the way they are structured to meet user requirements and
organization needs. A data dictionary consists of a list of all the elements composing the
data flowing through a system. The major elements are data flow, data stores, and
processes. The data dictionary stores details and descriptions of these elements.

Describing Data Elements


Each entry in the data dictionary consists of a set of details describing the data used or
produced in the system. Each item is identified by a data name, description, alias, and
length and has specific values that are permissible for it in the system being studied.

LIST OF TABLES

• CityRecord table
• CompanyLogin table
• Companydetails table
• Jobdetails table
• Jobuserdetails table
• jobseekerdetails table
• Jobseekerqualificationdetail table
• resumeRecord table
• Qualificationset table
• skillset table
RELATIONSHIP IN TABLES

49
OPTIMIZATION OF CODE
The code of any application must be optimized. Optimized code affects the efficiency
of the code and therefore of the application. To optimize the code, we have used
generalized functions, and those functions are reused throughout the entire application.
Optimized code is also useful in terms of storage space: it requires less hard disk space
than unoptimized code. It increases the speed with which the compiler compiles the
program, and it needs less memory to run the application. An optimized program is also
easier to work with when further assistance has to be given to the customer.

50
SCREENSHOTS

51
Home page of the site

52
Jobseeker Login page of the site

53
Registration page of the Jobseeker

54
Registration page of the Jobseeker

55
Registration page of the Jobseeker

56
Registration page of the Jobseeker

57
User Account after login

58
Page for blocking a company by the user

59
Logout page of the site

60
Recruiter Login page of the site

61
Recruiter Registration

62
Recruiter Registration

63
Successful Recruiter Registration page of the site

64
Forgot password page of the site

65
66
Search Job Page

67
Contact page of the site

68
Post Job page of the site

69
TESTING STRATEGIES
TESTING TECHNIQUES AND TESTING STRATEGIES

The following rules can serve well as testing objectives:
i) Testing is a process of executing a program with the
intent of finding an error.
ii) A good test case is one that has a high probability of
finding an as-yet-undiscovered error.
iii) A successful test is one that uncovers an as-yet-undiscovered
error.
There are two types of testing techniques:

i) White box testing.


ii) Black box testing.
White box testing:
White box testing focuses on the program control structure. Test cases are derived
to ensure that every statement in the program has been executed at least once during
testing and that every logical condition has been exercised. Basis path testing, a white
box technique, makes use of the program graph to derive the set of linearly
independent paths that will ensure coverage.

Condition Testing:
Condition testing is a test case design method that exercises the logical conditions
contained in a program module. A simple condition is a Boolean variable or a relational
expression.
Branch Testing:
I have used branch testing for compound conditions, exercising the true and false
outcomes (in this project, for example, null values) of each branch.
Data Flow Testing:
I have used data flow testing to check the paths of the program according to the
locations of definitions and uses of variables in the program.
Loop Testing:
In this project I have used only simple loops, and I have made m passes through each
loop of n iterations, where m < n.

BLACK BOX TESTING:


Black box testing focuses on the functional requirements of the
software. That is, black-box testing enables the software engineer to

70
derive sets of input conditions that will fully exercise all functional
requirements for a program.
Graph-Based Testing Method:
I have used the graph-based testing method for removing errors
associated with relationships. The first step in this testing is to
understand the objects that are modeled in the software and the
relationships that connect these objects.
Equivalence Partitioning:
This testing is used for the following kinds of input:
1. specific numeric values
2. ranges of values
3. sets of related values
4. Boolean conditions
For example: checks on phone number, code generation, share type, payment type,
password, etc.

Boundary Value Analysis:


Boundary value analysis is a test case design technique that
complements equivalence partitioning. Rather than selecting any
element of an equivalence class, it selects test cases at the
edges of the class, and rather than focusing solely on input
conditions, it also derives test cases from the output domain.
The guidelines of boundary value analysis are:
1) If an input condition specifies a range bounded by values a and
b, test cases should be designed with the values a and b and with values just
above and just below a and b.
2) If an input condition specifies a number of values, test cases
should be developed that exercise the minimum and maximum
numbers.
A small sketch of this idea follows.
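
For instance, a hedged sketch of how boundary values might be chosen for one field of this project (the valid range of 6 to 20 characters for a password is an assumed requirement, not taken from the report):

using System;

class BoundaryValueDemo
{
    // Assumed rule: a password must be between 6 and 20 characters long.
    static bool IsValidPassword(string password)
    {
        return password != null && password.Length >= 6 && password.Length <= 20;
    }

    static void Main()
    {
        // Boundary value analysis picks lengths at and just beyond the edges of the range.
        int[] lengths = { 5, 6, 7, 19, 20, 21 };
        foreach (int length in lengths)
        {
            string candidate = new string('x', length);
            Console.WriteLine("Length {0}: valid = {1}", length, IsValidPassword(candidate));
        }
    }
}
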
TESTING STRATEGIES
System Testing
During system testing the system is used experimentally to ensure that the software
does not fail, i.e., that it will run according to its specifications and in the way users
expect. Special test data are input for processing and the results are examined. A
limited number of users try to use it in unforeseen ways. It is preferable to discover any
surprises before the organization implements the system and depends upon it.

In many organizations, testing is performed by persons other than those who wrote the
original programs, to ensure more complete, unbiased, and reliable testing.
The norms followed during this phase were that after the developer of the software was
satisfied with the software under consideration, he was required to release the program
source code. A setup named Release was used to copy the released files from the
developer's user area to a project area, into a directory named after the developer's
user name. All the final testing was then done by persons other than the developer
himself. If some changes were desired in the program, the developer was required to use
another setup, Retrieve, which copied the latest version of the program back to the
developer's area.
As data is entered at different levels in this system, I considered providing various types
of checks, such as range checks, validity checks, and completeness checks, in the
different data entry screens according to the requirements.
Since the users are not familiar with the new system, the data entry screens were designed
in such a way that they are:
• consistent
• easy to use
• fast in response
These conventions were followed while designing the various screens.
Unit Testing:
In unit testing I have tested a single program module in an isolated environment.
Testing of the processing procedures is the main focus.
Integration Testing:
Because of the interfaces among the system modules, integration testing is used. In
other words, it ensures that the data moving between the modules is handled as intended.
System Testing:
System testing is the testing of the system against its initial objectives. It is done either
in a simulated environment or in a live environment.
Test Review:
Test review is the process that ensures that testing is carried out as planned. The test
review decides whether or not the program is ready to be shipped out for implementation.
Security Testing:
Security testing attempts to verify that the protection mechanisms built into a system
will, in fact, protect it from penetration attempts.

SYSTEM IMPLEMENTATION

Implementation is the process of having system personnel check out and put new
equipment into use, train users, install the new application, and construct any files of
data needed to use it. Depending on the size of the organization that will be involved in
using the application and the risk associated with its use, developers may choose to
pilot the operation in only one area of the firm, say in one department or with only one
or two persons. Sometimes they will run the old and the new systems together to compare
the results. In still other situations developers will stop using the old system one day
and begin to use the new one the next day. Each implementation strategy has its merits,
depending on the particular situation in which it is considered. Regardless of the
implementation strategy used, developers strive to ensure that the system as initially
used is trouble free.

Once installed, applications are often used for many years. However, both the
organization and the users will change, and the environment will be different over weeks
and months. Therefore the application will undoubtedly have to be maintained:
modifications and changes will be made to the software, files, or procedures to meet
emerging user requirements. Since organization systems and the business environment
undergo continual change, the information system should keep pace. In this sense,
implementation is an ongoing process.

DATA LOADING
1. The whole database created earlier for development purposes was dropped and then,
using CASE tools, the entire database was freshly created.
2. The required data files, executables, and scripts were released onto the client server
from the developer's machine.

USER TRAINING
Users of the proposed system had already got a feel of the system during the
development stages. They were given user documentation, which gave them the exact
steps to be performed for getting their job done, starting from switching the terminals
on. Most of the users were quick to get their job done in the right way after the very
first training class. They were told the explicit advantages of the new system and also
the areas in which it still had shortcomings. After this came the stabilization of the
system, as the users started to give new suggestions and requirements. For us, the
maintenance phase had begun.

Scope of future enhancement


It is unreasonable to consider a computer based information system
complete or finished; the system continues to evolve throughout its
life cycle, even if it’s successful. Due to the creative nature of the
design, there remain some lapses and inaccuracies in communication between the users
and the developers. So certain aspects of the system must be modified as operational
experience is gained with it. As users work with the system, they develop ideas for
changes and enhancements.
Maintenance of the project is very easy due to its modular design and concept; any
modification can be done very easily. All the data are stored in the software as per user
need, and if the user wants to change something, he has to change only that particular
data, and the change will be reflected everywhere in the software. Some of the
maintenance applied is:

73
(1) BREAKDOWN MAINTENANCE:
This maintenance is applied when an error occurs, the system halts, and further
processing cannot be done. At this point the user can view the documentation or consult
us for rectification, and we will analyze and change the code if needed. Example: if the
user gets the error "report width is larger than paper size" while printing a report and
reports cannot be generated, then viewing the help documentation and changing the
paper size of the default printer to 'A4' will rectify the problem.
(2) PREVENTIVE MAINTENANCE:
The user carries out this maintenance at regular intervals for the smooth functioning
(operation) of the software, as per the procedure and steps mentioned in the manual.
Some reasons for maintenance are:
(a) Error correction: Errors which were not caught during testing surface after the system
has been implemented. Rectification of such errors is called corrective maintenance.
(b) New or changed requirements: When business requirements change due to changing
opportunities.
(c) Improved performance or maintenance requirements: Changes that are made to
improve system performance or to make it easier to maintain in the future are called
preventive maintenance.
(d) Advances in technology (adaptive maintenance): Adaptive maintenance includes all
the changes made to a system in order to introduce a new technology.

COST ESTIMATION OF THE PROJECT

I am using the COCOMO model for estimating the cost of the system. The system is
regarded as a semidetached project. Since this project is somewhat small, the COCOMO
estimate might be inaccurate; COCOMO is designed for use on systems larger than 2 KDL.
This model estimates the total effort in terms of person-months of technical project staff.
It does not include the cost of the secretarial staff that might be needed. The basic steps
in this model are:

1) Obtain an initial estimate of the development effort from the estimate of thousands of
delivered lines of source code (KDL).
2) Determine a set of multiplying factors from different attributes of the project.
3) Adjust the effort estimate by multiplying the initial estimate by all the multiplying
factors.

The initial estimate is determined by an equation of the form used in the static,
single-variable models, using KDL as the measure of size. To determine the initial effort
Ei in person-months, the equation used is of the type

Ei = a * (KDL)^b

There are 15 different attributes, called cost driver attributes, that determine the
multiplying factors. These factors depend on the product, computer, personnel, and
technology. All the factors are multiplied together to get the effort adjustment factor
(EAF). The final cost estimate, E, is obtained by multiplying the initial estimate by the EAF:

E = EAF * Ei

Cost estimation
The size estimate for the system in lines of code is
6803 = 6.803 KDL
The category of the project is semidetached, so the constants a and b are as follows:
a = 3.0 and b = 1.12
So, Ei = 3.0 * (6.803)^1.12
       = 3.0 * 8.563
       = 25.689 person-months
Rating of multipliers for the different cost drivers:

Cost driver                        Rating       Value
Software reliability               very high    1.40
Database size                      high         1.08
Product complexity                 high         1.15
Computer turnaround time           very high    1.15
Analyst capability                 high         0.86
Application experience             nominal      1.00
Programmer capability              high         0.86
Programming language experience    high         0.95
Modern programming practices       high         0.95
Use of software tools              low          1.10
Development schedule               nominal      1.00

The effort adjustment factor (EAF) is
EAF = 1.40 * 1.08 * 1.15 * 1.15 * 0.86 * 1.00 * 0.86 * 0.95 * 0.95 * 1.10 * 1.00
    = 1.47 (approximately)
The final effort estimate for the project is therefore
E = Ei * EAF
  = 25.689 * 1.47
  = 37.7 PM (approximately)
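
To make the arithmetic above easy to reproduce, here is a small C# sketch (not part of the project's code) that evaluates the same COCOMO equations with the constants and cost-driver values listed in this section; it carries full precision, so its output matches the hand calculation up to rounding:

using System;

class CocomoEstimate
{
    static void Main()
    {
        double kdl = 6.803;                  // size in thousands of delivered lines
        double a = 3.0, b = 1.12;            // semidetached-mode constants

        double initialEffort = a * Math.Pow(kdl, b);     // Ei = a * (KDL)^b

        // Cost driver multipliers from the table above.
        double[] drivers = { 1.40, 1.08, 1.15, 1.15, 0.86, 1.00,
                             0.86, 0.95, 0.95, 1.10, 1.00 };
        double eaf = 1.0;
        foreach (double d in drivers)
            eaf *= d;

        double effort = initialEffort * eaf;             // E = EAF * Ei

        Console.WriteLine("Ei  = {0:F3} person-months", initialEffort);
        Console.WriteLine("EAF = {0:F2}", eaf);
        Console.WriteLine("E   = {0:F1} person-months", effort);
    }
}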

SECURITY
There are several reasons why you should think of security when you want to
present a Web application.
Reasons for Security
One reason is that you might want to prevent access to some areas of your Web
server. Different groups of users might have different access rights to different
areas or virtual directories of your application. You also need security when you
have to record and store security-relevant user data. This data has to be protected
against public and unwanted access.

SECURITY CONFIGURATION
The Web.Config file has already been introduced. So it may be enough to
mention that all security-related configuration information in ASP.NET is
contained in this file. You have the ability to configure three fundamental
functions for ASP.NET security: authentication, authorization, and impersonation.
Therefore your Web.config will have three additional sections contained in the
parent <system.web> element.
Authentication, Authorization, Impersonation
• Authentication—all your Web clients communicate with your Web application
through IIS. So you can use IIS authentication (Basic, Digest, and

76
NTLM/Kerberos) in addition to the ASP.NET built-in authentication solutions
(Passport and Cookie).
• Authorization—once a client request is authenticated, authorization
determines whether this identity is allowed to have access to the requested
resource.
• Impersonation—through impersonation ASP.NET applications can get the
identity of a client and behave like the client on whose behalf they are now
operating.

Code Access Security


In ASP.NET you can make use of .NET Framework features. So you have
access to the security solutions that the runtime provides. These are, for
example, code access or role-based security. Code access security is a way to
protect your server from malicious mobile code and allows benevolent mobile
code to run safely. Code access security gives you an answer to the question:
“Are you the code you told me you were?”
The presented security solutions only work for ASP.NET resources. Other file
types such as .gif, .txt, or .asp are still accessible, but you can map such files
explicitly to the ASP.NET security system by configuring IIS.
AUTHENTICATION
ASP.NET supports three authentication providers. These providers validate the
credentials a client sends along with a request against some authority. If the
credentials are valid, an authenticated identity is given to the client.
The three providers are:
• Windows authentication, which works hand in hand with IIS security.
• Passport authentication, which is a centralized authentication service
provided by Microsoft.
• Cookie authentication, which issues a cookie to the request/response that
contains the credentials for reacquiring the identity.
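
As a hedged illustration of the cookie-based (forms) provider, a login page's code-behind might validate the credentials and issue the authentication cookie with the FormsAuthentication class from System.Web.Security; the ValidateUser method below is a hypothetical application-defined check, not part of the framework:

using System;
using System.Web.Security;
using System.Web.UI;

public partial class JobseekerLogin : Page
{
    protected void LoginButton_Click(object sender, EventArgs e)
    {
        string userId = Request.Form["UserId"];
        string password = Request.Form["Password"];

        if (ValidateUser(userId, password))
        {
            // Issue the forms-authentication cookie and return to the originally requested page.
            FormsAuthentication.RedirectFromLoginPage(userId, false);
        }
        else
        {
            Response.Write("Wrong user id or password entered.");
        }
    }

    // Hypothetical check against the jobseeker table; the real lookup is application-specific.
    private bool ValidateUser(string userId, string password)
    {
        return !string.IsNullOrEmpty(userId) && !string.IsNullOrEmpty(password);
    }
}

For this to take effect, the application's authentication mode must be set to Forms in Web.config.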

AUTHORIZATION
Once a client request is authenticated, the system determines whether access to
the requested resource can be granted or not.
ASP.NET distinguishes two types of authorization:
• File authorization
File authorization is active when using Windows authentication. To determine
whether access should be granted or not, a check against an access control
list (ACL) is done.
• URL authorization
Identities are mapped to pieces of the Uniform Resource Identifier (URI)
namespace using URL authorization to selectively allow access to parts of
the namespace.
IMPERSONATION
When using impersonations, IIS and Windows file access security come into
play. IIS authenticates the user using Basic, Digest, or Windows NTLM/Kerberos
authentication. IIS then passes a token to ASP.NET; the token is either
authenticated or unauthenticated.
ASP.NET now impersonates the given token, so that the ASP.NET application
can operate with the identity of the requesting client. The access to the requested

77
resource is permitted according to NTFS settings (obviously, the Web server file
system must be formatted as NTFS).

CODE ACCESS SECURITY


Besides the ASP.NET built-in security features, a developer can make use of
several security solutions of the .NET Framework. Here we focus on one of them:
code access security. With code access security you can admit code originating
from one computer system to be executed safely on another system. Therefore
the code’s identity and origin has to be verified.
To determine whether the code should be authorized or not, the runtime’s
security system walks the call stack, checking for each caller whether access to a
resource or performing an operation should be allowed.
In the .NET Framework you must specify the operations the code is allowed to
perform.

SECURITY MEASURES TAKEN


Only registered organizations will be able to use the application. The system
provides the facility to take backups and to restore the data. There is no redundancy in
the system at any stage. The structure of each table has been designed in such a way
that no extra encryption or decryption steps are needed in the processing of the data.
If a transaction completes, the record is saved in the concerned tables; otherwise it is not.

Many error handling techniques have been used in the system. No deadlock situation
arises at any stage in the system, because each and every user has his own permissible
rights for add/update/delete operations on the tables.

In "Xplore-Jobs" we have mainly taken three security measures, implemented through
the Web-portal code, to protect the data and the system from unauthorized users:
• Users of the system have to go through the login screen and must enter their user
id and password to reach the main menu. If anyone who does not know the password
gives a wrong user id and password, the system will report that wrong information was
entered. If he/she cancels the login prompt, the application will unload itself.
• Most important is the use of the SQL Server 2000 system. It has its own password;
only a valid user who knows the user id and password can enter the SQL Server
database. We also created triggers in SQL Server for security. A trigger is fired when
any update, modification, or deletion is done on any table. It keeps a record of the date,
time, user id, name of the table, old value, and altered value on which the operation was
done, and thus we can keep track of any discrepancy.

• The use of Windows XP will not allow unauthorized users to access the system.

• The above security measures stop unauthorized users from using the system. We make
two provisions in case of hardware failure. These are:

• For hardware failure, especially in the case of hard disk failure, SQL Server has
the option of a MIRROR hard disk, which backs up the records on another hard disk.
• In the case of server failure, Windows XP has the facility to run another CPU,
which is connected in parallel to the server, backs up all the data of the server, and is
invoked automatically in the case of server failure.

LIMITATIONS
Like any other standard application software (or any type of application in the real
world), this software too has limitations. Compared to the software on the market,
"Xplore-Jobs" right now has numerous limitations, but in the future, with enhancement,
it can serve as a real "Human Resource Management System" web-portal. Here is a
listing of the known confines of this application:
• As this is a web-based application, this web-portal does not yet provide the excellent
user interface and GUI features of modern web-portals present on the internet.
However, attempts have been made to provide a reasonably good user interface.
• Although security algorithms are used to encrypt and decrypt the user name
and password, there are still chances that the confidential information provided by
the user may get leaked or tampered with.
• As this is a live project, the administrator part still has to be made more effective to
perform functional tasks such as checking the validity of a genuine organization or user,
making payments, etc.
• Companies are registered through online registration, so there could be some fake
organizations; checks against their primary information are still to be made and are
not available in this project now.
• The whole of the registration process and the job searching is done on the website, so
there is a chance that a user may not be very familiar with the computer system.

FUTURE ENHANCEMENTS
The project involves transforming the already existing, manually operated system so that
it can be accessed easily. Efforts have been made to cover all user requirements to the
extent possible and to make it user friendly. Input screens have been designed in such a
way that the user has practically no problem in entering the information.

Advantages of the proposed system


The user can access the system from anywhere. If the user wants any query about the
Recruiters, Jobseekers, or requirements, then a few keystrokes on a computer keyboard
by an operator can make all the details available, helping him/her not to wander from
department to department to access the details. The paper work is greatly minimized.
Hardly any training needs to be given to the users (operators); with the interface, a
layman can have a pleasant experience working with the application.
A detailed analysis of Web-portal requirements would provide the necessary
information for estimates, but analysis often takes weeks or months to complete.
Therefore, we must examine the product and the problem it is intended to solve at the very
beginning of the project. At a minimum, the scope of the product must be established
and bounded. Scope is defined by answering the following questions:

Context: How does the Web-portal to be built fit into a larger system, product, or business
context, and what constraints are imposed as a result of that context?
Information objectives: What customer-visible data objects are produced as output
from the Web-portal? What data objects are required as input?
Function and performance: What functions does the Web-portal perform to transform
input data into output? Are any special performance requirements to be addressed?

In recent times, in India and also in other parts of the world, record handling and
maintaining accuracy have become cumbersome processes. This Web-portal will
assist the concerned Recruiter in maintaining their records and in searching for proper
matches as per the requirement with only a mouse click. Manually finding certain records
is a very tedious, cumbersome, and risky job. This Web-portal can well assist in proper
recruitment records management.

Annexure

80
DFD Notations:
DFD is also called a bubble chart; it is a simple notation that shows how a system can
be represented in terms of the input data to the system, the various processing steps
carried out on these data, and the output data generated by the system.

Primitive DFD Symbols

Dataflow
Entity
Process

Data store

Output

1.) Process Symbol:-


A process is represented using a circle. Circles are annotated with the names of the
corresponding functions.

2). External Entity Symbol:-


An external entity symbol is represented by the rectangle. These are external to the
software and interact with the system by inputting data to the system or by consuming
data produced by the system.

3). Data Flow Symbol:-


A directed arc or an arrow is used to show the flow of data between two processes
or between a process and an entity. Data flow symbols are annotated with the
corresponding data names.

81
4). Data Store Symbol:-
Open boxes are used to represent the data stores. A data store represents a
logical file, data structure, or a physical file on disk. Each data store is connected to a
process by means of a data flow symbol. The direction of data flow shows whether data
is written into or read from the store.

5). Output Symbol:-


This box represents data production during human computer interaction.

ER Diagram Notations
Entity
An entity is an object or concept about which you want to store information.

Weak Entity
A weak entity is dependent on another entity to exist.

Attributes
Attributes are the properties or characteristics of an entity.

Key Attribute
A key attribute is the unique, distinguishing characteristic of the entity. For example, a
student's roll number might be the student's key attribute.

Multivalued Attribute
A multivalued attribute can have more than one value. For example, an employee entity
can have multiple skill values.

Derived Attribute
A derived attribute is based on another attribute. For example, an employee's monthly
salary is based on the employee's annual salary.

Relationships
Relationships illustrate how two entities share information in the database structure.

Weak Relationship
To connect a weak entity with other entities, you should use an identifying (weak)
relationship.

BIBLIOGRAPHY

• BOOKS
- Developing Web Based Applications with Microsoft, PHI Publications
- ASP.NET 2.0 Website Programming: Problem - Design - Solution, by Marco Bellinaso
- ASP.NET Database Programming Weekend Crash Course, by Jason Butler and Tony Caudill
- System Analysis & Design, by Elias M. Awad
- SQL Server: The Complete Reference, by George Koch and Kevin Loney
- C# Bible, Wiley Publishing, Inc.
- Beginning Ajax with ASP.NET, by Wallace B. McClure, Scott Cate, Paul Glavich, and Craig Shoemaker

• WEB SITE
- www.msdn.microsoft.com
- www.support.microsoft.com
- www.altavista.com
- www.developer.com/net

APPENDIX

Text Box
A text Box control, sometimes called an edit field or edit control, displays information
entered at design time, entered by the user, or assigned to the control in code at run
time.
Command Button
Use a Command Button control to begin, interrupt, or end a process. When chosen, a
Command Button appears pushed in and so is sometimes called push button.
List Box
A List Box control displays a list of items from which the user can select one or more. If
the number of items exceeds the number that can be displayed, a scroll bar is
automatically added to the List Box control.
Label
A Label control is a graphical control you can use to display text that a user can’t change
directly.
Frame
A Frame control provides an identifiable grouping for controls. You can also use a
Frame to subdivide a form functionally – for example, to separate groups of Option
Button controls.
Drop down Box
A Drop down Box control combines the features of a text box and a list box. This control
allows the user to select an item either by typing text into the combo box, or by selecting
it from the list.

Timer
A Timer control can execute code at regular intervals by causing a Timer event to occur.
The Timer control, invisible to the user, is useful for background processing.
Picture Box
The primary use for the Picture Box control is to display a picture to the user. The actual
picture that is displayed is determined by the picture property. The picture property
contains the file name (and optional path) for the picture file that you wish to display.
Data Grid Control
The DataGrid control displays and operates on tabular data. It allows complete flexibility
to sort, merge, and format tables containing strings and pictures. When bound to a Data
control, DataGrid displays read-only data.

84
Line Control
A Line control is a graphical control displayed as a horizontal, vertical, or diagonal line.
You can use the Line control at design time to draw lines on forms.
Shape Control
The Shape control is a graphical control displayed as a rectangle, square, oval, or circle.
Date and Time Picker Control
A Date and Time Picker (DTP) control provides a simple and intuitive interface through
which to exchange date and time information with a user. For example, with a DTP
control you can ask the user to enter a date and then retrieve his or her selection with
ease.
Option Button
An Option Button control displays an option that can be turned on or off.

Image Control
Use the Image control to display a graphic. An Image control can display a graphic from
an icon, bitmap, or metafile, as well as from enhanced metafile, JPEG, or GIF files.
Check Box Control
A Check Box indicates whether a particular condition is on or off. We use check boxes
in an application to give users true/false or yes/no options. Because check boxes work
independently of each other, a user can select any number of check boxes at the same
time.

CONCLUSION
Web based applications are the magic of today's world. The object of the "Xplore-Jobs"
project is to harness the power of the Internet for a practical purpose. This report tries
to cover the concept extensively and to plant a seed of inquisitiveness in the minds of
its users.
We hope that the HR persons of the IT industry and IT jobseekers will make maximum
use of our project, and we will keep on adding new facilities to make it very useful for
jobseekers and HR persons of other disciplines and industries, no matter which field
they are from. The basic idea of this project is to explain the fundamental concepts of a
B2B web-portal and to build up the computer knowledge of its users.

85
