
LITERATURE REVIEW

2.0 INTRODUCTION

The literature review covers a review and survey of the different areas of the research study. These areas are very important as they will in one way or the other be employed in the research development. Areas covered include system design technology and methodologies, the system development life cycle, and system development process activities and models. These areas also elaborate on system analysis methods and processes in detail, so as to enable reliability and accuracy in the data gathering and analysis processes of the research. Also covered are the system implementation activities, which include system deployment, installation, testing and maintenance activities and models. Several models, techniques, approaches, strategies and methodologies were reviewed and compared in relation to the research area. A review of an already developed system that is closely related to the project was also carried out. The first item reviewed is a related existing sales/food service management information system.

2.1 A REVIEW OF FOOD SERVICE MANAGEMENT INFORMATION SYSTEM V (FMIS V)

The Foodservice Management Information System (FMIS V), sold by Genesistems, Inc. since 1980 on mini and super-mini computers, is now available on low-cost personal computers and popular networks as FMIS V. According to Genesistems' president Eric Muench, new programming languages have provided a method of allowing Genesistems' proven FMIS system to operate on the new popular personal computers with the same speed and flexibility that was formerly available only on larger computers. This brings the cost of an automated solution for the foodservice operator down to a price that is affordable. "The manager must be able to determine prices and schedules, make forecasts, perform an ongoing audit of inventory and other company assets, and monitor performance. More and more managers are turning to the computer to provide this information on a timely basis," he said.

"Traditionally, foodservice institutions have had weak in-house accounting systems based on tedious manual procedures," Muench continued. "The result has been poor cost control. Food cost information is generally outdated before manual computations can even be completed. FMIS V solves these and other problems at a reasonable cost." FMIS V consists of the following modules: general ledger, accounts payable, payroll, bank reconciliation, inventory control, recipe control, sales analysis, and management report writing. Telecommunications input is available for certain cash registers. All modules are integrated and provide full accounting information automatically to the general ledger for up-to-date financial statements. General Ledger The General Ledger module is the center of the accounting system. It is a powerful yet easy to use module that can accommodate a single unit restaurant as well as a large multiple unit operation. The General Ledger is automatically updated from all other modules being operated. Both 12 and 13 period accounting are supported. The Trial Balance Report and General Ledger Report provide the necessary documentation and audit trails required of a professional accounting system. Financial Statements can be designed to your specifications by you within the General Ledger module. The optional Management Report Writer gives you the added ability to print complex financial statements that consolidate or compare multiple time periods and units if necessary. Account budgets may be set up and used in forecasting and comparisons to actual activity. Accounts Payable The Accounts Payable module is designed to allow you to better manage your vendor invoices and payments. Inventory purchases that are entered will be automatically updated to the Inventory, Recipe, and Sales Analysis modules without any additional work. Invoices may be entered in summary, detail, or a combination of the two. By entering invoices, you are creating the capability of accumulating unpaid invoices easily at any time. A purchase history by vendor is also maintained, and check payment can be accomplished easily in a method that is convenient for your operation. This module lets you stay on top of your outstanding invoices so that invoices are never paid for twice.

Payroll
The Payroll module is designed for time entry, printing payroll checks, general ledger distribution and year-end W-2 forms. It can operate on a daily, weekly, bi-weekly, semi-monthly, or monthly basis, with all input verified, copied, and employee records updated during the End-Pay-Period procedure. Other useful options are included, such as payroll history inquiry, an earnings summary report, employee payroll history, tip allocation and tip reporting, and the module is integrated with the optional Federal Magnetic Media Reporting module. The module is easy to use due to its one-step nature. After set-up with a General Ledger file and initial data entry, payroll tracking becomes relatively easy. Time is entered, then the register is printed. If corrections are necessary, they can be made to the appropriate entries and the register re-printed. After everything balances, checks and reports are printed and the pay period can then be closed. This module is designed to operate in conjunction with other modules that may be installed. Programs are explained as if the General Ledger module were included. Information is transferred to all integrated modules as a function of the End-Pay-Period procedure or is transferred each month through the End-of-Month posting procedure.

Bank Reconciliation
The Bank Reconciliation module is used to manage your bank accounts. It is automatically updated as checks are written and deposits are entered. A simple method of cancelling checks allows you to reconcile the account to the bank statement in very little time. Multiple bank accounts can be maintained simply and easily. A historical check register is maintained for up to five years for your review.

Other features include:
Accurate, "on demand" financial statements tighten management control and eliminate monthly accounting fees.
True, double-entry accounting with forced balancing of entries eliminates costly posting errors.
Comparisons of business units permit management to make intelligent analyses and take effective action.
Reporting accommodates easy consolidation of multiple units or companies for corporate requirements.
Simple invoice entry organizes and validates invoices for accuracy and automatically updates the Inventory module if necessary.
The Accounts Payable Cash Requirements Report provides immediate access to a list of currently due invoices and the total cash required.
Controlled payment of Accounts Payable invoices eliminates duplicate payments, conserves cash, and accrues interest.
Selection and printing of Accounts Payable computer checks saves time and eliminates errors.
Bank Reconciliation provides an easy way to control and reconcile any bank accounts.

Inventory Control
The Inventory Control module is designed to give you a fast and easy way to keep track of your inventory. You are able to track what you have purchased and what prices you are paying to various suppliers over any length of time. In-house batch production items can be processed along with multiple-location transfers. Inventory is first categorized into major classifications that you choose, such as meat, dairy and produce. Inventory can be kept on a perpetual basis by entering your purchases for those items and taking a physical count monthly, or as frequently as desired, to get your actual usage of each item. Inventory may also be kept on a periodic basis, which does not require entering all your purchases. The periodic method allows for entry of a physical count and last cost at any point in time and will automatically extend the inventory for you. Both methods provide inventory count sheets by specific storage location and fast inventory count entry methods. The two methods can also be combined to allow detailed control of high-cost items and less detailed control of less significant items.

Recipe Control
The Recipe Control module works hand in hand with the Inventory Control module. It provides you with an organized method of entering your recipes. You can take advantage of the ability to monitor your costs at all times, before cost increases erode your profit margins. Unlimited levels of sub-recipes can be maintained very easily. Recipes can include a plate cost for items that you may not want to set up. Recipes can be costed in seconds at Last Cost or Average Cost and can be printed or displayed on the screen. Each recipe can also have detailed preparation instructions set up for use as a training manual.

Other features include:
Quick, accurate food and beverage cost percentages can spot increasing costs before it is too late.
"What if" capability for quick, profitable decisions on the effect of price and cost changes to a menu or individual item.
Easy, timely, accurate trend information on profit margins and the popularity of menu items.
Regular variance reporting on actual versus potential inventory usage flags items to watch for excessive use.
Prompt, accurate comparisons of multi-unit sales for better management analysis and decisions.
Server analysis tells you who is and who isn't selling items such as specials and desserts.
Usage, waste and pilferage information is available at any time for management corrective action to maximize profits.
Inventory Use and Purchase History allows more accurate inventory planning.
Provides a clear, precise way of standardizing recipes for easier employee use.
Inventory transfers between multiple units are tracked for proper allocation of charges and better management relations.
Inventory Production allows the tracking of in-house prep items to show actual inventory usage and real costs.
Friendly, flexible set-up allows you to track only the information you need and not data that you don't care about.

Sales Analysis
The Sales Analysis module completes the operations triangle; both Inventory and Recipe Control are related heavily to Sales Analysis. Menu items are set up and defined at this point. A menu item can refer to a recipe or directly to an inventory item. Daily sales can be entered manually or transferred from a point-of-sale device if one is available. Sales history is maintained on a daily basis for any number of years. Entering your sales will generate your potential or optimal use of each inventory item and will give you an actual-versus-potential usage variance. Sales trends can be tracked in a wide variety of ways using the Management Report Writer. Sales Analysis gives you the capability to stay on top of your margins and control them before they can hurt you.

Management Report Writing
The Report Writer module allows the creation of custom reports wanted by individual companies. The flexibility and adaptability of this module allow for seemingly unlimited variations of report types. This module is limited only by your imagination. Thirty-six columns are available for mathematical and statistical computations (limited only by your printer's capability). Data to be printed on these reports can be drawn from a variety of sources. The most common source is the General Ledger, and the Report Writer is particularly suited to producing complex financial statements. Reports can also be produced based on data from Sales Analysis or from the Statistics section of the Management Report Writer.
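To make the recipe-costing idea concrete, the sketch below costs a single menu item at either last cost or average cost and derives a food-cost percentage. The ingredient prices, quantities and function names are invented for illustration only; they are not the actual FMIS V data layout or design.

    # Hypothetical recipe-costing sketch: cost a recipe at last cost or average cost.
    ingredients = {
        # item: (last_cost_per_unit, average_cost_per_unit)  -- invented figures
        "ground beef (kg)": (6.80, 6.50),
        "burger bun (each)": (0.35, 0.33),
        "cheddar slice (each)": (0.20, 0.22),
    }

    recipe = {  # item -> quantity used per portion
        "ground beef (kg)": 0.15,
        "burger bun (each)": 1,
        "cheddar slice (each)": 2,
    }

    def recipe_cost(recipe, ingredients, method="last"):
        idx = 0 if method == "last" else 1
        return sum(qty * ingredients[item][idx] for item, qty in recipe.items())

    portion_cost = recipe_cost(recipe, ingredients, "last")
    menu_price = 5.95
    print(f"Plate cost: {portion_cost:.2f}")
    print(f"Food cost percentage: {portion_cost / menu_price:.1%}")

Re-running the calculation with a changed ingredient price or menu price is the essence of the "what if" capability described above.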

2.2 SYSTEMS DESIGN

Systems design is the process of defining the architecture, components, modules, interfaces, and data for a system to satisfy specified requirements. One could see it as the application of systems theory to product development. There is some overlap with the disciplines of systems analysis, systems architecture and systems engineering. If the broader topic of product development "blends the perspective of marketing, design, and manufacturing into a single approach to product development," then design is the act of taking the marketing information and creating the design of the product to be manufactured. Systems design is therefore the process of defining and developing systems to satisfy the specified requirements of the user.

Until the 1990s systems design had a crucial and respected role in the data processing industry. In the 1990s, standardization of hardware and software resulted in the ability to build modular systems. The increasing importance of software running on generic platforms has enhanced the discipline of software engineering.

System design comprises several activities, phases and sections. These sections have several outputs and deliverables. Together, the results of the system design phases ensure an effective breakdown and decomposition of the system to be developed. The first of these phases is the logical design.

2.2.1 Logical design
The logical design of a system pertains to an abstract representation of the data flows, inputs and outputs of the system. This is often conducted via modelling, using an over-abstract (and sometimes graphical) model of the actual system; such models are part of systems design. Logical design is aimed at a complete definition of the process activities of the system, so as to ensure a proper definition of the system's logic.

2.2.2 Physical design
The physical design relates to the actual input and output processes of the system. This is laid down in terms of how data is input into a system, how it is verified/authenticated, how it is processed, and how it is displayed as output. Physical design, in this context, does not refer to the tangible physical design of an information system. To use an analogy, a personal computer's physical design involves input via a keyboard, processing within the CPU, and output via a monitor, printer, etc. It would not concern the actual layout of the tangible hardware, which for

a PC would be a monitor, CPU, motherboard, hard drive, modems, video/graphics cards, USB slots, and so on. Rather, it involves the detailed design of the user and product database structures and of the processing and control processes. The hardware/software (H/S) specification is developed for the proposed system.

2.2.3 System Design Methodologies
A system design methodology is a set of standards, techniques and rules applied in the development of a system design. These standards are guided and enforced by standards organisations in order to attain uniform and advanced techniques in system design development. There are several system design methodologies; the most common ones are explained below.

Rapid application development (RAD)
Rapid application development (RAD) is a methodology in which a systems designer produces prototypes for an end-user. A prototype is a small working, or non-functional, sample of a product; it could sometimes be a graphical illustration with a description of the product. The end-user reviews the prototype and offers feedback on its suitability. This process is repeated until the end-user is satisfied with the final system.

Joint application design (JAD)
Joint application design (JAD) is a methodology which evolved from RAD, in which a systems designer consults with a group consisting of the following parties:

The executive sponsor
The systems designer
Managers of the system
Sometimes the system operators and users

JAD involves a number of stages, in which the group collectively develops an agreed pattern for the design and implementation of the system. In this type of system design methodology there are several sessions of meetings, with appropriate documentation and analysis.

2.2.4 Systems design: Topics/Definition of Terms

Requirements analysis - analyzes the needs of the end users or customers.
Benchmarking - an effort to evaluate how current systems perform.
Systems architecture - creates a blueprint for the design with the necessary specifications for the hardware, software, people and data resources. In many cases, multiple architectures are evaluated before one is selected.

Design - designers will produce one or more 'models' of what they see the system eventually looking like, with ideas from the analysis section either used or discarded. A document will be produced with a description of the system, but nothing specific: they might say 'touchscreen' or 'GUI operating system', but not mention any specific brands.

Computer programming and debugging in the software world, or detailed design in the consumer, enterprise or commercial world - specifies the final system components.

System testing - evaluates the system's actual functionality in relation to expected or intended functionality, including all integration aspects.

2.3 SYSTEMS DEVELOPMENT LIFE CYCLE

A life cycle is a diagrammatic illustration of all the activities, processes and operations that are carried out step by step, from the beginning to the end, of a particular development project or object. The Systems Development Life Cycle (SDLC), or Software Development Life Cycle in systems engineering, information systems and software engineering, is a process of creating or altering information systems, together with the models and methodologies that people use to develop these systems. In software engineering the SDLC concept underpins many kinds of software development methodologies. These methodologies form the framework for planning and controlling the creation of an information system: the software development process. The Systems Development Life Cycle is a process used by a systems analyst to develop an information system, including requirements, validation, training, and user

(stakeholder) ownership. Any SDLC should result in a high-quality system that meets or exceeds customer expectations, reaches completion within time and cost estimates, works effectively and efficiently in the current and planned Information Technology infrastructure, and is inexpensive to maintain and cost-effective to enhance. Computer systems are complex and often (especially with the recent rise of Service-Oriented Architecture) link multiple traditional systems potentially supplied by different software vendors. To manage this level of complexity, a number of SDLC models or methodologies have been created, such as "waterfall", "spiral", "Agile software development", "rapid prototyping", "incremental", and "synchronize and stabilize".

SDLC models can be described along a spectrum from agile to iterative to sequential. Agile methodologies, such as XP and Scrum, focus on lightweight processes which allow for rapid changes along the development cycle. Iterative methodologies, such as the Rational Unified Process and the Dynamic Systems Development Method, focus on limited project scope and on expanding or improving products through multiple iterations. Sequential or big-design-up-front (BDUF) models, such as Waterfall, focus on complete and correct planning to guide large projects and risks to successful and predictable results. Other models, such as Anamorphic Development, tend to focus on a form of development that is guided by project scope and adaptive iterations of feature development.

In project management a project can be defined both with a project life cycle (PLC) and an SDLC, during which slightly different activities occur. According to Taylor (2004), "the project life cycle encompasses all the activities of the project, while the systems development life cycle focuses on realizing the product requirements".

2.4.1 History
The Systems Life Cycle (SLC) is a methodology used to describe the process of building information systems, intended to develop information systems in a very deliberate, structured and methodical way, reiterating each stage of the life cycle. The systems development life cycle, according to Elliott & Strachan & Radford (2004), "originated in the 1960s, to develop large scale functional business systems in an age of large scale business conglomerates. Information systems activities revolved around heavy data processing and number crunching routines".

Several systems development frameworks have been partly based on the SDLC, such as the Structured Systems Analysis and Design Method (SSADM) produced for the UK government Office of Government Commerce in the 1980s. Ever since, according to Elliott (2004), "the traditional life cycle approaches to systems development have been increasingly replaced with alternative approaches and frameworks, which attempted to overcome some of the inherent deficiencies of the traditional SDLC".

2.4.2 Systems development phases
The System Development Life Cycle framework provides a sequence of activities for system designers and developers to follow. It consists of a set of steps or phases in which each phase of the SDLC uses the results of the previous one. A Systems Development Life Cycle (SDLC) adheres to important phases that are essential for developers, such as planning, analysis, design, and implementation, which are explained in the section below. A number of system development life cycle (SDLC) models have been created: waterfall, fountain, spiral, build and fix, rapid prototyping, incremental, and synchronize and stabilize. The oldest of these, and the best known, is the waterfall model: a sequence of stages in which the output of each stage becomes the input for the next. These stages and phases are elaborated and carefully studied and evaluated so as to attain a well-grounded understanding of their several areas and domains, as they will eventually be employed in the research development. These stages can be characterized and divided up in different ways, including the following:

Project planning, feasibility study: Establishes a high-level view of the intended project and determines its goals. An important task in creating a software product is extracting the requirements or requirements analysis. Customers typically have an abstract idea of what they want as an end result, but not what software should do. Incomplete, ambiguous, or even contradictory requirements are recognized by skilled and experienced software engineers at this point. Frequently demonstrating live code may help reduce the risk that the requirements are incorrect. Once the general requirements are gathered from the client, an analysis of the scope of the development should be determined and clearly stated. This is often called a scope document.

Certain functionality may be out of scope of the project as a function of cost or as a result of unclear requirements at the start of development. If the development is done externally, this document can be considered a legal document so that, if there are ever disputes, any ambiguity about what was promised to the client can be clarified.

Systems analysis, requirements definition: Refines project goals into defined functions and operations of the intended application. Analyzes end-user information needs.

Systems design: Describes desired features and operations in detail, including screen layouts, business rules, process diagrams, pseudocode and other documentation.

Implementation: The real code is written here.

Integration and testing: Brings all the pieces together into a special testing environment, then checks for errors, bugs and interoperability.

Acceptance, installation, deployment: The final stage of initial development, where the software is put into production and runs actual business.

Maintenance: What happens during the rest of the software's life: changes, corrections, additions, moves to a different computing platform and more. This, the least glamorous and perhaps most important step of all, goes on seemingly forever.

In the following example, these stages of the Systems Development Life Cycle are divided into ten steps, from definition to the creation and modification of IT work products. The tenth phase occurs when the system is disposed of and the task performed is either eliminated or transferred to other systems. The tasks and work products for each phase are described in subsequent chapters. Not every project will require that the phases be executed sequentially; however, the phases are interdependent. Depending upon the size and complexity of the project, phases may be combined or may overlap.

2.4.3 System analysis
The goal of system analysis is to determine where the problem is, in an attempt to fix the system. This step involves breaking the system down into different pieces to analyze the situation, analyzing project goals, breaking down what needs to be created and attempting to engage users so that definite requirements can be defined. Requirements analysis sometimes requires individuals or teams from both the client and the service provider sides to obtain detailed and accurate requirements; often a lot of back-and-forth communication is needed to understand these requirements. Requirement gathering is the most crucial aspect, as communication gaps often arise in this phase, and these lead to validation errors and bugs in the software program.

2.4.4 Design
In systems design the design functions and operations are described in detail, including screen layouts, business rules, process diagrams and other documentation. The output of this stage will describe the new system as a collection of modules or subsystems. The design stage takes as its initial input the requirements identified in the approved requirements document. For each requirement, a set of one or more design elements will be produced as a result of interviews, workshops, and/or prototype efforts. Design elements describe the desired software features in detail, and generally include functional hierarchy diagrams, screen layout diagrams, tables of business rules, business process diagrams, pseudocode, and a complete entity-relationship diagram with a full data dictionary. These design elements are intended to describe the software in sufficient detail that skilled programmers may develop the software with minimal additional input.

2.4.5 Testing
Software testing is an integral and important phase of the software development process. This part of the process ensures that defects are recognized as soon as possible. The code is tested at various levels in software testing; unit, system and user acceptance testing are often performed. This is a grey area, as many different opinions exist as to what the stages of testing are and how much iteration, if any, occurs. Iteration is not generally part of the waterfall model, but some usually occurs at this stage. In this phase the whole system is tested piece by piece; a minimal unit-test sketch is given after the list of testing types below.

Following are the types of testing:


Defect testing (the failed scenarios, including defect tracking)
Path testing
Data set testing
Unit testing
System testing
Integration testing
Black box testing
White box testing
Regression testing
Automation testing
User acceptance testing
Performance testing
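Of the testing types listed above, unit testing is the easiest to show in a few lines. The sketch below uses Python's standard unittest module against a hypothetical sales-total function; the function and test names are assumptions made purely for illustration.

    # Hypothetical unit-test sketch using Python's built-in unittest module.
    import unittest

    def daily_sales_total(line_items):
        """Sum (quantity * unit_price) over a day's sale line items."""
        return sum(qty * price for qty, price in line_items)

    class DailySalesTotalTest(unittest.TestCase):
        def test_empty_day_is_zero(self):
            self.assertEqual(daily_sales_total([]), 0)

        def test_totals_all_line_items(self):
            items = [(2, 5.95), (1, 3.50)]
            self.assertAlmostEqual(daily_sales_total(items), 15.40)

    if __name__ == "__main__":
        unittest.main()

Each test exercises one small unit of code in isolation, which is the defining property of unit testing as opposed to the system-level and acceptance testing also listed above.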

2.4.6 Operations and maintenance
The deployment of the system includes the changes and enhancements made before the decommissioning or sunset of the system. Maintaining the system is an important aspect of the SDLC. As key personnel change positions in the organization, new changes will be implemented, which will require system updates.

Implementation is the part of the process where software engineers actually program the code for the project. Documenting the internal design of the software for the purpose of future maintenance and enhancement is done throughout development. This may also include the writing of an API, be it external or internal. The software engineering process chosen by the developing team will determine how much internal documentation (if any) is necessary. Plan-driven models (e.g., Waterfall) generally produce more documentation than Agile models. Deployment starts after the code is appropriately tested, is approved for release, and is sold or otherwise distributed into a production environment.

Software training and support is important, and a lot of developers fail to realize that. It would not matter how much time and planning a development team puts into creating software if nobody in the organization ends up using it. People are often

resistant to change and avoid venturing into an unfamiliar area, so as a part of the deployment phase, it is very important to have training classes for new clients of your software. Maintaining and enhancing software to cope with newly discovered problems or new requirements can take far more time than the initial development of the software. It may be necessary to add code that does not fit the original design to correct an unforeseen problem or it may be that a customer is requesting more functionality and code can be added to accommodate their requests. If the labor cost of the maintenance phase exceeds 25% of the prior-phases' labor cost, then it is likely that the overall quality of at least one prior phase is poor. In that case, management should consider the option of rebuilding the system (or portions) before maintenance cost is out of control.
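The 25% guideline above can be checked with simple arithmetic. The short sketch below flags a project whose maintenance labour cost has crossed that threshold; all figures are invented purely for illustration.

    # Illustrative check of the 25% maintenance-cost heuristic; figures are invented.
    prior_phase_labor_cost = 400_000   # analysis + design + implementation + testing
    maintenance_labor_cost = 120_000

    ratio = maintenance_labor_cost / prior_phase_labor_cost   # 0.30 here
    if ratio > 0.25:
        print(f"Maintenance is {ratio:.0%} of prior-phase cost: "
              "review the quality of earlier phases or consider rebuilding.")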
Comparison of Methodology Approaches

Attribute                    SDLC         RAD      Open Source  Objects     JAD      Prototyping  End User
Control                      Formal       MIS      Weak         Standards   Joint    User         User
Time frame                   Long         Short    Medium       Any         Medium   Short        Short
Users                        Many         Few      Few          Varies      Few      One or two   One
MIS staff                    Many         Few      Hundreds     Split       Few      One or two   None
Transaction/DSS              Transaction  Both     Both         Both        DSS      DSS          DSS
Interface                    Minimal      Minimal  Weak         Windows     Crucial  Crucial      Crucial
Documentation and training   Vital        Limited  Internal     In Objects  Limited  Weak         None
Integrity and security       Vital        Vital    Unknown      In Objects  Limited  Weak         Weak
Reusability                  Limited      Some     Maybe        Vital       Limited  Weak         None

2.4.7 Strengths and weaknesses
Few people in the modern computing world would use a strict waterfall model for their Systems Development Life Cycle (SDLC), as many modern methodologies have superseded this thinking. Some will argue that the SDLC no longer applies to models like Agile computing, but it is still a term widely used in technology circles. The SDLC practice has advantages in traditional models of software development that lend themselves more to a structured environment. The disadvantage of using the SDLC methodology arises when there is a need for iterative development (e.g. web development or e-commerce), where stakeholders need to review the software being designed on a regular basis. Instead of viewing the SDLC from a strength or weakness perspective, it is far more important to take the best practices from the SDLC model and apply them to whatever may be most appropriate for the software being designed. A comparison of the strengths and weaknesses of the SDLC is given below:
Strengths and Weaknesses of SDLC

Strengths:
Control.
Monitor large projects.
Detailed steps.
Evaluate costs and completion targets.
Documentation.
Well defined user input.
Ease of maintenance.
Development and design standards.
Tolerates changes in MIS staffing.

Weaknesses:
Increased development time.
Increased development cost.
Systems must be defined up front.
Rigidity.
Hard to estimate costs, project overruns.
User input is sometimes limited.

An alternative to the SDLC is rapid application development, which combines prototyping, Joint Application Development and the implementation of CASE tools. The advantages of RAD are speed, reduced development cost, and active user involvement in the development process.

2.5 SOFTWARE DEVELOPMENT MODELS
Several models exist to streamline the development process. Each one has its pros and cons, and it is up to the development team to adopt the most appropriate one for the project. Sometimes a combination of the models may be more suitable.

2.5.1 Waterfall model
The waterfall model shows a process in which developers are to follow these phases in order:
1. Requirements specification (requirements analysis)
2. Software design
3. Implementation and integration
4. Testing (or validation)
5. Deployment (or installation)
6. Maintenance

In a strict waterfall model, after each phase is finished, the process proceeds to the next one. Reviews may occur before moving to the next phase, which allows for the possibility of changes (and may involve a formal change control process). Reviews may also be employed to ensure that the phase is indeed complete; the phase completion criteria are often referred to as a "gate" that the project must pass through to move to the next phase. Waterfall discourages revisiting and revising any prior phase once it is complete. This "inflexibility" in a pure waterfall model has been a source of criticism by supporters of other, more "flexible" models.

2.5.2 Spiral model
The key characteristic of a spiral model is risk management at regular stages in the development cycle. In 1988, Barry Boehm published a formal software system development "spiral model", which combines some key aspects of the waterfall model and rapid prototyping methodologies, but places emphasis on a key area many felt had been neglected by other methodologies: deliberate, iterative risk analysis, particularly suited to large-scale complex systems. The spiral is visualized as a process passing through some number of iterations, with the four-quadrant diagram representing the following activities:
1. Formulate plans: identify software targets, select alternatives for implementing the program, and clarify the project development constraints;
2. Risk analysis: an analytical assessment of the selected approaches, to consider how to identify and eliminate risk;
3. Implementation of the project: the implementation of software development and verification;
4. Evaluation of the results and planning of the next iteration.

The risk-driven spiral model, emphasizing the conditions of options and constraints in order to support software reuse, can also help make software quality a specific goal of product development. However, the spiral model has some restrictive conditions, as follows:
1. The spiral model emphasizes risk analysis, and thus requires customers to accept this analysis and act on it. This requires both trust in the developer and the willingness to spend more to fix the issues, which is the reason why this model is often used for large-scale internal software development.
2. If the implementation of risk analysis will greatly affect the profits of the project, the spiral model should not be used.
3. Software developers have to actively look for possible risks and analyze them accurately for the spiral model to work.

The first stage is to formulate a plan to achieve the objectives within these constraints, and then to strive to find and remove all potential risks through careful analysis and, if necessary, by constructing a prototype. If some risks cannot be ruled out, the customer has to decide whether to terminate the project or to ignore the risks and continue anyway. Finally, the results are evaluated and the design of the next phase begins.

2.5.3 Iterative and incremental development
Iterative development prescribes the construction of initially small but ever-larger portions of a software project to help all those involved to uncover important issues early, before problems or faulty assumptions can lead to disaster. Iterative processes can assist in revealing the design goals of a client who does not know how to define what they want.

Nimble Rabbit is a flexible development process or framework with a set of practices and overlapping roles that are intended to help define what the developers do well, and the solutions to improve those processes which fall short. Nimble Rabbit is specifically designed to work well in a telecommuting/distributed environment. It is intended to avoid many of the false assumptions and traps of existing agile frameworks such as Scrum.

2.5.4 Agile development Agile software development uses iterative development as a basis but advocates a lighter and more people-centric viewpoint than traditional approaches. Agile processes use feedback, rather than planning, as their primary control mechanism. The feedback is driven by regular tests and releases of the evolving software. There are many variations of agile processes:

In Extreme Programming (XP), the phases are carried out in extremely small (or "continuous") steps compared to the older, "batch" processes. The (intentionally incomplete) first pass through the steps might take a day or a week, rather than the

months or years of each complete step in the Waterfall model. First, one writes automated tests, to provide concrete goals for development. Next is coding (by a pair of programmers), which is complete when all the tests pass, and the programmers can't think of any more tests that are needed. Design and architecture emerge out of refactoring, and come after coding. Design is done by the same people who do the coding. (Only the last feature merging design and code is common to all the other agile processes.) The incomplete but functional system is deployed or demonstrated for (some subset of) the users (at least one of which is on the development team). At this point, the practitioners start again on writing tests for the next most important part of the system.
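The test-first rhythm described above (write an automated test, then write just enough code to make it pass, then refactor) can be shown with a tiny example. The feature, function and test names below are assumptions chosen purely for illustration, not part of any particular XP project.

    # Hypothetical illustration of the XP test-first cycle.
    # Step 1: write the automated test before any production code exists.
    import unittest

    class TipAllocationTest(unittest.TestCase):
        def test_tips_split_evenly_between_servers(self):
            self.assertEqual(allocate_tips(90.0, ["Ada", "Bea", "Cy"]),
                             {"Ada": 30.0, "Bea": 30.0, "Cy": 30.0})

    # Step 2: write just enough code to make the test pass, then refactor.
    def allocate_tips(total, servers):
        share = round(total / len(servers), 2)
        return {name: share for name in servers}

    if __name__ == "__main__":
        unittest.main()

The failing test defines the goal; the production code is considered done when the test passes and the pair cannot think of another test that is needed.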

Scrum is another agile variation, in which development is organized into short, fixed-length iterations (sprints) carried out by small, self-organizing teams, with working software reviewed at the end of each sprint.

2.5.6 Code and fix
"Code and fix" development is not so much a deliberate strategy as an artifact of naiveté and schedule pressure on software developers. Without much of a design in the way, programmers immediately begin producing code. At some point, testing begins (often late in the development cycle), and the inevitable bugs must then be fixed before the product can be shipped.

2.5.7 Process improvement models
Capability Maturity Model Integration
The Capability Maturity Model Integration (CMMI) is one of the leading models and is based on best practice. Independent assessments grade organizations on how well they follow their defined processes, not on the quality of those processes or of the software produced. CMMI has replaced CMM.

ISO 9000
ISO 9000 describes standards for a formally organized process to manufacture a product and the methods of managing and monitoring progress. Although the standard was originally created for the manufacturing sector, ISO 9000 standards have been applied to software development as well. Like CMMI, certification with ISO 9000 does not guarantee the quality of the end result, only that formalized business processes have been followed.

ISO 15504
ISO 15504, also known as Software Process Improvement Capability Determination (SPICE), is a "framework for the assessment of software processes". This standard is

aimed at setting out a clear model for process comparison. SPICE is used much like CMMI. It models processes to manage, control, guide and monitor software development. This model is then used to measure what a development organization or project team actually does during software development. This information is analyzed to identify weaknesses and drive improvement. It also identifies strengths that can be continued or integrated into common practice for that organization or team.

2.5.8 Formal methods
Formal methods are mathematical approaches to solving software (and hardware) problems at the requirements, specification and design levels. Examples of formal methods include the B-Method, Petri nets, automated theorem proving, RAISE and VDM. Various formal specification notations are available, such as the Z notation. More generally, automata theory can be used to build up and validate application behaviour by designing a system of finite state machines. Finite state machine (FSM) based methodologies allow executable software specification and the by-passing of conventional coding.

Formal methods are most likely to be applied in avionics software, particularly where the software is safety critical. Software safety assurance standards, such as DO-178B, demand formal methods at the highest level of categorization (Level A). Formalization of software development is creeping in elsewhere, with the application of the Object Constraint Language (and specializations such as the Java Modeling Language) and especially with Model-driven architecture allowing execution of designs, if not specifications.

Another emerging trend in software development is to write a specification in some form of logic (usually a variation of first-order logic, FOL), and then to directly execute the logic as though it were a program. The OWL language, based on Description Logic, is an example. There is also work on mapping some version of English (or another natural language) automatically to and from logic, and executing the logic directly. Examples are Attempto Controlled English, and Internet Business Logic, which does not seek to control the vocabulary or syntax. A feature of systems that support bidirectional English-logic mapping and direct execution of the logic is that they can be made to explain their results, in English, at the business or scientific level.
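As a small illustration of the finite-state-machine idea mentioned above, the sketch below validates sequences of events against an order-handling state machine. The states, events and transition table are invented for this example and are not drawn from any of the formal-methods tools cited.

    # Minimal finite-state-machine sketch; states and events are illustrative only.
    TRANSITIONS = {
        ("ordered",  "pay"):     "paid",
        ("paid",     "prepare"): "prepared",
        ("prepared", "serve"):   "served",
        ("ordered",  "cancel"):  "cancelled",
    }

    def run(events, state="ordered"):
        """Return the final state, or raise if an event is invalid in the current state."""
        for event in events:
            try:
                state = TRANSITIONS[(state, event)]
            except KeyError:
                raise ValueError(f"event '{event}' not allowed in state '{state}'")
        return state

    print(run(["pay", "prepare", "serve"]))   # served
    # run(["serve"]) would raise: serving before payment is not a valid behaviour.

Because the allowed behaviour is written down as data, the same table can serve both as an executable specification and as a validator for observed event sequences.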

The Government Accountability Office, in a 2003 report on one of the Federal Aviation Administration's air traffic control modernization programs,[3] recommends following the agency's guidance for managing major acquisition systems by

establishing, maintaining, and controlling an accurate, valid, and current performance measurement baseline, which would include negotiating all authorized, unpriced work within 3 months;

conducting an integrated baseline review of any major contract modifications within 6 months; and

preparing a rigorous life-cycle cost estimate, including a risk assessment, in accordance with the Acquisition System Toolset's guidance, and identifying the level of uncertainty inherent in the estimate.

2.6 CONCLUSION

After an extensive and exhaustive research and review of the different areas above, taking into consideration their pros and cons, several decisions were made for this research development. The decisions covered the strategies, technology and methodology, as well as the system development process and activities. The research combines a blend of technology and methodology in the different areas of the research. The traditional system development model was employed as the general development model. This model (the waterfall model) was the best option due to several factors guiding the research; some of these factors include time, cost and flexibility. The rules of its phases, which include requirements specification (requirements analysis), software design, implementation and integration, testing (verification or validation), deployment (or installation) and maintenance, were adhered to strictly and meticulously. A structured analysis and design approach was used for the system analysis and design of the project. However, a more object-oriented programming structure was employed for the construction of the research system. Several testing models were employed in the research testing phase; the major ones covered were the unit test, the component test, acceptance testing and system testing. Details are covered in the fourth chapter of the research documentation.
