
INTRODUCTION

This paper is an introduction to the Agile school of software development, and is primarily
targeted at IT managers and CXOs with an interest in improving development productivity. What
is Agile? How can Agile help improve my organization? First, I introduce the two broad schools
of thought when it comes to software development: traditional sequential, a.k.a. “the waterfall
method”, and iterative methods, of which Agile is a subset. My objective is to demonstrate the
shortcomings of the waterfall approach and to present iterative, and more specifically Agile,
methods as a solution.

The essence of waterfall software development is that complex software systems can be built in a
sequential, phase-wise manner where all of the requirements are gathered at the beginning, all of
the design is completed next, and finally the master design is implemented into production quality
software.

This approach holds that complex systems can be built in a single pass, without going back and
revisiting requirements or design ideas in light of changing business or technology conditions. It
was first introduced in an article written by Winston Royce in 1970, primarily intended for use in
government projects.

Waterfall equates software development to a production line conveyor belt. “Requirements
analysts” compile the system specifications until they pass the finished requirements specification
document to “software designers”, who plan the software system and create diagrams documenting
how the code should be written.

The design diagrams are then passed to the “developers” who implement the code from the design.
Under the waterfall approach, traditional IT managers have made valiant efforts to craft and adhere
to large-scale development plans.
These plans are typically laid out in advance of development projects using Gantt or PERT charts
to map detailed tasks and dependencies for each member of the development group months or
years down the line. However, studies of past software projects show that only 9% to 16% are
considered on-time and on-budget. In this article, I attempt to summarize current thinking among
computer scientists on why waterfall fails in so many cases.

It also explores a leading alternative to waterfall: “Agile” methods that focus on incremental and
iterative development where requirements, design, implementation, and testing continue
throughout the project lifecycle.

1. The Scope of Life Cycles


As we described in the book The Enterprise Unified Process (EUP), the scope of life cycles can
vary dramatically. For example, Figure 1 depicts the Scrum construction life cycle, whereas Figure
2 depicts an extended version of that diagram covering the full system development life cycle
(SDLC), and Figure 3 extends it further by addressing enterprise-level disciplines via the EUP
life cycle. The points that I'm trying to make are:

System development is complicated. Although it's comforting to think that development is
as simple as Figure 1 makes it out to be, the fact is that we know it's not. If you adopt a
development process that doesn't actually address the full development life cycle, then you've
adopted little more than consultantware in the end. My experience is that you need to go
beyond the construction life cycle of Figure 1 to the full SDLC of Figure 2 (ok, Retirement
may not be all that critical) if you're to be successful.

There's more to IT than development. To be successful at IT you must take a multi-system,
multi-life cycle stage view, as depicted in Figure 3. The reality is that organizations
have many potential projects in the planning stage (which I'll call Iteration -1 in this article),
many in development, and many in production.

Figure 1 uses the terminology of the Scrum methodology. The rest of this article uses the
terminology popularized in the mid-1990s by the Unified Process (Sprint = Iteration, Backlog =
Stack, Daily Scrum Meeting = Daily Meeting). Figure 1 shows how agilists treat requirements like
a prioritized stack, pulling just enough work off the stack for the current iteration (in Scrum
iterations/sprints are often 30-days long, although this can vary). At the end of the iteration the
system is demoed to the stakeholders to verify that the work that the team promised to do at the
beginning of the iteration was in fact accomplished.
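The pull-based mechanics described above can be sketched as a toy Python model (purely illustrative; the function name and point values are hypothetical, not part of any Scrum tooling): the team pulls just enough of the highest-priority items off the stack to fill the iteration's capacity.

```python
# Toy model of pulling work off a prioritized backlog (hypothetical example).
def pull_iteration_work(backlog, capacity):
    """Take the most urgent items whose total effort fits within capacity.

    backlog: list of (priority, effort) tuples; lower priority number = more urgent.
    Returns (items selected for the iteration, remaining backlog).
    """
    ordered = sorted(backlog, key=lambda item: item[0])  # most urgent first
    selected, remaining, used = [], [], 0
    for item in ordered:
        _, effort = item
        if used + effort <= capacity:
            selected.append(item)
            used += effort
        else:
            remaining.append(item)
    return selected, remaining

backlog = [(3, 5), (1, 8), (2, 13), (4, 3)]
sprint, rest = pull_iteration_work(backlog, capacity=21)
# The two most urgent items (priorities 1 and 2) fill the 21 points of capacity.
```

In real Scrum the team also weighs dependencies and negotiates scope with the product owner; the sketch only captures the "pull from a prioritized stack" idea.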

Figure 1. The Scrum construction life cycle.


The Scrum construction life cycle of Figure 1, although attractive, proves to be a bit naive in
practice. Where does the product backlog come from? Does it get beamed down from the Starship
Enterprise? Of course not; it's actually the result of initial requirements envisioning early in the
project. You don't only implement requirements during an iteration, you also fix defects
(disciplined agile teams have a parallel testing effort during construction iterations where these
defects are found), go on holiday, support other teams (perhaps as reviewers of their work), and
so on. So you really need to expand the product backlog into a full work items list. You also release
your system into production, often a complex endeavor.

A more realistic life cycle is captured in Figure 2, which overviews the full agile SDLC. This SDLC
is composed of six phases: Iteration -1, Iteration 0/Warm Up, Construction, Release/End Game,
Production, and Retirement. Although many agile developers may balk at the idea of phases
(perhaps Gary Evans' analogy of development "seasons" is a bit more palatable), the fact is that
it's been recognized that processes such as Extreme Programming (XP) and the Agile Unified
Process (AUP) do in fact have phases (for diagrams, see the XP life cycle and AUP life cycle
respectively). The Disciplined Agile Delivery (DAD) life cycle also includes phases (granted, I led
the development of DAD). Furthermore, Agile MSF calls its phases/seasons "tracks".
Figure 2. A detailed agile SDLC.

2. Software Development is More Like New Product Development than Manufacturing

Software development is a highly complex field with countless variables impacting the system.
All software systems are imperfect because they cannot be built with mathematical or physical
certainty. Bridge building relies on physical and mathematical laws. Software development,
however, has no laws or clear certainties on which to build. As a result, software is almost always
flawed or sub-optimized.

Also consider that the building blocks of software projects are usually other software systems (e.g.,
programming languages, database platforms, etc.), and those systems that act as building blocks
contain bugs and cannot be relied on with certainty.

Because the foundations of software development are inherently unstable and unreliable,
organizations developing software must realize variables exist that are largely outside of
management control. It is therefore fair to say that software development is more akin to new
product research and development than it is to assembly-line style manufacturing.

Software development is innovation, discovery, and artistry; each foray into a development project
presents new and difficult challenges that cannot be overcome with one-size-fits-all, cookie-cutter
solutions. The waterfall methodology assumes that up-front planning is enough to take into
account all variables that could impact the development process. In fact, waterfall projects allocate
copious effort detailing every possible risk, mitigation plan, and contingency.

But is it possible to predict any and all variables that could possibly affect a software project? The
empirical answer is “no”, considering the limited success of waterfall projects. Waterfall
nevertheless equates software development to an assembly line: defined processes can be
established that, when used sequentially, result in a successful project each time.

The first step is X, the second is Y, and the result is always Z. Can research really be relegated to
a series of steps that when performed in sequence result in a new product? If this formulaic
approach were adequate, medical researchers could simply plug variables into equations to
discover new medicines.
On the contrary, since the late 1970s, product development companies led by Toyota, Honda,
Fujitsu, 3M, HP, Canon, and NEC have supplanted the sequential “Phased Program Planning” (PPP)
approach to new product development with a flexible, holistic approach in which the traditional
phases of development overlap throughout the product lifecycle.

The results were a dramatic improvement in cost and time to market, and they ultimately
led to the popular rise of “lean development” and “just-in-time manufacturing”. Following the
lead of Japanese auto makers, in the 1990s sequential, waterfall-style approaches to new product
development were effectively abandoned outside the software development industry.

But longstanding insistence from IT managers on categorizing software development as a
straightforward assembly line progression has kept the software industry from evolving to better
methods, the benefits of which other new product development industries have been reaping for
decades.

It’s ironic that a cutting-edge technology field like software is so far behind more traditional
engineering fields in terms of development methods. Almost no software system is so simple that
its development can be entirely scripted from beginning to end. The inherent uncertainty and
complexity in all software projects require an adaptive development plan to cope with a high
number of unknown variables.

3. Incremental and Iterative Development


The simple ability to revisit the “phases” of development dramatically improves project efficiency.
The idea of revisiting phases over and over is called “incremental and iterative development” (IID).
The development lifecycle is cut up into increments or “iterations” and each iteration touches on
each of the traditional “phases” of development.
For example, with IID requirements gathering is an ongoing process that is periodically revisited.
As new requirements surface and as the scope changes, IID processes continually capture the
requirements iteration after iteration. Interestingly, Winston Royce (of waterfall process fame)
later noted that his ideas were incorrectly interpreted and that a “single pass” framework would
never work (his article actually advocates at least a second pass).

IID allows for multiple “passes”, or iterations, over a project lifecycle to properly address
complexities and risk factors. This concept of iterative development hails from the “lean
development” era of the 1980s described above, when Japanese auto makers achieved tremendous
gains in efficiency and innovation simply by replacing the phased, sequential approach with an
iterative approach in which prototypes were developed for short-term milestones
(see Figure 3).

Each phase was actually a layer that continued throughout the entire development lifecycle; the
requirements, design, and implementation cycle was revisited for each short-term milestone. This
“concurrent” development approach created an atmosphere of trial-and-error experimentation and
learning that ultimately broke down the status quo and led to efficient innovation.

Although direct analogies between industries are never seamless, the success of lean development
has influenced a broad class of “iterative” software methods including the Unified Process, Evo,
Spiral, and Agile methods.

Figure 3. Iterative approach: overlapping phases of development.


4. Agile Project Management: Empirical Process Control

Scrum, a popular agile project management method, introduced the concept of empirical process
control for the management of complex, changing software projects. Scrum holds that
straightforward defined processes alone cannot be used to effectively manage complex and
dynamic software projects. Risk factors and emerging requirements complicate software
development to a point where defined processes fall short. Although it has been attempted in the
past, there cannot be a single exhaustive library of defined processes to handle every situation that
could possibly surface during a software project. In fact, the manufacturing industry has long
known that certain chemical processes, for example, are too difficult to script and define. Instead,
an empirical or adaptive management approach is employed to measure and adjust the chemical
process periodically to achieve the desired outcome. As a result, in the Scrum process, project
plans are continuously inspected and adapted based on the empirical reality of the project.
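The inspect-and-adapt cycle described here can be illustrated with a toy forecasting loop (a hypothetical sketch with made-up velocity numbers, not actual Scrum tooling): instead of trusting an up-front plan, each iteration re-forecasts the remaining schedule from measured velocity.

```python
# Toy inspect-and-adapt loop (hypothetical illustration of empirical process control).
def forecast_iterations(remaining_work, observed_velocities):
    """After each iteration, re-forecast the iterations left using the
    average velocity actually observed so far (inspect, then adapt)."""
    forecasts = []
    for i, velocity in enumerate(observed_velocities, start=1):
        remaining_work -= velocity                       # work completed this iteration
        avg = sum(observed_velocities[:i]) / i           # inspect: measured reality
        left = max(remaining_work, 0)
        forecasts.append(round(left / avg, 1))           # adapt: revised estimate
    return forecasts

# 100 points of work; the forecast shifts as real velocity data comes in.
print(forecast_iterations(100, [10, 8, 12]))
```

The point is not the arithmetic but the control style: the plan is an output of measurement, revised every cycle, rather than a fixed input.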

Agile project management approaches balance the four variables in software development while
keeping in mind the limits associated with new product development. In software development
there are four broad control factors. These factors are interconnected: when one changes, at least
one other factor must also change.

Cost – or effort. The available money impacts the amount of effort put into the system.

Schedule – A software project is impacted as the timeline is changed.

Requirements – The scope of the work that needs to be done can be increased or decreased to affect
the project.

Quality – Cut corners by reducing quality.

Because software development is often considered a sequential, linear process, middle and upper
management often assumes that all four of these factors can be dictated to the development team
under the waterfall approach. However, software development cannot be described by a simple
linear process because it cannot be predicted accurately in advance. It is therefore unreasonable to
assume that management can control all four of these factors. In reality, management can pick
values for at most three of the four factors, and the development process dictates the fourth. The
highly complex and uncertain nature of software development makes this expectation of full
control unrealistic.
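The "pick three, the process dictates the fourth" constraint can be made concrete with a toy model. The relation below (cost × schedule proportional to scope × quality) is an invented simplification for illustration only; real projects obey no such formula, which is precisely the article's point.

```python
# Toy illustration: fixing three control factors determines the fourth.
# The relation cost * schedule = scope * quality is hypothetical.
def dictated_fourth(cost=None, schedule=None, scope=None, quality=None):
    """Given exactly three factors, return the value the process dictates
    for the remaining one."""
    factors = {"cost": cost, "schedule": schedule,
               "scope": scope, "quality": quality}
    unknowns = [name for name, v in factors.items() if v is None]
    assert len(unknowns) == 1, "management may fix at most three factors"
    if cost is None:
        return scope * quality / schedule
    if schedule is None:
        return scope * quality / cost
    if scope is None:
        return cost * schedule / quality
    return cost * schedule / scope  # quality absorbs the remaining pressure

# Fix cost, schedule, and scope; the process dictates the quality level.
print(dictated_fourth(cost=100, schedule=4, scope=80))
```

Whichever three factors management pins down, the fourth falls out of the equation; trying to pin all four simply makes the model (and the project plan) inconsistent.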

5. Lean Thinking
Another effective way to analyze how agile methods increase efficiency is to apply lean
manufacturing principles to software development. Although cross-industry analysis can be
tenuous, agile methods have their conceptual roots in the Japanese manufacturing productivity
boom of the 1980s.

Consider for example the “small batch” principle: things produced in smaller batches are of higher
quality and made more efficiently because the feedback loop is short: controls can be adjusted more
frequently, and resources are utilized efficiently to avoid “queuing” (see “queuing theory” and the
theory of constraints).
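The small-batch argument can be quantified with Little's law (average cycle time = work in process ÷ throughput). The sketch below uses hypothetical numbers to show how batch size directly sets the length of the feedback loop.

```python
# Little's law: average cycle time = WIP / throughput.
# Hypothetical numbers: a team finishing 5 items per week.
def feedback_delay(batch_size, throughput_per_week=5):
    """Weeks until a batch is complete and its defects can first be observed."""
    return batch_size / throughput_per_week

# A 50-item batch waits 10 weeks for feedback; a 5-item batch waits 1 week.
print(feedback_delay(50), feedback_delay(5))
```

Shrinking the batch does not change total throughput in this model; what it changes is how quickly mistakes surface, which is the quality argument the lean literature makes.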

Second, Agile methods encourage delaying irreversible decisions until the last responsible
moment. Many software development organizations that implement agile software development
are finding they get something they never expected: options. Rather than locking into decisions at
the beginning of a project, organizations can reduce risks by leaving options open to decide at a
better time when more accurate information is available.

Third, the concept of frequent or continuous integration keeps software development teams
synchronized. Teams can work independently for a while, but the code base never diverges for long
periods of time, thereby reducing the risks associated with large integrations at the tail end of
projects.
Chapter 2: Industry Profile
Software Development Company

Vaal Triangle Systems, LLC develops and sells SigmaNEST, a comprehensive software solution
for nesting, NC programming, and cutting of wood, steel sheet, plate, and tube materials.
SigmaNEST is the leading CAD/CAM nesting system for plasma, laser, punch, oxyfuel, waterjet,
router, knife, tube/pipe, and combination cutting machines.

SigmaNEST ensures superior material utilization, machine motion optimization, and maximum
part quality balanced with cutting speed, work flow integration, material handling, accurate
estimates and information management.

Vaal Triangle Systems, based in Cincinnati, OH, was founded in 1993 and has an extensive global
support network with branches in North and South America, Europe, Asia, Australia, India and
Africa.

Vaal Triangle Systems is a multi-regional, multi-branch third-party administrator (TPA) with a
capital of US $2,500,000 that offers a complete line of healthcare management services and
solutions for the insurance and self-funded markets. Vaal Triangle Systems specializes in applying
advanced communication and information technology to improve sheet metal management, which
saves time, reduces paperwork, and improves claims outcomes.

It uses the most sophisticated and detailed computer systems for swift service, to control
malpractice, reduce abuse, and minimize cost. Vaal Triangle Systems continuously uses up-to-date
technology to provide “before the fact” claim management and to administer health insurance
schemes using electronic claim technology. Vaal Triangle Systems has been able to export its
know-how to many Arab countries (Jordan, Syria, Libya, Egypt, and Oman), and has established a
business partnership with a local Omani insurance company.
1. Products & Services (it is all about innovation)

Vaal Triangle Systems LLC provides nesting software solutions. The Company offers fabrication,
integration, business system interfaces, custom development, consulting, support, design,
operations planning, management, and shop floor operation services. Vaal Triangle Systems
International operates in the State of Ohio.

Punching Machine

With its modern approach to punching, SigmaNEST optimizes production through our
AutoDynamic Nesting. This technology balances tool changes and tool choices to maximize
machine run time by eliminating tool changes. We offer a variety of packages for our punch
machine modules to suit your business needs. Use the right technology at the right time.

Features
 Automatic and semi-automatic manual punch tooling options for flexibility
 Flexible microjoint (tabbing) strategies for part/sheet stability
 Easy and powerful tool and tool life management
 Full support for part removal devices from drop doors to robotics
 Support for rolling tools (e.g., Wilson Wheel)
 Managing work order priorities and reporting on machine cycle time
 Reduced machine cycle time and less material waste through common line punching
 Best material optimization with easy nesting and punching in clamp zones
 Powerful configurable post processors to run your machine the way you want
 Single interface for tool library and turret configuration
 Interactive part and nest mode tooling
 Manual repositioning
 Automatic tool sorting
 Manual tabbing and micro-joints
 Drop door support
 Tool safety zones
Benefits
 One software package programs all major profile cutting and punching machines
 Save time with reduced tool changes with auto-sort
 Maximum flexibility in file conversion and importation
 Custom shapes can be saved into standard part library for future use
 Using a single software for multiple machines reduces programming and training time
required
 Save engineering time through simplified programming and more efficient machine output

Supported Machines
 Amada
 Baykal
 Durma
 Ermaksan
 EUROMAC
 Farley
 Ficep
 FINN-POWER
 LVD
 Muratec
 Peddinghaus
 Strippit
 Trumpf
 WA Whitney

Laser Machine

Whether you have a fiber or traditional laser, SigmaNEST is the most advanced laser cutting
software for manufacturing environments. SigmaNEST laser cutting software was created through
years of targeted research and development with nesting and NC programming. Our supported
machines include those with specialty laser types, including punch, multi-axis, and tube.
Features
 Feature avoidance to avoid crossing over dangerous areas
 Support for Punch Laser
 Automatic rule-based cutting condition selection for highest quality parts
 Integrated crop-cutting and scrap cutting to minimize handling time for waste material
 Powerful configurable post processors to run your machine the way it was intended
 Advanced material handling
 Support for robotic removal system
 Support for 3D Tool Path
 Support for tube and pipe

Benefits
 Cutting technology management functions like automatic pulsing, power ramping, feed
rate control, focal height, assist gas and pressure adjustment
 Pre-piercing, pierce “on-the-fly” and pierce reduction, as well as options for fine, fast and
normal piercing
 Material and time savings with bridge cutting, common-line cutting and chain cutting
 Microwelds for Trumpf laser cutting machines
 Various levels of cut quality using Appropriate Quality Cutting (AQC) technology
 Repositioning and automatic cut pick up for cutting on plates that extend beyond the cutting
area
 Selective vaporizing at low wattage to cut protective layer
 Corner ramping precision

Supported Machines

Supported Laser Cutting Machines


Amada, Bristow Laser, Burny, Bystronic, Cincinnati, Durma Makina, Ermaksan, ESAB,
Farley, Finn-Power, Hankwang, Kohtaki, Koike, Laser Lab, LVD, Mazak,
Messer-Griesheim, Microstep, Mitsubishi, MultiCAM, NTC America, Salvagnini, Trumpf,
Tanaka, WA Whitney.
Supported 5 Axis Machines
Mazak, NTC, Prima Power, Trumpf, Omax, Wardjet, Flow
Supported Tube Machines
Amada, BLM, Bystronic, FabriGear, Other 3D tube and pipe laser cutting machines

Oxyfuel Machine

SigmaNEST’s powerful features for oxyfuel cutting machines will improve efficiency, reduce
scrap, and produce clean parts. Our nesting software will handle everything your oxyfuel machine
needs including heavy plate processing, intricate bevel cutting, and multiple torches. We can also
perform advanced beveling with 5-axis support and automatic beveling.

Features
 Nesting strategies to take advantage of multiple torches
 Changes between single-torch and multi-torch nests
 Post processors that intelligently manage your torch spacing, whether it be automatic or
manual
 Thermal locks to keep the plate intact when cutting thick plate
 Pre-piercing with the torch or by drilling to improve process reliability and extend
consumable life
 Chain & bridge cutting to eliminate unnecessary pierces
 Nest on multiple side-by-side sheets

Heavy Plate Cutting


 Chain cutting for fewer pieces
 Thermal locks for skeletal integrity
 Powerful lead-ins for best possible cut starts
 Extended torch tip life and better quality cuts through pre-piercing
 Remnant management for best utilization of your high-value material
 Feed rate ramping on cutting machine for highest quality holes and slots
 Easy edge starts with click and drag pierce points

Bevel Cutting
 V, Y and K bevels
 Automatic height sensing management
 Swarf cutting
 Variable angle bevel
 Blind bevel
 Automatic multi-pass sequencing
 Triple-torch start-up windows
 Full position vector post (X, Y, Z, I, J, K)
 Rotator mapping in post or control
 Bevel feature recognition
 Multiple corner loop types
 5-Axis bevel support
 Nesting for bevels to ensure clearance for maximum material utilization
 Kerf in computer or control
 Part settings for bevels, grain constraints and lead-in/out can be saved to the part library
for future use
 Detailed graphical reports support accurate production planning, routing and scheduling

Supported Machine
AKS, Alltra, Burny, C&G, ESAB, FICEP, Messer Cutting Systems, Farley, Koike,
MultiCAM, Peddinghaus, Praxair, Whitney.
Waterjet

Enhance the power of your waterjet cutting machines with SigmaNEST. Automatic corner
ramping eliminates tail wash and gouging of material, producing higher quality parts. SigmaNEST
supports multi-nozzle cutting on new and remnant material. Create the perfect balance of feed rate
based on material, thickness, number of nozzles, orifice size and machinability index.

Features
 Information is stored with the part file for later use
 Interactive layer mapping on CAD files
 Different processes per layer
 Different quality settings per layer
 Variable feed rate based on material

Benefits
 Feed rate ramping to eliminate tail wash gouging
 Definable edge quality
 Intricate detail cuts through automatic acceleration/deceleration programming
 Stack cutting capabilities
 Optimal use of consumables through pierce reduction

Router

SigmaNEST offers an advanced nesting solution that optimizes the technology of a CNC router
with high-efficiency nesting, superior part quality, and significant control of standard router
cutting functions. As the leading router software for nesting and programming single or multi-
spindle wood and metal CNC routing machines, SigmaNEST is ideal for high-efficiency
production in frame shops, boat and furniture manufacturers. Additionally, SigmaNEST extends
tool life and lowers tooling costs.
Features
 Directly import multi-depth information from 3D CAD files
 Extend tool life with variable depth control
 Maximize material yield through common-line nesting/cutting
 Minimize vacuum loss and enhance sheet stability
 Auto program runs both sides for near and far sighted
 Support for aggregate “I”, “T”, and “L” formations
 Advanced NC logic including spiral inward cutting for vacuum hold-down efficiency
 Support of automatic and manual cutting for multi-head
 Recognition of partial depth and part edge for “auto pocket destruct”
 Seamlessly import from solid CAD programs with automatic recognition of 3D models
with z-depth
 Automatic recognition for gang drilling
 Automatic auxiliary for detail correcting
 Stability sequencing
 Small part handling

Additional Features & Benefits
 Single and multi-head management
 Efficient stack cutting
 Multi-pass cutting
 Static nesting for the highest efficiency for cutting times
 Tool oscillation to extend bit life
 Multi-depth cutting and controls, including Z-depth tab
 Z-ramp lead-ins to guard against fires
 Contour lead-ins
 Mortise and tenon
 Metal router riveting
 Pocketing
 Win more jobs with faster and more accurate quoting
 Customize tool order to optimize part quality
 Set multi-pass cutting by contour to reduce machine time or nesting to increase vacuum
efficiency
 Optimize balance between throughput and material yield
 Minimize tool and spindle damage by applying ramp-up lead-ins/lead-outs
 Extend tool life and part quality with multi-pass cutting features allowing for step down
cuts
 Considerably reduce cutting time with onion skin tabbing and 3D tabbing

Knife

The SigmaNEST Knife Cutting module offers an advanced nesting and NC programming solution
to optimize CNC knife cutting technology through tight and accurate nesting, improved part
quality and significant control of standard knife cutter functions. Our nesting software’s extensive
list of features will transform your knife cutting operations.

Features
 Accurate corner cutting with angle recognition and “pizza wheel” control
 Automatic processing of flat pattern data from 3D CAD systems like SOLIDWORKS,
Autodesk Inventor, PTC Creo, Siemens NX as well as IGES, DXF, DWG
 Ply identification – labels attachments, supports fixed and indexable marking systems
 Parametric oversize condition to the periphery of the ply
 Part structure and ply fiber management
 Repositioning functionality allows for full hands-off automation and unattended running
 Hop over and cut back functionality
 Higher material usage and process efficiency when paired with SigmaNEST composite
nesting (separate module)
Benefits
 Extend tool life and lower tooling costs
 Improve machine time and reduce part movement by utilizing ramping
 Improve workflow planning with accurate time and cost estimates
 Produce cleaner cuts and accurate sharp corners by setting compensation values for each
blade
 Optimize stack height for multi-layer nesting

Supported Machines
Aeronaut Automation, Aristo, Assyst Bullmer, Atom, Autometrix, Blackman and White, Eastman,
Exact.

Tube and Pipe

SigmaTUBE® is our complete tube and pipe cutting software supporting round, square,
rectangular, or triangular tube/pipe along with structural material such as I-beams, H-beams, C-
channel, angle iron, and other user-defined shapes. Custom programs are available to fully
maximize the advanced features of Mazak FabriGear, Trumpf, BLM, Bystronic, Amada, and other
3D tube and pipe laser cutting machines. SigmaTUBE generates NC code without exporting
assemblies or parts from SOLIDWORKS. In addition, a wide variety of popular and neutral solid
CAD file formats are supported, and SigmaTUBE contains its own library of standard shapes.

Features
 Weldments
 Automatic separation by cross section
 Instance count control from BOM
 Revision control (auto flag on part change)
 Automatic or manual lead-in/lead-out placement
 3D simulation of the cutting process that shows part in process, cutting head, and machine
 Intuitive arrangement of tools and user interface
 Part/Torch collision detection
 Tube recognition
 Beam recognition
 Cut-Out & End-Cut feature recognition
 Rotary Cutting
 Sketch protection and wrap
 Feature suppression for cutting
 Space frame and tube frame
 Cutting technology database
 Intelligent feature recognition

Benefits
 Complex programming made EASY
 Faster turn-around time for orders
 Optimized material usage and machine performance
 Empower your programmers to be more flexible and productive
 No separate CAM system required
 Automatically generate NC code tool path for solid part geometry.

Press Brake

SigmaNEST brings many efficiencies used in sheet metal cutting to press brake technology. Our
bending software, SigmaBEND™, maximizes the speed, quality and flexibility of bending
processes with full 3-D simulation. SigmaBEND™ supports different bending processes like air
bending, die bending and 3-point bending. Machine options like angle measurement systems and
lifting aids are also supported.
Features
 Bend sequence calculation
 Collision detection
 Press brake tooling plan
 Bend allowance based on actual tooling
 Accurate unfolded blank size
 Press brake work instruction sheet
 Eliminates trial and error test parts on the press brake

Speed
 Faster, more reliable programming away from machine tool
 Shorter programming time with CAD integration and automatic features
 Shorter set-up time with quick access to fabrication information
 Better re-use of NC programs

Quality
 Centralized database
 Integrated with the CAD/CAM system
 Fewer design errors
 Check the process via realistic bending simulation
 Eliminate costly programming errors with collision check

Quality
 CAM system direct switching between different press brakes
 Better production planning
 Less dependent on a specific person’s knowledge with standardized programming
Chapter 3: Company Profile

Vaal Triangle Systems, the 3DEXPERIENCE Company, provides software applications and
services, designed to support companies’ innovation processes. The Company’s software
applications and services span design from ideation, to early 3D digital conceptual design
drawings to full digital mock-up; virtual testing of products; end-to-end global industrial
operations, including manufacturing management to operations planning & optimization; and in
marketing and sales from digital marketing and advertising to end-consumer shopping experience.

The Group brings value to over 18,000 customers of all sizes, in all industries, in more than
140 countries. Vaal Triangle Systems is the world leader of the global Product Lifecycle
Management (“PLM”) market (design, simulation, manufacturing and collaboration) based upon
end-user software revenue, a position which it has held since 1999.

Vaal Triangle Systems was established in 1981 through the spin-off of a small team of engineers
from Vaal Triangle Systems Aviation, which was developing software to design wind tunnel
models, and thereby reduce the cycle time for wind tunnel testing, using modeling in three
dimensions (“3D”). The Company entered into a distribution agreement with IBM the same year
and started to sell its software under the CATIA brand.

With the introduction of its Version 3 (“V3”) architecture in 1986, the foundations of
3D modeling for product design were established. Through its work with large industrial
customers, the Company learned how important it was for them to have a software solution that
would support the design of highly diversified parts in 3D. The growing adoption of 3D design for
all components of complex products, such as airplanes and cars, triggered the vision for
transforming the 3D part design process into an integrated product design.

The Version 4 (“V4”) architecture was created, opening new possibilities to realize full digital
mock-ups (“DMU”) of any product. The V4-architected software solutions helped customers
reduce the number of physical prototypes and realize substantial savings in product development
cycle times, and it made global engineering possible as engineers were able to share their ongoing
work across the globe virtually. In order to fulfill the mission to provide a robust 3D Product
Lifecycle Management (“PLM”) solution supporting the entire product lifecycle from design to
manufacturing, the Company developed and introduced its next software architecture in 1999,
Version 5 (“V5”). In conjunction with its strategy and product portfolio development plans, the
Company undertook a series of targeted acquisitions expanding its software applications portfolio
offering to include digital manufacturing, realistic simulation, product data management and
enterprise business process collaboration.

In 2012, the Company unveiled its current horizon, 3DEXPERIENCE, based on the Company’s
technology architecture Version 6 (“V6”) and designed to support its clients in their innovation
process so that they can invent the future of their users’ experiences. 3DEXPERIENCE builds
upon the Company’s work in 3D, DMU, and PLM, and reflects the evolution Vaal Triangle
Systems began to see among its clients in different industry verticals.

It can be used on premises or online, in a public or private cloud. With 3DEXPERIENCE, the
Company expanded its purpose to encompass the harmonization of product, nature and life, and
moved to an industry go-to-market strategy.

HISTORY
1981
Creation of Vaal Triangle Systems to design products in 3D through the spin-off of a team of
engineers from Vaal Triangle Systems Aviation; the Company’s flagship brand, CATIA, is
launched; worldwide marketing, sales and support agreement with IBM, the beginning of a
long-standing partnership; initial industry focus: automotive and aerospace.
1986
V3 software introduced for 3D design.
1994
V4 architecture introduced, offering a new technology enabling the full Digital Mock-Up
(“DMU”) of a product and allowing customers to significantly reduce the number of physical
prototypes and to gain a complete understanding of the virtual product; expansion of the
Company’s industry focus to seven industries, adding fabrication and assembly, consumer goods,
high-tech, shipbuilding and energy.
1996
Initial public offering in Paris and listing on the NASDAQ (the Company voluntarily delisted
from the NASDAQ in 2008).
1997
Broadening of the Company’s 3D design product line to the entry-level 3D market with the
acquisition of the start-up SOLIDWORKS, with a Windows-native architecture, targeting
principally the 2D-to-3D migration market opportunity; formation of the Company’s Professional
channel, focused on marketing, sales and support of SOLIDWORKS.
1998
Creation of the ENOVIA brand, focused on management of CATIA product data, with the
acquisition of IBM’s Product Manager software.
1999
Launch of V5, a new software architecture for the PLM market designed for both Windows NT
and UNIX environments; expansion of the ENOVIA product line with the acquisition of
SmarTeam, focused on product data management for the small and mid-sized business (“SMB”)
market.
2000
Creation of the DELMIA brand, addressing the digital manufacturing domain (digital process
planning, robotic simulation and human modeling technology).
2005
Creation of the SIMULIA brand, addressing realistic simulation and representing a significant
expansion of the Company’s simulation capabilities, leveraging the acquisition of Abaqus;
creation of the Company’s PLM Value Solutions sales channel, an indirect channel for the PLM
market specifically focused on supporting SMB companies.
2006
Expansion of the ENOVIA portfolio with the acquisition of MatrixOne, a global provider of
collaborative PDM software and services; expansion of the Company’s industry focus from
seven to 11 industries.
2007
Amendment of the IBM PLM partnership agreement, outlining the progressive assumption of
full responsibility for the Company’s PLM Value Solutions channel; creation of the 3DVIA
brand: building upon several years of research and investment, 3DVIA was launched to bring
3D technology to new users to imagine, communicate and experience in 3D; further expanding
its product offering for CATIA, the Company acquired ICEM, a company well known in the
automotive industry for its styling and high-quality surface modeling and rendering solutions.
2008
Introduction of the Company’s V6 architecture.
2010
The Company acquires full control of its distribution sales channels with the acquisition of IBM
PLM, the IBM business unit dedicated exclusively to the marketing, sale and support of the
Company’s PLM software; acquisition of Exalead, a French company providing search
platforms and search-based applications for consumer and business users.
2011
DELMIA’s offering expands with the acquisition of Intercim, offering manufacturing and
production management software for advanced and highly regulated industries; 100% of the
Company’s total revenues are now derived from its three wholly-directed sales channels,
completing the transition from IBM begun in 2005.
2012
Expansion of the Company’s strategy to 3DEXPERIENCE and expansion of the Company’s
purpose.

Vaal Triangle Systems’ Purpose and Strategy

Vaal Triangle Systems’ corporate purpose is to provide businesses and people with
3DEXPERIENCE universes to imagine sustainable innovations capable of harmonizing product,
nature and life. A growing number of companies in all industry verticals are evolving their
innovation processes to imagine the future both with, and for, their end-consumers. To meet this
challenge, it is vital to ensure collaborative work processes internally and externally to the
enterprise, with designers, engineers, researchers and marketing managers as well as external ad
hoc participants, because the innovation flow comes from many directions. Enabling this flow
unleashes the innovation potential. Vaal Triangle Systems, with its 3DEXPERIENCE platform
leveraging its V6 architecture, provides this “linkage”, enabling its clients to create the value that
their ultimate consumers are seeking.

The Company’s 3DEXPERIENCE portfolio is designed to support 3D realistic virtual experiences
representing the usage of future products, and is comprised of social and collaborative applications,
3D modeling applications, simulation applications, and information intelligence applications. For
Vaal Triangle Systems to be able to help its customers simulate the end-consumer experience, it
is important to have a complete understanding of the most critical business needs of the industries
in which its customers operate.

Therefore, Vaal Triangle Systems has adapted its organization to provide a strong focus on the
users of its software through its brand structure, while at the same time advancing the
understanding and development of the needs of its 12 target industries through the combined action
of its organization by industry, its sales channels and its local geographic presence. Vaal
Triangle Systems has brought value to customers since its inception in 1981 by providing solutions
in 3D design for product creation, DMU for replacing physical mock-ups, and PLM covering the
product’s whole life, from design to manufacture and service. Now Vaal Triangle Systems has
crossed into the next stage of its vision of the future: the 3DEXPERIENCE era, where helping
customers reach a new milestone in innovation, for greater end-user satisfaction, is the new way
of doing business.
Chapter 4: Literature Review
1. Background and progress

Software engineering provides the procedures and practices to be followed in software
development and acts as a backbone for computer science engineering techniques. A software
development process is a structure imposed on the development of a software product, grounded
in the theory of software engineering, and it is used to build a wide variety of software. Software
development methods attempt to answer the business community's demand for lighter-weight,
faster and nimbler software development processes.

The term agile software development comes from project management, so what is project
management? As William R. Duncanville writes in “A Guide To The Project Management Body
Of Knowledge”, “Project management is the application of knowledge, skills, tools, and
techniques to project activities in order to meet or exceed customer’s requirements and
expectations from a project”. Every project aims to produce and deliver products or services that
meet customer requirements; in the process, people invest resources and convert them into the
outputs of the project.

Before the 1960s, computers had only just been put into practical use, and software was often
designed and written only for a particular, specified application. The scale of software was
relatively small, documentation was usually absent, and systematic development methods were
rarely used; design and programming were often equated.

In the mid-1960s, large-capacity, high-speed computers enabled the rapid expansion of
computer applications, and the quantity of software under development increased dramatically.
The appearance of high-level programming languages and operating systems changed the way
computers were applied.

Large amounts of data processing led to the birth of the first generation of database management
systems. Software systems became larger and more complex, and software reliability problems
became more prominent. The original personal designs and individual methods could no longer
meet the requirements; software production needed a new mode.
Process of software development

2. Traditional development methods


The software crisis has, to a large extent, promoted the maturation of software engineering. In the
1990s, software development began to use repeatable, documented processes based on the
theoretical system of software engineering. At that time, some traditional models emerged. The
waterfall model was representative of traditional software project management and occupied a
very important position.

The waterfall model (shown in Figure 2) “is a model which was developed for software
development; that is, to create software. It is called as such because the model develops
systematically from one phase to the other in a downward fashion, like a waterfall”[]. The
waterfall model emphasizes the software development cycle shown in the figure: each stage
should be planned, and the investment of time and manpower and the use of related technologies
in each step should be thoughtfully deployed.

At the end of each step, the results should be reviewed; when the customer is satisfied with the
results, the next step can begin. The waterfall method is best suited to users whose needs are
fixed or whose results are predictable. The advantages and disadvantages of the waterfall model
are shown in Figure 3.
Fig. 2. Waterfall Model
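The strictly sequential hand-off the model prescribes can be sketched in a few lines of Python. The phase names follow the classic model described above; the `run_waterfall` function and the placeholder work functions are illustrative assumptions, not part of the source text.

```python
# Minimal sketch of the waterfall model's strict phase sequencing:
# each phase runs to completion before the next, with no going back.

def run_waterfall(phases):
    """Run each (name, work) phase in order; later phases may only
    consume the outputs of earlier ones, never revisit them."""
    artifacts = {}
    for name, work in phases:
        artifacts[name] = work(artifacts)  # each phase consumes earlier outputs
    return artifacts

# Placeholder work functions standing in for real engineering activities.
phases = [
    ("requirements",   lambda a: ["spec"]),
    ("design",         lambda a: ["diagrams from " + a["requirements"][0]]),
    ("implementation", lambda a: ["code from " + a["design"][0]]),
    ("testing",        lambda a: ["test report"]),
    ("maintenance",    lambda a: ["patches"]),
]

result = run_waterfall(phases)
print(list(result))  # phases complete strictly in order
```

Note that the loop has no way to return to an earlier key once it has moved on, which is exactly the rigidity the iterative methods discussed later are designed to avoid.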

Structured Analysis and Structured Design (SASD)[6] is a software development method put
forward in the 1970s by Yourdon and Constantine. It emphasizes dividing the whole project or
task into sub-projects or sub-tasks and depicting the various relationships between them. This
saves time and greatly improves efficiency. SASD became one of the most popular software
development methods of the 1980s, and IBM incorporated the approach into its software
development process. Of course, some people have criticized SASD because it ignores the
participation of users.

With the further development of software project management, the discipline began to emphasize
self-adaptation in the face of varying market requirements. Traditional software project
management could no longer meet all aspects of these requirements, and agile software
development emerged. The term agile development has been in use since the Agile Manifesto of
2001, and agile development went on to achieve great success. More and more people began to
pay attention to it and chose agile development to complete their own projects. Today, agile
development has many variants, for example Extreme Programming, Adaptive Software
Development, Lean Software Development and so on.

3. What exactly is agile software development?

The concept of agile development was proposed in 2001 by the agile team, after which many
software development teams and companies recognized and accepted it, and it gradually came
into wide use in many projects. The Agile Manifesto, shown in Figure II, was published at the
same time, signaling that software development had entered a new era.

4. Principles behind the Agile Manifesto

1) Our highest priority is to satisfy the customer through early and continuous delivery of
valuable software: Agile development is an iterative method built on the incremental delivery of
valuable software. Continuous delivery reflects the continuing iterative process of agile
development. Early delivery of valuable software makes it possible to listen to customers’ advice
as soon as possible and to avoid deviating from the users’ real requirements. The earlier an error
is found, the smaller the cost of correcting the deviation. Through this principle, customers fully
experience the software company’s efficiency, attentiveness and commitment to satisfaction.
2) Welcome changing requirements, even late in development: Even late in development,
agile development is willing to make appropriate changes according to the requirements.
Software is valuable only when it really meets the needs of users and the market. Because agile
development reserves space in the system design for changes, it can minimize the costs arising
from changing requirements.
3) Deliver working software frequently, from a couple of weeks to a couple of months, with
a preference to the shorter timescale: Short iterations ensure that the project team cooperates
closely with customers. In each new delivery, the project team delivers improvements to the
software or adds new features on the basis of the previous delivery, and these improvements and
new features must be tested, working, and of a quality that meets the release standards.
4) Business people and developers must work together daily throughout the project:
Information is inevitably distorted in transmission. When business people describe the
requirements of customers to the developers, the developers may misunderstand the business.
Therefore, throughout the project, business people and developers need frequent and meaningful
interaction to identify problems as early as possible. Working together every day, in both time
and space, ensures that communication between business people and developers is convenient.
5) Build projects around motivated individuals. Give them the environment and support they
need, and trust them to get the job done: Only with such trust, motivation and support is the full
potential of all team members released.
6) The most efficient and effective method of conveying information to and within a
development team is face-to-face conversation: In a large team, transferring knowledge in the
form of documents may be appropriate, but an agile team normally has only 7 to 10 people, and
communicating through documents wastes a great deal of time on writing them. Face-to-face
conversation between team members transmits information more quickly and efficiently.
7) Working software is the primary measure of progress: In agile development, each iteration
delivers working software, so the measure of progress is no longer the number of lines of code
written or the number of test cases executed, but the amount of software that has been tested,
meets the release standards and works.
8) Agile processes promote sustainable development. The sponsors, developers, and users
should be able to maintain a constant pace indefinitely: In software companies, overtime is very
common. However, agile development opposes completing iterative tasks through overtime;
overtime leads to fatigue and boredom among team members, making it difficult to sustain
efficiency.
9) Continuous attention to technical excellence and good design enhances agility: Agile
development requires responding positively to change, and communication is also important; the
more attention people pay to good design and technology, the better their agility becomes.
10) Simplicity–the art of maximizing the amount of work not done–is essential: No one can
completely and accurately predict changes in requirements. Agile teams advocate that everyone
should look for the simplest way to solve the current problem, rather than building software
features that may be required in the future. Agile development does not advocate using complex
technology to implement software.
11) The best architectures, requirements, and designs emerge from self-organizing teams: A
self-organizing team communicates positively to form a common work ethic and culture. Such
teams do not need detailed instructions, which gives team members more confidence.
12) At regular intervals, the team reflects on how to become more effective, then tunes and
adjusts its behavior accordingly: While a project is in progress, not only may the requirements
change, but there are also many uncertain factors, such as changes in team membership; working
strictly to a predefined plan makes agility difficult to achieve. At certain intervals, the team
therefore needs to reflect on its work and make appropriate adjustments.

5. Differences between the agile development and traditional development

So what exactly is agile software development, and what are the differences between it and the
traditional software development model? Before giving a definition, we first compare agile
development and traditional development on the basis of the Agile Manifesto. It is not difficult
to see that agile development clearly emphasizes two points: one is adaptability, the other is
teamwork.

Differences between agile development and traditional development


1) Adaptability:
Unlike the traditional model, agile development places more emphasis on adaptability than on
the predictability stressed by the traditional model. Those who choose the traditional
development mode, when starting a project, always begin with very detailed and complete
documentation. They analyze the entire development process and the details of each sub-process,
for example how much time or how many people will be invested, and finally write the results
into the document. Most importantly, the document cannot be changed once it has been approved.
All project developers are required to follow the document strictly; changing the document or
the time plan is not allowed. This development model has its advantages when the requirements
do not change from beginning to end, but people gradually found many problems with it. Firstly,
the design documentation takes a lot of time, because a very comprehensive document requires
all the details of the project to be worked out in advance. Creating such documentation is not a
problem in itself, but it lowers development efficiency.
Most modern projects hope that requirements will not change throughout the project, but this
is unrealistic: too many factors cause requirements to change. Agile is willing to accept changes,
even late in the software development process. Its methods of system design allow system
builders to respond quickly to changes in customer demand, ensuring that the result of the last
iteration is what the customer really needs and that it meets the changes of the market. Unlike the
waterfall model, agile development does not rigidly follow a plan; it develops only a rough plan
at the beginning of a project, providing more space for the project to change. Secondly, if no real
users participate in the requirements definition process, the definition may fail to match the end
users’ work habits.
Even if the requirements are defined and then shown to the end user for confirmation, there is
still a risk that they will change. The reason is that everything is imaginary until the user can run
the system: users cannot read the program, so when something is unreasonable, they cannot see
it. Let us assume that the user has already carefully participated in the confirmation of
requirements and most of the
2) Teamwork:
The goal of traditional process management is to ensure that processes within the organization
execute as expected and that the defined process is strictly adhered to. Document-centric
processes tend to define people’s roles as interchangeable, reliable machine parts. Agile software
processes are people-centered rather than process-centered. They hold that individuals and their
interactions matter more than processes and tools, and that practice is the life of a methodology.
A key point of agile development is to let people accept a process rather than impose one on them.
Developers must have the right to make all decisions on technical matters. Process comes
second and should therefore be minimized. The center of agile development is to establish a
project team of motivated staff, give them the necessary environment and support, and have full
confidence in their work. Within the project group, the most useful and most effective way of
communicating is face-to-face conversation, which embodies the human-centered principle.
Compared with a document-centric approach, this form of oral interaction is faster and more
efficient at transmitting information and solving problems.
It also requires that, during the entire project development, developers and business people
are always together. This focus on communication between team members is the embodiment of
the agile development project. For business people, the end of research is not to write a
requirements analysis and send it to the developers, but to communicate effectively with the
developers to ensure that they understand the business correctly.
Chapter 5: Objectives
OBJECTIVES
The high-level target of AGILE is to obtain a significant reduction in aircraft development costs
through the implementation of a more competitive supply chain at the early stages of design.
This will be achieved by pursuing research on the project’s four technical objectives.

1. Notwithstanding the availability of powerful software systems to integrate complex
computational design processes, today there is a lack of quantified knowledge on how
optimization workflows involving many disciplines, ranging up to high-fidelity codes,
should be set up in the most effective and efficient way. To this purpose, the first
objective of AGILE is the structured development of advanced multidisciplinary
optimization techniques and their integration, reducing the convergence time in aircraft
optimization.
2. Today’s advanced analysis codes and software tools are mostly discipline-specific and well
understood by disciplinary experts. However, the operation of the system of tools as a
whole and the interpretation of the results are additional challenges in the collaboration
between the disciplinary specialists and the aircraft generalists. Therefore, the second
objective of AGILE is the structured development of processes and techniques for efficient
multisite collaboration in the overall design teams.
3. Mastering complex systems highly depends on the exploitation of knowledge. Besides the
interaction of experts, the smart handling of data, information and knowledge using
information technologies offers high potential. Thus, the third objective of AGILE is the
structured development of knowledge-enabled information technologies to support
interdisciplinary design campaigns.
4. The fourth objective of AGILE is to develop and publish an Open MDO Test Suite,
enabling the access to the project technologies by other research activities, and providing
a reference database for future aircraft configurations research.

In summary, AGILE focuses on the development and dissemination of the knowledge and skills
required to exploit the potential that the latest IT technologies offer in the field of collaborative
design and MDO.
Chapter 6: Research Methodology
Software Development Process

The example software development process is shown in Figure A. The boxes represent the
software development process kernels. Software Unit Testing, Software Component Testing,
Software Configuration Item Testing, Validation Test, and Verification and Validation Test
Design are the kernels that will be studied in detail in this course. The following paragraphs and
frames discuss each kernel and the test-related activities that go on during it. The discussions of
these kernels are to be considered general guidelines and are tailored to the size of the project.
The kernel concept and process were demonstrated by Humphrey in Chapter 13 of the referenced
book. The Entry, Task, Verification, and Exit (ETVX) paradigm is a concept initially developed
by a group at IBM (Radice et al.).

The example software development process shown in Figure A is based on a predefined
repository of process "kernels" from which the testing, verification and validation life cycle for a
given project can be defined. A "kernel" is defined for each function, such as Requirements
Analysis, Document Review, Code Analysis, Unit Testing, etc. Each "kernel" defines entry
criteria, inputs, activities, exit criteria, outputs, process controls, and metrics.
Entry Criteria describe the circumstances under which a kernel becomes activated. All entry
criteria should be fulfilled before commencing the activities defined for the kernel. If some entry
criteria cannot be fulfilled, a work-around may be necessary. Any such deviation from what is
prescribed in the kernel must be handled so as to maximize risk reduction and minimize adverse
impacts on quality, and all deviations must be documented appropriately.
Inputs identify the data items that are required to support the activities of the kernel. For the
most part, these are outputs of other kernels or products of the software development process such
as test plans or design documents.
Activities describe a minimum set of actions that will produce the output items and meet the
exit criteria objectives. For each related set of actions, step-by-step procedures are available to
support consistency among analysts, adherence to proven practices, and training. If not all
activities can be performed, management steps to reduce risk should be taken; they should be
noted in the output products (such as the Requirements Analysis Report), and the kernel closed.
Exit Criteria identify the circumstances under which the kernel is completed or de-activated.
They include delivery or presentation of results and the passing of information to other kernels
(such as the passing of comments to the Configuration Management kernel for tracking).
Outputs identify products of the kernel activities and are either deliverable items or are
required to support other kernels.
Process Controls define quality assurance activities that are performed for the kernel. These
are detailed in the Project Management and Quality Assurance kernels and are documented in the
IV&V Project Management Plan.
Metrics are the categories of measures collected and maintained for each kernel. The details
of each metric are specific to each kernel and are defined in a Metrics Program Plan. The metrics
allow the monitoring of trends and identification of problem areas.
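The seven-part kernel record described above can be sketched as a simple data structure. The field names mirror the parts listed in the text; the `Kernel` class, the `may_start` helper and the sample criteria are illustrative assumptions, not part of the source process.

```python
from dataclasses import dataclass, field

@dataclass
class Kernel:
    """One ETVX-style process kernel; the fields mirror the seven parts
    described in the text. Illustrative sketch only, not the source process."""
    name: str
    entry_criteria: list    # circumstances under which the kernel activates
    inputs: list            # data items, usually outputs of other kernels
    activities: list        # minimum set of actions producing the outputs
    exit_criteria: list     # circumstances under which the kernel completes
    outputs: list           # deliverables or inputs to other kernels
    process_controls: list = field(default_factory=list)  # QA activities
    metrics: list = field(default_factory=list)           # measures collected

def may_start(kernel, satisfied):
    """All entry criteria should be fulfilled before activities begin;
    otherwise a documented work-around would be needed."""
    return all(c in satisfied for c in kernel.entry_criteria)

# Hypothetical instance for the Software Unit Testing kernel.
unit_test = Kernel(
    name="Software Unit Testing",
    entry_criteria=["code compiled", "test cases defined"],
    inputs=["unit test plan", "design documents"],
    activities=["execute unit tests", "record results"],
    exit_criteria=["all planned tests run"],
    outputs=["unit test report"],
)

print(may_start(unit_test, {"code compiled"}))  # one criterion unmet
print(may_start(unit_test, {"code compiled", "test cases defined"}))
```

Modeling each kernel this way makes the hand-offs explicit: the `outputs` of one kernel become the `inputs` of another, which is exactly how the repository of kernels composes into a project life cycle.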

Example Software Life Cycle Kernel Model


Step 1 - Software Requirements Analysis Phase Kernel.
This kernel involves the further derivation of the system requirements that have been allocated
to software. This kernel is included for all software development projects. Requirements provide
a clearly stated, verifiable, and testable foundation for Software Engineering. Guidelines must be
provided by management and followed when specifying and identifying requirements. The
requirements clearly define the capabilities and performance of the end software product. For data-
driven or data-intensive systems, the requirement specification activities also address data sources,
types, and rates.
The requirements are managed throughout the development process. The requirements analysis
represents an agreement, at the beginning of the software development process, between the
developers and the customer on what the delivered software product will be at completion. This
is just the beginning of the activity: requirements review and analysis are conducted throughout
the life cycle as requirements change or as new requirements are added. The primary
responsibility of the testers in this phase is to ensure that requirements are testable.
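As a toy illustration of the testability review described above, the following sketch flags requirement statements containing vague, unverifiable wording. The word list, function name and sample requirements are all hypothetical, not drawn from the source process.

```python
# Hypothetical heuristic: flag requirement statements that contain vague
# terms, since such wording gives testers no verifiable pass/fail criterion.
# The word list below is illustrative only.
VAGUE_TERMS = {"user-friendly", "fast", "flexible", "adequate", "etc."}

def is_testable(requirement: str) -> bool:
    """Return True if no vague term appears in the requirement text."""
    words = requirement.lower().replace(",", " ").split()
    return not any(term in words for term in VAGUE_TERMS)

print(is_testable("The system shall respond within 200 ms"))  # True
print(is_testable("The interface shall be user-friendly"))    # False
```

A real review would of course go beyond keyword matching, but the point stands: a requirement with a measurable criterion ("within 200 ms") can be tested, while one stated in vague qualities cannot.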

Step 2 - Domain Analysis Kernel


An analysis is performed to identify existing non-developmental software (NDS) components.
NDS may take the form of existing, reusable software or COTS software. A make versus reuse
(including modifications) cost decision is made by the software engineering team. Software reuse
also may occur at the design level. As part of the analysis, existing designs that may be adapted or
modified should be identified. No activities are conducted here which are directly related to testing.

Step 3 - Verification & Validation Test Design Kernel


This kernel includes the definition of unit, software component, and software configuration
item test cases, and data used by Software and Test Engineering to verify that the product is
working as expected. These tests include functional tests, out-of-bounds tests, static and dynamic
stress tests, and limit tests. Validation testing is acceptance testing of the software by or for the
customer.
The tasks involved in this kernel include the creation of software test plans, methods,
descriptions, and procedures. The amount, type, and formality of testing are determined by the
requirements for security, the size of the development effort, and the complexity of the
algorithms and data structures. Software & Test Engineering will establish traceability between
the requirements analysis products and the validation tests.
This kernel starts refining some of the test planning activities begun during system planning.
The software is better understood by this time, and Software & Test Engineering needs to
fine-tune the test tools, drivers, stubs, simulators, stimulators, emulators, and the types of data
that are needed. The term verification test will be used for phases of test planned and conducted
by Software Engineering; validation test will be used for phases of test planned and conducted
by Test Engineering. The phases listed here are the ones that will be discussed in the rest of the
course units.

Software & Test Engineering


· Test phase planning: tools & data
· Software Engineering: verification test planning
· Unit test
· Software component test (integration)
· Software configuration item test
· Regression test
· Test Engineering: validation test planning
· Function & system tests
· Installation & acceptance tests
· Regression test
· Outputs: phase-level test plans
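As a hedged illustration of the limit and out-of-bounds tests named above, consider a hypothetical unit that accepts a percentage (the function, its valid range, and the test names are invented for this sketch, not taken from the process description):

```python
def set_throttle(percent):
    """Hypothetical unit under test: accepts 0-100 inclusive."""
    if not 0 <= percent <= 100:
        raise ValueError("throttle out of range")
    return percent / 100.0

def test_limits():
    # Limit tests exercise the exact boundary values.
    assert set_throttle(0) == 0.0
    assert set_throttle(100) == 1.0

def test_out_of_bounds():
    # Out-of-bounds tests probe just past each boundary.
    for bad in (-1, 101):
        try:
            set_throttle(bad)
            raise AssertionError("value should have been rejected")
        except ValueError:
            pass

test_limits()
test_out_of_bounds()
```

Functional tests would then cover the nominal values between the limits.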

Step 4 - Software Architecture Design Kernel


This kernel is the high-level design of the software. It includes the definition of the software
components and their structure and interaction. The software components, which are to be
developed, are identified in this kernel. The System, Software, and Test Engineering teams have
the responsibility to analyze requirements in response to change and to produce testable
requirements and, if needed, a disclaimer list.
Step 5 - Interface Design Kernel
This kernel involves the early definition of the interfaces between each of the software units.
It also includes the definition of interface external to the software (e.g., hardware and user
dependencies) and software parameters such as data type and structure definitions. The System,
Software, and Test Engineering teams are responsible for identifying the software units in this
kernel and phase of development. They also have the responsibility to analyze requirements in
response to change and to produce testable requirements and, if needed, a disclaimer list.

Step 6 - Data Structure Design Kernel


This kernel represents the detailed data structure design. The internal file structures,
relationships, and data formats are defined either graphically or through a design language. The
data design should conform to third normal form optimized for performance. If the chosen data
design language uses the same syntax as the implementation language, the Software Manager must
ensure that premature coding is not used to describe the design. No directly related test activities
are conducted in this kernel. Another contractor, or the customer, may conduct independent
verification and validation of the data structure design at this kernel.
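As an illustration of the third-normal-form guidance above, the sketch below shows a denormalized record split so that non-key attributes depend only on their key (the tables and field names are invented for this example):

```python
# A denormalized order record repeats customer data on every row.
orders_flat = [
    {"order_id": 1, "customer_id": 7, "customer_city": "Oslo", "item": "bolt"},
    {"order_id": 2, "customer_id": 7, "customer_city": "Oslo", "item": "nut"},
]

# Third normal form: non-key attributes depend only on the key, so
# customer_city moves to a customers table keyed by customer_id.
customers = {7: {"city": "Oslo"}}
orders = [
    {"order_id": 1, "customer_id": 7, "item": "bolt"},
    {"order_id": 2, "customer_id": 7, "item": "nut"},
]

def order_city(order):
    # Re-join at query time instead of storing the city redundantly.
    return customers[order["customer_id"]]["city"]
```

Performance optimization may then deliberately re-denormalize selected tables, as the kernel description allows.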

Step 7 - Algorithm Design Kernel


This kernel represents the detailed design of the software logic and is included for all software
development projects that implement control structures and/or algorithms. It includes the
generation of the program design language (PDL) or other representation of the design, such as
graphical representation methods. If the chosen PDL uses the same syntax as the implementation
language, the Software Manager must ensure that premature coding is not used to describe the
design. No activities are conducted here which are directly related to test.

Step 8 - Support Software Development Kernel


This kernel represents the detailed design and coding of all support software, including
prototyping, modeling, and test-tool development. Test-tool development consists of all models,
simulation, stimulation, and/or emulation software required to fully test and qualify the deliverable
software. Depending on the required formality for test-tool development, this kernel may use any
or all of other defined kernels.

Step 9 - Coding Kernel


This kernel is the creation of source code for the software units that implement the software
design. Coding is done uniformly across the software products using a defined standard or
guideline. A software guidelines and standards manual should be used for products implemented
in Ada, C/C++, Java, etc. The responsibilities of Software Engineering are to establish and design
unit test cases and to develop unit test drivers and stubs. The responsibilities of Test Engineering
are to design tests, develop test cases, and identify the test data to use in the cases.
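A minimal sketch of the driver-and-stub idea mentioned above (the unit, the stub, and the driver names are illustrative, not from the process description):

```python
# Unit under test: formats a status line using a clock dependency.
def status_line(clock):
    return f"OK at {clock()}"

# Stub: stands in for the real system clock so the test is deterministic.
def stub_clock():
    return "12:00"

# Driver: invokes the unit with the stub and checks the result.
def run_unit_test():
    result = status_line(stub_clock)
    assert result == "OK at 12:00"
    return "pass"
```

The driver and stub isolate the unit from its environment, which is what makes unit-level verification repeatable.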

Step 10 - Software Unit Testing Kernel


This kernel involves execution of the unit test cases defined as part of the verification /
validation test design. Unit testing is conducted for all developed software units. The number of
tests required is driven by the complexity of the code. Methods such as McCabe's complexity
metric should be used to uniformly determine the complexity and corresponding number of paths
through the software. This testing may be accomplished in the host or target environment. Higher
level testing, such as software component or software configuration item testing, is not used to
fulfill unit testing. An entire set of course units is dedicated to unit testing. Unit testing is not normally a
verification and validation activity, but is an important testing activity.
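One simplified way to estimate McCabe's metric is to count decision points and add one; the sketch below applies this rule to Python source using the standard `ast` module (the counting rule is a simplification of the full metric, shown here only to illustrate how path counts drive the number of unit tests):

```python
import ast

# Simplified cyclomatic complexity: one path plus one per decision point.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.BoolOp)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
    return decisions + 1

sample = """
def classify(x):
    if x < 0:
        return "negative"
    if x == 0:
        return "zero"
    return "positive"
"""
# Two if-statements -> complexity 3, so at least three paths to test.
```

A unit with higher complexity would be assigned correspondingly more unit test cases.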

The responsibilities of Software Engineering are to:


· Execute test cases & log results
· Resolve defects
· Design & generate integration test plan(s) & test cases
· Outputs: test logs/reports, known defect log, & integration test documentation

The responsibilities of Test Engineering are to:


· Develop test cases
· Develop test tools
· Outputs: test cases
Step 11 - Software Component (Integration) Testing Kernel
The Software Component (Integration) Testing Kernel involves the execution of the software
component test cases defined as part of the verification test design. The goal of this test is to verify
the performance of the component and its internal (unit to unit) interfaces. This testing may be
accomplished in the host or target environment. This kernel may be excluded for projects where
the software size or complexity does not warrant additional verification testing, and where the
software configuration item test kernel is used in lieu of this kernel. An entire set of course units is
dedicated to software integration testing, so only a brief review is presented here.

The responsibilities of Software Engineering are to:


· Execute test cases & log results
· Resolve defects
· Document test logs and reports
· Document known defects in a report

The responsibilities of Test Engineering are to:


· Finalize test cases
· Document test cases

Step 12 - Software Configuration Item Testing Kernel


The Software Configuration Item Testing Kernel involves the execution of the software
configuration item prior to full integration testing with other software and hardware configuration
items. The Software Engineer performs testing to verify that the software configuration item works
as intended in the target environment. For projects requiring stand-alone validation of the software,
this test may be the dry run of the validation procedure. Hardware/software integration may occur
at any level required to support testing, as specified in the Test Plan. This time period should
also be used by Test Engineering to conduct dry run activities, which are discussed in detail in
later lessons.
The responsibilities of Test Engineering are to:
· Execute dry run of function & system tests
· Log defects
· Document corrected test cases
· Document defect report

The responsibilities of System & Software Engineering are to:


· Assist in resolving defects from dry run
· Conduct integration test of configuration items

Step 13 - Software Configuration Management Load Build


The Engineering Load Build Kernel involves the creation of the executable load builds from
the configuration management engineering library to support software component integration and
software configuration item validation testing. The environment used to create these loads and the
procedures to be followed are under configuration control.
These procedures and scripts, used to start the system, are referred to as Cold Start
procedures. This kernel may be excluded for software
development projects where executable loads are built from software configuration management
controlled libraries.

The responsibilities of Software Engineering are to:


· Integration of the executable load builds
· Implementation of the executable load builds
· Scripting of the executable load builds

The responsibilities of Software Configuration Management are to:


· Identification of the executable load builds from the engineering Library
· Control of the executable load builds from the engineering Library

Step 14 - Validation Test Kernel


For software-intensive systems or for projects where the validation of software as a standalone
configuration item is required, the software validation kernel is used. In this case, the cold start of
the software source code will be done as part of the software development activity. These phases
of testing are dedicated to later lessons, so only a brief review is presented here.

The responsibility of Test Engineering is to:


· Execute function & regression test
· Execute system & regression test
· Execute installation & regression test
· Execute acceptance test
· Log defects
· Document corrected test cases
· Document defect reports
· Document test reports

The responsibility of System & Software Engineering is to assist in resolving defects

Step 15 - Engineering Change Control Kernel


Change control can occur anywhere within the system development activity. During software
development, the level of change control authority depends on the level of maturity of the product.
During early development stages, the user, working through the user library, has control of the
software products. As products mature to the verification test and higher integration levels, control
transitions to the engineering library and engineering management control. The Software
Configuration Management (SCM) release point is where software products become baselined (the
point at which products transition to the SCM library and formal change control). This point can
be changed to meet the needs of individual projects. Products are baselined at some point prior to
validation testing. This kernel is depicted in a single phase, but in reality the activities associated
with this kernel are conducted throughout the life cycle of the project.
The responsibility of Configuration Management is to:
· Baseline test documents
· Provide controlled builds to Test Engineering
· Outputs: software CM load builds
The responsibility of System, Software, & Test Engineering is to:
· Analyze requirements in response to change
· Outputs: testable requirements, disclaimer list, & updated documentation & code

Step 16 - Formal Software Configuration Control Board


The Formal Software Configuration Control Board Kernel involves change control. Change
control can occur anywhere within the software development process. During software
development, the level of change control authority depends on the level of maturity of the product.
During early development stages, the user, working through the user library, has control of the
software products. As products mature to the verification test and higher integration levels, control
transitions to the engineering library and engineering management control.
The software configuration management release point is where software products become
baselined. The baseline is the point at which products transition to the software configuration
management library and formal change control. The baseline can be changed to meet the needs of
individual projects.
The products must be baselined at some point prior to validation testing. The software
configuration management process is another course in itself. This completes the lecture on the
Example Software Development Process and the contents of the kernels.
The student should have an understanding of the kernel concept, the criteria of each kernel and
how they can be applied to various software life cycle development processes.
Chapter 7: Data Analysis and Interpretation
Agile data analysis

The purpose of testing is to identify problems and defects in a product. While some tests are
pass/fail, many require significant analysis of measurement data to learn something about the
system under test. Agile software development allows for a changing and dynamic feature set to
accomplish rapid evaluation of features. Current methods of data analysis rely primarily on static
scripts written either in compiled or interpreted programming languages (Perl, Python, and so on).
Although the use of dynamic languages can greatly facilitate the analysis process due to their rapid
development cycles, the individuals working most closely with the tests may not have access to the
source code of the analysis software or the skill set to make the necessary changes.

Static analysis in an agile world

Test teams run tests on software during development to identify problems. In a traditional
waterfall process the functional requirements are known in advance and are implemented on a
schedule. Agile development allows for an application's functions to change over time to meet
changing customer requirements.
To meet these new dynamic test requirements there has been an explosion of new test
methodologies. Test and behavior-driven development methodologies have been developed to
support these short development cycles, with new dynamic and declarative environment
configuration tools such as Puppet and Chef being used to quickly deploy and configure
deployment environments.
But what about the investigation of non-functional requirements? Functional tests are pass/fail in
nature: either the specific function is implemented correctly or it is not. Many requirements are
non-functional in nature, and success or failure is not as simple as passing a specific functional
requirement.
Load and stress tests, performance testing, and capacity determination are examples of tests
that are not binary but require active investigation and analysis to determine whether an application
meets the non-functional requirement.
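For instance, a performance test might be judged against a hypothetical requirement that 90% of requests complete within 800 ms; the nearest-rank percentile computation below is one simple way to make that determination (all figures are invented for illustration):

```python
# Hypothetical requirement: 90% of requests complete within 800 ms.
response_times_ms = [120, 340, 210, 780, 1500, 90, 400, 650, 700, 95]

def percentile(values, pct):
    # Nearest-rank percentile: smallest value covering pct percent of samples.
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

p90 = percentile(response_times_ms, 90)
meets_requirement = p90 <= 800  # True here: the 90th percentile is 780 ms
```

The verdict is not a single pass/fail observation but the result of analyzing the whole measurement series.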
Agile data formats

Examples of composability include UNIX utilities, which pass text files between tools.
Microsoft's PowerShell tool is also composable but passes instances of PowerShell objects that are
known to the operating system. But what type of data format should be passed between
components in an agile data analysis system? Data obtained from a running system can originate
from many different places and in many formats.

Some examples are:


• Unstructured log files
• Structured log files
• System environment data:
  • Percentage of CPU that is busy
  • Memory consumed
  • Disk I/O
• Measured data:
  • Spreadsheets
  • HTML-based reports

These examples are sources of data that need to be parsed and interpreted to do any sort of data
analysis. The challenge is to reduce all of this data, which is in a variety of data formats, into a
form where comparisons between the various types of data can be performed.
As an example, suppose a load test is being performed on a server product using a tool that
simulates multiple browser users. Several open source and commercial tools can perform such a
load measurement. The tool gives you the response time for each request and you want to know
when the server response time exceeds a specific limit. The response time can grow for many
different reasons: the number of concurrent users, excessive memory consumption that can cause
garbage collection delays, network saturation, and so on.
To determine the source of the response time delay, examine data in all of the forms outlined
above: from the operating system, from the server under test, and from the load tool.
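One sketch of how such heterogeneous sources might be reduced to a single comparable form, keyed by time, so the delay can be correlated with a possible cause (the sources, field names, and numbers are invented for illustration):

```python
# Illustrative: reduce two differently shaped sources to one keyed table.
load_tool = {  # second -> response time in ms, from the load tool's report
    1: 210, 2: 230, 3: 950, 4: 240,
}
os_stats = {   # second -> resident memory in MB, from OS sampling
    1: 512, 2: 520, 3: 1900, 4: 530,
}

def merge_by_second(*sources):
    merged = {}
    for name, series in sources:
        for second, value in series.items():
            merged.setdefault(second, {})[name] = value
    return merged

table = merge_by_second(("resp_ms", load_tool), ("mem_mb", os_stats))
slow = [s for s, row in table.items() if row["resp_ms"] > 800]
# The merged table shows the slow second coincides with the memory spike.
```

Once every source is in the same keyed form, comparisons across tools become simple lookups rather than bespoke parsing.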
Project Management Tool

One of the Agile software development tools, SCRUMY, was used during the project. SCRUMY is
an online project management solution based on SCRUM (see Figure 1.1). It takes the concept of a
"post-it" note for creating a task, and the status of a task can be changed by dragging it.
Furthermore, this master thesis was stored both on a local computer and in an SVN repository.
Because the working environment can vary and the thesis document would be modified constantly,
SVN was the best solution for tracing changes to the documents.

SCRUMY
Why Agile Software Development Is Important: What the Surveys Say

Differing from the traditional process model, which measures success by conformity to predictive
plans, Agile Software Development emphasizes responsiveness to change. For example, the
delivery of working software is the most important factor in making software development
successful, since in the Agile view, metrics such as cost variance, schedule variance, requirements
variance, and task variance are virtually meaningless (Ambler, 2008).
According to the results of the DDJ 2007 Agile Adoption Survey, the Agile development
framework has become mainstream for software development. The survey indicates that 69% of
respondents said their organizations were doing one or more Agile projects, and 85% of those were
doing more than two.

2007 Agile Adoption Survey Result: Rate of Successful Agile projects. (Source: Ambler, 2008)
Additionally, the 3rd Annual Survey 2008, The State of Agile Development, shows that users of
the Agile development framework considered "Accelerated time-to-market" and "Enhanced ability
to manage changing priorities" the top two reasons for adopting the framework. Moreover, both the
DDJ 2007 Agile Adoption Survey (see Figure) and the 3rd Annual Survey 2008, The State of Agile
Development (see Figure), indicate that more than 50% of respondents reported that 90% to 100%
of their Agile projects were successful.
Furthermore, Figure 2.8 indicates that, compared with traditional approaches, most respondents
considered Agile methods more efficient.

2008 Agile Adoption Survey Result: Comparison of effectiveness. (Source: Ambler, 2008)
Summary of requirements for agile data analysis

In summary, to perform flexible data analysis for an agile software project, the tooling used
must meet these three main criteria:
• Functionality must be composable from existing functional components
• Data exchange between components must appear to the user as a table
• Composition of components should be done with a minimum of programming required
Existing analysis tooling, such as data mining and business analytics toolkits, meets these
requirements. These toolkits are meant to provide out-of-the-box data mining and analysis
capabilities to non-programmers.
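A toy sketch of what tooling meeting these three criteria might look like, with each component exchanging a "table" represented as a list of dicts (all names here are invented):

```python
# Each component takes and returns a "table": a list of dicts.
def read_table(rows):
    return list(rows)

def filter_rows(table, predicate):
    return [row for row in table if predicate(row)]

def add_column(table, name, fn):
    return [{**row, name: fn(row)} for row in table]

def compose(table, *steps):
    # Minimal-programming composition: apply each step in order.
    for step in steps:
        table = step(table)
    return table

raw = read_table([{"req": "/a", "ms": 120}, {"req": "/b", "ms": 950}])
result = compose(
    raw,
    lambda t: add_column(t, "slow", lambda r: r["ms"] > 800),
    lambda t: filter_rows(t, lambda r: r["slow"]),
)
```

Because every component speaks the same tabular format, new analyses are built by recombining steps rather than writing new parsers.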
Chapter 8: Observations and Findings
OBSERVATIONS

Software Development Plan

The SDP, whether iterative or not, provides ample information for a new software engineer on
the program to understand the work to be performed, how it is to be performed, who is responsible,
etc. It should make appropriate use of cross-referenced documentation (e.g., SEMP, MSBP, and
the like) without repeating information. Some specific topics are useful to include when agile or
iterative methods are in use.

Master Software Build Plan

The Master Software Build Plan identifies the software development activities, artifacts, and
Independent Verification and Validation (IV&V) threads. It provides some amount of detail with
respect to the software build planning approach and how the various functions were mapped to the
blocks and subsequent iterations.

Systems Engineering Management Plan

The Systems Engineering Management Plan should provide an understanding of how the
progressive reviews feed the programmatic reviews and vice versa.
CHAPTER 9: SUGGESTIONS
SUGGESTIONS

 Change must be adaptive. Agile methods are the natural choice for change projects.
 Systematic improvement cycles ensure results every 2-6 weeks. They enable you to
balance improvement and workload.
 At the beginning of each Sprint, the Sponsor regularly prioritizes the improvement
backlog. The Sponsor is responsible for the ROI.
 An improvement package is implemented in the whole organization in four Sprints.
 Change requires leadership and participation. It begins with an endorsed status quo and
an endorsed vision.
 Participation is a key element of successful change. The change team acts as a
facilitator of the change work.
Chapter 10: Conclusion
CONCLUSION

Agile Software Development has brought many good things to software development. The most
intuitive are improved product quality, improved developer efficiency, and fewer errors. But we
cannot ignore its limitations: especially in distributed development and in large projects, Agile
Software Development still cannot fully show its advantages. In my opinion, Agile is an attitude
which is positive, efficient, and cooperative.

1. An improvement project starts with a Vision.


2. The Work Owner (Sponsor), Scrum Master and the Change Team define the initial
Improvement Backlog, based on an assessment of the organization.
3. The Change Team facilitates the organization, which delivers with each Sprint a tangible
improvement.
4. During a Sprint, management leads the change (Generic Practices).
5. At the end of each Sprint the effect of the changes is evaluated.
6. The Work Owner defines new improvements, adds them to the Improvement Backlog, and
prioritizes the Improvement Backlog.
7. An appraisal at the end supports commitment.

REFERENCES

1. D. N. M. S. V. Tapaskar, "Enacted software development process based on agile and agent
methodologies," International Journal of Engineering Science and Technology, vol. 3, no. 11, 2007.
2. D. D. Jamwal, "Analysis of software development models," IJCST, vol. 1, no. 2, 2010.
3. P. Abrahamsson, O. Salo, J. Ronkainen, and J. Warsta, "Agile software development methods -
review and analysis," VTT Publications, 2002.
4. W. R. Duncan, A Guide to the Project Management Body of Knowledge, 1996.
5. W. Royce, Software Project Management: A Unified Framework, 1998.
6. P. Kruchten, "Introduction to the Rational Unified Process," Proceedings of the 24th
International Conference on Software Engineering, p. 703, 2002.
7. K. Beck, M. Fowler, J. Highsmith, et al., Manifesto for Agile Software Development,
http://agilemanifesto.org/, 2001.
8. S. Ambler, Agile Modeling: Effective Practices for eXtreme Programming and the
Unified Process. Wiley Computer Publishing, 2004.
9. J. Highsmith, Agile Software Development Ecosystems. Addison Wesley, 2002.
10. R. J. Back, P. Hirkman, and L. Milovanov, "Evaluating the XP customer model and design by
contract," in Euromicro Conference, 2004, pp. 318-325.
11. D. Karlström, "Introducing extreme programming - an experience report," Proceedings of the
3rd Conference on Extreme Programming (XP 2002), 2002.
