
CPM-500 Principles of Technical Management
Lesson F: Project Schedule Risk Analysis
Presented by
David T. Hulett, Ph.D.
Hulett & Associates, LLC
Info@projectrisk.com
www.projectrisk.com
IPMC 2002 Fall Conference
Professional Education Program
2002 Hulett & Associates

Defining Risk

Project risk is an uncertain event or condition that, if it
occurs, has a positive or negative effect on a project
objective.*

*Guide to the Project Management Body of Knowledge
(PMBOK Guide), 2000 Project Management Institute

2002 Hulett & Associates

Opportunities and Threats


[Figure: probability distribution of possible outcomes around the Most Likely Estimate; outcomes toward the Optimistic Estimate are opportunities, outcomes in the other tail are threats]
2002 Hulett & Associates

Why Manage Risk?

Ignoring the risk does not make it go away
Our objective is to turn vague uncertainty into
identified, quantified risk

2002 Hulett & Associates

Iron Triangle of Project Objectives

Main three project objectives (a.k.a. the Triple Constraint
of project management)
Technical
Cost
Schedule
Also: environmental compliance, safety, being a good
corporate citizen
2002 Hulett & Associates

Which Objective Defines the Project?


The objectives are interdependent
When one is unmovable, the others need to be flexible
Any pressure on one will be transmitted to the others

[Figure: Cost-Time-Technical triangle]
2002 Hulett & Associates

Performance Objective Traditionally Comes First
Projects are usually defined by technical, performance,
quality or reliability specifications
The obvious question for a project management
professional:
How much will such a project cost?
How long will it take?

[Figure: Cost-Time-Technical triangle]

2002 Hulett & Associates

But, Schedule May be the Main Goal

If the time factor is crucial:
The project may be de-scoped => problem with the
customer
Resources may be added => costs more to finish on
time

[Figure: Cost-Time-Technical triangle]

2002 Hulett & Associates

Or, Cost May Be the Limiting Factor

Limits on cost will impact both schedule and
performance
Performance may be relaxed or reduced to limit cost
May use less-capable labor, less overtime => takes
longer

[Figure: Cost-Time-Technical triangle]

2002 Hulett & Associates

The Objectives are Interrelated


Pressures from one objective will impinge on others
Pressures also impact on quality and business goals

[Figure: Cost-Time-Technical triangle]
2002 Hulett & Associates

10

Establish Priority of
Project Objectives: Example
[Matrix: each objective (Cost, Technical Performance, Schedule) is assigned one priority: Must Have, Nice to Have, or Accept Result]
2002 Hulett & Associates

11

Measuring Objectives
Technical vs. Cost and Time
Cost is denominated in dollars, time in days
Technical objectives: no common unit of measurement
Weight
Speed, range, capacity, climbing rate
Software: function points, lines of code
Reliability: mean time between failures
Quality measures, costs
2002 Hulett & Associates

12

Using Technical Objectives Measurements

Technical objectives are not comparable
A dollar is a dollar, a day is a day
In Reliability Analysis (Fault Tree)
Each failure mode has a probability of failure -- there is
one measure of success
Otherwise: how do you trade off climbing rate with
carrying capacity or reliability?
2002 Hulett & Associates

13

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge
(PMBOK Guide), 2000 Project Management Institute
2002 Hulett & Associates

14

Risk Management Plan


Plan your approach to risk management on THIS
PROJECT
Who will manage the risk management process?
Determine approach: quantitative, qualitative, narrative
How frequently will the risk analysis cycle be done?
Budget for the risk management activities

Reference: Project Risk Analysis and Management (PRAM) Guide,


Association for Project Management, 1997
2002 Hulett & Associates

15

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge
(PMBOK Guide), 2000 Project Management Institute
2002 Hulett & Associates

16

Risk Breakdown Structure

2002 Hulett & Associates

17

Technical Risk Examples


Major state-of-the-art advance may be needed
Requirements are not stable or are complex and may
be difficult to meet
Operating environment is difficult, and may pose
problems
Logistical challenges in getting equipment to site -- may
be more difficult than anticipated

2002 Hulett & Associates

18

Technical Risk (continued)


Integration requires interfaces between project sections
that may be complex
Reliability / maintainability standards may be
challenging
The design is incomplete or our approach may not work
Concurrency (overlapping of phases) may result in
confusion, missteps and rework

2002 Hulett & Associates

19

Technical Risk (continued)


Test and Evaluation program may not be capable of
assessing performance
Modeling and Simulation may not be adequate to
support the program through all phases
Production capabilities may not be adequate for the
demanding configuration

2002 Hulett & Associates

20

Technical Risk (continued)


The developer may not be capable of designing and
manufacturing the system
Budget resources may be insufficient
Customer and Contractor management teams may not
be sufficient or adequate
Time available may be insufficient

Source: Risk Management Guide for DoD Acquisition, Defense Acquisition


University, January 2000
2002 Hulett & Associates

21

SEI Categories for Software Development Risk

Examine risks in several areas

Technical aspects of engineering software products


Environment within which the development takes place
Constraints to successful software development

Software Engineering Institute at Carnegie Mellon


University

www.sei.cmu.edu
2002 Hulett & Associates

22

Technical Risk - Top Ten Checklist of Software Risks - Barry Boehm

Barry Boehm has written several articles that are
included in his edited volume Software Risk
Management, IEEE Computer Society Press, 1989
(out of print).
This list is from his presentation to the Southern
California Risk Management Symposium, Sept.
2002

2002 Hulett & Associates

23

The Top Ten Software Risk Items - Barry Boehm

Risk Item: Risk Management Techniques

1. Personnel shortfalls: staffing with top talent; key personnel
agreements; incentives; team-building; training; tailoring
process to skill mix; peer reviews
2. Unrealistic schedules and budgets: business case analysis;
design to cost; incremental development; software reuse;
requirements descoping; adding more budget and schedule
3. COTS; external components: qualification testing; benchmarking;
prototyping; reference checking; compatibility analysis; vendor
analysis; evolution support analysis
4. Requirements mismatch; gold plating: stakeholder win-win
negotiation; business case analysis; mission analysis;
ops-concept formulation; user surveys; prototyping; early
users' manual; design/develop to cost
5. User interface mismatch: prototyping; scenarios; user
characterization (functionality, style, workload)

2002 Hulett & Associates

24

The Top Ten Software Risk Items - Barry Boehm (continued)

Risk Item: Risk Management Techniques

6. Architecture, performance, quality: architecture tradeoff
analysis and review boards; simulation; benchmarking; modeling;
prototyping; instrumentation; tuning
7. Requirements changes: high change threshold; information
hiding; incremental development (defer changes to later
increments)
8. Legacy software: design recovery; phaseout options analysis;
wrappers/mediators; restructuring
9. Externally-performed tasks: reference checking; pre-award
audits; award-fee contracts; competitive design or prototyping;
team-building
10. Straining computer science capabilities: technical analysis;
cost-benefit analysis; prototyping; reference checking

2002 Hulett & Associates

25

Common Colds of the Software World: Capers Jones

Creeping user requirements


Excessive schedule pressure
Poor quality
Inaccurate estimation of costs
Inaccurate metrics and measurement
Management malpractice
Silver bullet syndrome
Source: Capers Jones, Assessment and Control of Software Risks,
Prentice Hall, Yourdon Press, 1994
2002 Hulett & Associates

26

Checklist Example

Project Risk Checklist
Project Name / Project Manager

Risk Type       Risk Area        Description of Uncertainty
Technology      Design           Complexity; State-of-the-art; Integration
Organizational  Objectives       Unclear; Changing
                Resources        Compete w/ other proj.; Inexperienced; Unrealistic
Customer        Expectations     Vague, changing
                Interface        Not timely
                Funding          Intermittent
Regulatory      Permit required  Uncertain requirements; Uncertain timing

(Remaining columns: Evaluation of Risk (present, importance); Resolution (action, responsible))

2002 Hulett & Associates

27

Risk Identification Tools: Brainstorming


Have the right people in the room
Expose the people to the project
Have them prepare for the session
Get them off-site to concentrate
Use the checklist developed on other projects

Lessons Learned files

Synergy among the participants


2002 Hulett & Associates

28

Sources of Data on Technical Risk

Comparison with similar systems


Relevant lessons learned
Experience
Results from tests and prototype development
Data from engineering and other models
Specialist and expert judgment
Analysis of plans and related documents
Modeling and simulation
Source: Risk Management Guide for DoD Acquisition, Defense Acquisition
University, January 2000
2002 Hulett & Associates

29

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge
(PMBOK Guide), 2000 Project Management Institute
2002 Hulett & Associates

30

Willoughby Templates
In the early 1980s, W. J. Willoughby, Jr. was chairman of the
Defense Science Board
DSB developed a way to look at risk in Defense
Acquisition programs using a simple template approach
What is the problem?
How can it be addressed?
When in the project life cycle should risk mitigation take
place?
Represents Lessons Learned


2002 Hulett & Associates

31

Defense Science Board's Findings and Recommendations

Most new weapon systems are less than satisfactory
and require burdensome maintenance and logistics ...
requires additional equipment in order to meet the
needs ... programs cannot succeed for technical reasons
A poorly designed product cannot be tested efficiently,
produced or deployed.
Manufacturing problems will overwhelm production
schedules and costs
2002 Hulett & Associates

32

Defense Science Board's Findings and Recommendations (continued)

Corrective measures by the DoD have focused on
establishing a series of management checkpoints and
review activities
... this approach has been responsible for adding
numerous layers of management and has tended to
compartmentalize, matrixize and polarize the major
areas of the acquisition process: design, test and
production ... they do not describe the industrial process

2002 Hulett & Associates

33

Defense Science Board's Findings and Recommendations (continued)

The probable cause (of the problems facing acquisition
programs) is inadequate engineering and manufacturing
disciplines combined with improperly defined and
implemented logistics programs.
Identify and establish critical engineering processes and
their control methods

Source: Transition from Development to Production, Solving
the Risk Equation, DoD 4245.7-M, January 1984
2002 Hulett & Associates

34

Overriding Attributes of the Defense Science Board's Recommendations

Assurance of design maturity
Assessment of contractor's design policy
Measurement of test stability
Near absence of failures in development testing of a
stable design
Certification of manufacturing processes
Design for production and proof of process


2002 Hulett & Associates

35

Willoughby Templates - Top Level

[Figure: top level of the Willoughby Templates]

2002 Hulett & Associates

36

Willoughby Templates - Detail

2002 Hulett & Associates

37

Willoughby Templates - Detail (continued)

2002 Hulett & Associates

38

Willoughby Templates - Detail (continued)

2002 Hulett & Associates

39

Example Template: Design Requirements

Area of Risk

Accurate and complete specification of the design

reference mission profile is required


Sometimes the profile does not correspond to the
ultimate service use
Often the profile is left to the contractor's discretion

2002 Hulett & Associates

40

Example Template:
Design Requirements (continued)

Outline for Reducing Risk

Functional mission profile shows all functions on a time
scale; prepared by the government customer
Environmental mission profile shows the surroundings
affecting the system
Contractor prepares profiles based on the government's;
these become the design requirements

Timeline
During concept phase (JMSNS phase)


2002 Hulett & Associates

41

Qualitative Ranking of Risks


Group the risks into categories for appropriate action
This may be enough to manage risk effectively

Risk Ranking    Stop Light Condition   Risk Management Action
High Risk       RED                    Resolve or mitigate in baseline plan
Moderate Risk   YELLOW                 Resolve or develop a contingency plan
Low Risk        GREEN                  Leave resolution to project team

2002 Hulett & Associates

42

Maxwell Risk Driver Assessment Matrix

Maxwell Risk Driver Assessment Framework
(risk levels run Very Low / Low / Medium / High / Very High)

1. Technical Advancement: Nothing New / Minor Modifications Only /
Major Modifications Required / State of the Art / Beyond State of
the Art
2. Technology Status: Currently in Use / Prototype Exists / Under
Development / In Design / Concept Stage
3. Complexity: Simple / Somewhat Complex / Moderately Complex /
Highly Complex / Highly Complex with Uncertainties
4. Interaction / Dependencies: Independent of Other Risk Drivers /
Dependent on One Additional Risk Driver / Dependent on Two
Additional Risk Drivers / Dependent on Three Additional Risk
Drivers / Dependent on More than Three Additional Risk Drivers
5. Manufacturing Process Controls: Statistical Process Controls
(SPC) / Documented Controls (No SPC) / Limited Controls /
Inadequate Controls / No Known Controls
6. Manufacturing Precision: Adequate / Limited Margins / Known but
Inadequate / Unknown / ...

2002 Hulett & Associates

43

Maxwell Risk Driver Assessment Matrix (continued)

Maxwell Risk Driver Assessment Framework
(risk levels run Very Low / Low / Medium / High / Very High)

7. Reliability: Historically High / Average / Known Limited
Problems / Serious Problems of Unknown Scope / ...
8. Producibility: Established / Demonstrated / Feasible / Known
Difficulties / Infeasible
9. Criticality to Mission: Nonessential / Minimum Impact / Known
Alternatives Available / Alternatives Exist / Possible "Show
Stopper"
10. Cost: Established / Known History or Close Analogies /
Predicted by Calibrated Model / Inadequate Analysis / Unknown or
Unsupported Estimate
11. Schedule: Demonstrated Historical Similarity / Validated
Analysis / ... / Out of Range of Experience / Unknown or
Unsupported Estimate

2002 Hulett & Associates

44

Probability and Impact Define Risk

Risk is defined by two dimensions of a possible event

Probability that the event will occur


Impact on the project objectives (cost, time,
performance) if it does occur

Discussions about risk rely on these two dimensions


The two attributes of probability and impact must be
considered separately

Impact is independent of how likely the event is
2002 Hulett & Associates

45

Qualitative Assessment of
Probability from Technology Maturity
Technology Maturity                                                Probability
Scientific research ongoing                                        Very High
Concept design formulated for performance and qualifications      High
Concept design tested for performance and qualification
concerns at bench scale                                            Moderate
Critical functions / characteristics demonstrated at pilot scale   Moderate
Full-scale prototype hardware passed qualification tests
with ACWA feedstocks                                               Low
More than one full-scale facility operational and deployed         Very Low

2002 Hulett & Associates

46

Probability of Risk from Process Complexity
Process Complexity Risk                                            Probability
Five or more processes/technologies including complicated
interfaces or complex operations                                   Very High
Two to four processes/technologies with complicated interfaces
and complex operations                                             High
Two processes/technologies with either complicated interfaces or
complex operations                                                 Moderate
Two processes/technologies in a single processing train, and
standard interfaces with routine operations                        Low
One process/technology and standard interfaces with routine
operations                                                         Very Low

Complicated interfaces: multiphase streams, extreme conditions beyond industrial norms
Complex operations: multiple interactive and interrelated activities beyond industrial norms

2002 Hulett & Associates

47

Probability of Risk from Process Difficulty
Process Difficulty Risk                                            Probability
No comparable process is operating on an industrial
scale, and more than one of C, T, TP, or PC is expected
to exceed the state of the art                                     Very High
No comparable process is operating on an industrial
scale, and at least one of the requirements for C, T, TP,
or PC is expected to exceed the state of the art                   High
Integrated process is a combination of standard industrial
processes, and C, T, TP, or PC exceed the norm for these
processes                                                          Moderate
Integrated process is a combination of standard
industrial processes, and C, T, PC and TP are within the
norm for these processes                                           Low
Standard industrial process meets C, T, TP, and PC
requirements                                                       Very Low

C = Conversion efficiency, T = Tolerance or precision
TP = Throughput, PC = Process controls

2002 Hulett & Associates

48

Qualitative Assessment of
Impact on Performance
Impact of Risk on Performance Objective                            Impact
System requirement not achieved, safety and
environmental objectives jeopardized                               Very High
System requirement not achieved, safety and
environmental objectives satisfied                                 High
Degradation of system performance eliminates all margins           Moderate
Degradation of subsystem performance, decrease in
system performance (still above requirement)                       Moderate
Potential degradation of subsystem performance, but
system level not affected                                          Low
No effect on subsystem or system performance (includes
producibility and support)                                         Very Low

2002 Hulett & Associates
49

Qualitative Assessment of P & I - Emphasis on Impact
Qualitative Risk Analysis: Probability-Impact Approach to Project Risk Analysis

Probability |           Impact on Project Objective
            | Very Low   Low    Moderate   High   Very High
Very high   | Mod        Mod    High       High   High
High        | Low        Mod    Mod        High   High
Moderate    | Low        Mod    Mod        High   High
Low         | Low        Low    Mod        Mod    High
Very low    | Low        Low    Low        Mod    Mod

2002 Hulett & Associates

50
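The matrix above is a simple lookup. As a minimal Python sketch (the names and data structure are illustrative, not from the deck), it might be encoded like this:

```python
# Probability-impact matrix from the slide above, encoded as a lookup table.
LEVELS = ["Very Low", "Low", "Moderate", "High", "Very High"]

PI_MATRIX = {  # rows: probability level; columns follow LEVELS (impact)
    "Very High": ["Mod", "Mod", "High", "High", "High"],
    "High":      ["Low", "Mod", "Mod",  "High", "High"],
    "Moderate":  ["Low", "Mod", "Mod",  "High", "High"],
    "Low":       ["Low", "Low", "Mod",  "Mod",  "High"],
    "Very Low":  ["Low", "Low", "Low",  "Mod",  "Mod"],
}

def rate_risk(probability: str, impact: str) -> str:
    """Look up the qualitative rating for a probability/impact pair."""
    return PI_MATRIX[probability][LEVELS.index(impact)]

print(rate_risk("Moderate", "Very High"))  # -> High
```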

Probability and Impact by the Numbers

Assessing probability and impact verbally is too vague;
it is easy to make mistakes
On purpose
By design
Because of inattention
Defining the levels by objective criteria helps
Applying numbers to risk probability and impact seems
to improve concentration and increase discipline

2002 Hulett & Associates

51

Probability by the Numbers: Technology Maturity (natural)

Technology Maturity                                                Probability
Scientific research ongoing                                        0.9
Concept design formulated for performance and qualifications      0.8
Concept design tested for performance and qualification
concerns at bench scale                                            0.6
Critical functions / characteristics demonstrated at pilot scale   0.4
Full-scale prototype hardware passed qualification tests
with ACWA feedstocks                                               0.3
More than one full-scale facility operational and deployed         0.1

2002 Hulett & Associates

52
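A risk register tool could apply this scale mechanically. A minimal sketch (descriptions abbreviated from the slide; the structure is illustrative):

```python
# The slide's "natural" maturity-to-probability scale as a lookup table.
MATURITY_PROBABILITY = {
    "scientific research ongoing":                    0.9,
    "concept design formulated":                      0.8,
    "concept design tested at bench scale":           0.6,
    "critical functions demonstrated at pilot scale": 0.4,
    "full-scale prototype passed qualification":      0.3,
    "full-scale facility operational and deployed":   0.1,
}

print(MATURITY_PROBABILITY["concept design tested at bench scale"])  # -> 0.6
```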

Impact by the Numbers: Performance Objective (more problematic)

Many people feel comfortable assigning numbers to the
impacts
Numbers imply cardinality, meaning that the relation
between the numbers means something
An impact rated .6 is 3 times as bad as one rated .2
DoD recommends avoiding cardinal numbers in favor of
ordinal rankings (each rank is simply worse than the one below it)
E.g., impacts rated A / B / C / D / E


2002 Hulett & Associates

53

Compare Impact Neutral and Impact Aversion

Example of Different Impact Scales Reflecting Organizational Preferences

Risk Impact Level   Linear, Impact Neutral   Non-Linear, Impact Averse
Very High           0.9                      3.2
High                0.7                      1.6
Moderate            0.5                      0.8
Low                 0.3                      0.4
Very Low            0.1                      0.2

2002 Hulett & Associates

54

Impact on Performance by the Numbers

Impact of Risk on Performance Objective                            Impact
System requirement not achieved, safety and
environmental objectives jeopardized                               3.2
System requirement not achieved, safety and
environmental objectives satisfied                                 1.6
Degradation of system performance eliminates all margins           0.8
Degradation of subsystem performance, decrease in
system performance (still above requirement)                       0.4
Potential degradation of subsystem performance, but
system level not affected                                          0.2
No effect on subsystem or system performance (includes
producibility and support)                                         0.1

2002 Hulett & Associates
55

Computing the Technical Risk Score

The Risk Score can be computed by multiplying


probability times impact

Prob. of .4 and impact of .7 yields a score of .28

The organization can determine the cutoff for each level


of Risk Score

Risk Ranking    Stop Light Condition   Cut-Off for Risk Score
High Risk       RED                    .30 < X
Moderate Risk   YELLOW                 .15 < X < .30
Low Risk        GREEN                  X < .15

2002 Hulett & Associates

56
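In code, the score and its cutoff mapping amount to a multiplication and two comparisons. A minimal sketch (function names are illustrative; the cutoffs come from the slide):

```python
def risk_score(probability: float, impact: float) -> float:
    """Risk Score = probability x impact."""
    return probability * impact

def stoplight(score: float) -> str:
    """Map a Risk Score to the stoplight condition from the slide."""
    if score > 0.30:
        return "RED"     # High Risk: resolve or mitigate in baseline plan
    if score > 0.15:
        return "YELLOW"  # Moderate Risk: resolve or develop contingency plan
    return "GREEN"       # Low Risk: leave resolution to project team

score = risk_score(0.4, 0.7)  # the slide's example
print(round(score, 2), stoplight(score))  # -> 0.28 YELLOW
```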

Computing the Risk Score


Probability and Impact Risk Scores
Risk = P x I

Probability |           Impact (Ratio Scale)
            |  0.2    0.4    0.8    1.6    3.2
0.9         |  0.18   0.36   0.72   1.44   2.88
0.7         |  0.14   0.28   0.56   1.12   2.24
0.5         |  0.10   0.20   0.40   0.80   1.60
0.3         |  0.06   0.12   0.24   0.48   0.96
0.1         |  0.02   0.04   0.08   0.16   0.32


2002 Hulett & Associates

57
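The whole table can be regenerated from the two scales. A short sketch (the formatting is illustrative):

```python
# Reproduce the P x I score table from the slide's probability values
# and ratio-scale impact values.
probs = [0.9, 0.7, 0.5, 0.3, 0.1]
impacts = [0.2, 0.4, 0.8, 1.6, 3.2]

print("P\\I " + "".join(f"{i:>6}" for i in impacts))
for p in probs:
    print(f"{p:>4} " + "".join(f"{p * i:>6.2f}" for i in impacts))
```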

DoD Generally Recommends Against Using Cardinal Values for Impact

"This (use of numbers to calculate risk such as P x I)
may be suitable if both likelihood and consequences
have been quantified using compatible cardinal scales
or calibrated ordinal scales (e.g. using Analytic
Hierarchy Process). In such a case mathematical
manipulation of values may be meaningful."
Source: Risk Management Guide for DoD Acquisition, DAU,
January 2000
2002 Hulett & Associates

58

DoD Generally Recommends Against Using Cardinal Values for Impact (continued)

"In many cases, however, risk scales are actually just
raw (uncalibrated) ordinal scales, reflecting only relative
standing between scale levels and not actual numerical
differences.
Any mathematical operations performed on results
from uncalibrated ordinal scales can provide
information that will at best be misleading, if not
completely meaningless."

2002 Hulett & Associates

59

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge,
2000 Project Management Institute
2002 Hulett & Associates

60

Decision Tree Analysis

Making Decisions with Uncertainty


Decision analysis helps structure the problem
Disciplined approach
Decisions to be made
Events that can happen, likelihood and impact
Costs and benefits of making some decisions, having
some events happen

2002 Hulett & Associates

62

Completed Simple Tree with Decisions, Events, Costs, Rewards and Probabilities

[Decision tree]
Plant Decision:
  Greenfield (cost 150):
    Market Demand High (35%): payoff 500, net 350
    Market Demand Low (65%): payoff 300, net 150
  Retrofit (cost 35):
    Market Demand High (35%): payoff 400, net 365
    Market Demand Low (65%): payoff 200, net 165
  None (cost 0):
    Market Demand High (35%): payoff 300, net 300
    Market Demand Low (65%): payoff 150, net 150

2002 Hulett & Associates

63

Folding Back To Solve the Tree

Value of $235 moves from right to left

[Decision tree, folded back]
Plant Decision (value 235):
  Greenfield (cost 150): EV = 220
    35% High: net 350; 65% Low: net 150
  Retrofit (cost 35): EV = 235  <== chosen
    35% High: net 365; 65% Low: net 165
  None: EV = 202.5
    35% High: net 300; 65% Low: net 150

2002 Hulett & Associates

64
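Folding back is just an expected-value computation at each chance node. A minimal sketch with the slide's numbers (function and variable names are illustrative):

```python
def branch_value(cost: float, outcomes: list[tuple[float, float]]) -> float:
    """Fold back one alternative: sum of probability x payoff, net of cost."""
    return sum(p * payoff for p, payoff in outcomes) - cost

alternatives = {
    "Greenfield": (150, [(0.35, 500), (0.65, 300)]),
    "Retrofit":   (35,  [(0.35, 400), (0.65, 200)]),
    "None":       (0,   [(0.35, 300), (0.65, 150)]),
}

for name, (cost, outcomes) in alternatives.items():
    print(name, round(branch_value(cost, outcomes), 1))
best = max(alternatives, key=lambda n: branch_value(*alternatives[n]))
print("Decision:", best)  # -> Retrofit (value 235)
```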

Changing Costs may Change Results and Value

[Decision tree with the retrofit cost raised to 65]
Plant Decision (value 220):
  Greenfield (cost 150): EV = 220  <== now chosen
    35% High: net 350; 65% Low: net 150
  Retrofit (cost 65): EV = 205
    35% High: net 335; 65% Low: net 135
  None: EV = 202.5
    35% High: net 300; 65% Low: net 150

2002 Hulett & Associates

65
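Re-running the fold-back sketch above with the retrofit cost raised from 35 to 65 reproduces this result:

```python
alternatives["Retrofit"] = (65, [(0.35, 400), (0.65, 200)])
# Retrofit EV drops to 205, so Greenfield (EV 220) becomes the decision.
```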

System Reliability:
System Failure Analysis

Purposes of a System Failure Analysis


Analyze the possible ways a facility, product or system
might fail
Understand how to build a more fault-tolerant facility
Discover what happened if it failed (e.g. plane crash)
Determine a design that makes efficient use of funds to keep
the facility running
Evaluate different competing designs from the failure
perspective

2002 Hulett & Associates

67

Simple Failure Analysis Model

What makes the room go dark in the evening?

We have two lamps, desk and floor

Specify the objective

To have a conversation, we need at least one light;
failure means that both lights go out (BOTH ... AND)
To read by, we need both lights; failure means either
light goes out (EITHER ... OR)

2002 Hulett & Associates

68

Quantitative Analysis of the Two-Element Fault Tree

Suppose that the failure rates are as follows:
Floor lamp fails 4% of the time over a month
Desk lamp fails 5% of the time over a month
What is the likelihood that the room will be completely
dark in the evening, over the month?
An AND gate requires:
Both the Floor Lamp and the Desk Lamp must fail
These are redundant systems; failure is unlikely

2002 Hulett & Associates

69

AND Gate -- Only 0.2% Likelihood of a Completely Dark Room

Fault Tree Analysis

                       Likelihood                Joint Likelihood
State of Being         Floor Lamp   Desk Lamp   of Occurrence
Floor On, Desk On      96.0%        95.0%       91.2%
Floor On, Desk Off     96.0%        5.0%        4.8%
Floor Off, Desk On     4.0%         95.0%       3.8%
Floor Off, Desk Off    4.0%         5.0%        0.2%
Total Likelihood                                100.0%

2002 Hulett & Associates

70
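The table is the product rule for independent events. A minimal sketch (variable names illustrative; the rates come from the slide):

```python
p_floor_fails, p_desk_fails = 0.04, 0.05  # independent monthly failure rates

states = {  # joint likelihood of each state: independent events multiply
    "Floor On, Desk On":   (1 - p_floor_fails) * (1 - p_desk_fails),  # 91.2%
    "Floor On, Desk Off":  (1 - p_floor_fails) * p_desk_fails,        #  4.8%
    "Floor Off, Desk On":  p_floor_fails * (1 - p_desk_fails),        #  3.8%
    "Floor Off, Desk Off": p_floor_fails * p_desk_fails,              #  0.2%
}

print(round(states["Floor Off, Desk Off"], 4))  # AND gate: both fail -> 0.002
print(round(sum(states.values()), 4))           # sanity check -> 1.0
```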

Complete Darkness? Simple AND Gate in Software

The likelihood is shown on the solved fault tree

[Figure: solved fault tree with an AND gate; the result node shows the 0.2% likelihood of failure]

2002 Hulett & Associates

71

Suppose Bright Light and Both Lamps are Necessary?

This is an OR gate
The condition of Bright Light fails if:
Either the Floor Lamp or the Desk Lamp fails
This is a more common occurrence
These are not redundant systems any more

2002 Hulett & Associates

72

OR Gate -- 8.8% Likely That At Least One Lamp Fails

Likelihood of failure increases to 8.8%

Fault Tree Analysis

                       Likelihood                Joint Likelihood
State of Being         Floor Lamp   Desk Lamp   of Occurrence
Floor On, Desk On      96.0%        95.0%       91.2%
Floor On, Desk Off     96.0%        5.0%        4.8%
Floor Off, Desk On     4.0%         95.0%       3.8%
Floor Off, Desk Off    4.0%         5.0%        0.2%
Total Likelihood                                100.0%

2002 Hulett & Associates
73
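The OR gate can be computed either as the complement of "both lamps work" or by summing the three failure states. A short sketch:

```python
p_floor_fails, p_desk_fails = 0.04, 0.05

# Complement of the one non-failure state ("both lamps work"):
p_or = 1 - (1 - p_floor_fails) * (1 - p_desk_fails)

# Equivalently, sum the three states in which at least one lamp fails:
p_or_sum = (p_floor_fails * (1 - p_desk_fails)      # only floor fails
            + (1 - p_floor_fails) * p_desk_fails    # only desk fails
            + p_floor_fails * p_desk_fails)         # both fail

print(round(p_or, 3), round(p_or_sum, 3))  # -> 0.088 0.088 (8.8%)
```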

Fault Tree for Room Relatively Dark, with Probabilities -- OR Gate

The likelihood that either/or failure occurs is 8.8%

[Figure: solved fault tree with an OR gate; the result node shows the 8.8% likelihood of failure]

2002 Hulett & Associates

74

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge
2000 Project Management Institute
2002 Hulett & Associates

75

Project Life Cycle, Risk and Risk Management

[Figure: across the total project life cycle (Concept, Development, Implementation, Termination) vs. time, the Opportunity and Risk curve falls while the Amount at Stake curve rises; annotations mark Increasing Risk, the Effect of Risk Management, Increasing Value, and the Plan and Execute phases]
2002 Hulett & Associates

76

Evaluate a
Risk Response (Handling) Option
Can it be feasibly implemented?
What is its expected effectiveness?
Is it affordable?
Is time available to develop the option? What is its
impact on the schedule?
What effect does it have on the system's technical
performance?

Source: Risk Management Guide for Acquisition, DAU 2000


2002 Hulett & Associates

77

Risk Avoidance
Eliminate the risk event
Relax the project objective
Sever the link to the project objective
Works for any risk for which the Triple Constraint can be relaxed

2002 Hulett & Associates

78

Risk Mitigation (Handling)

Design Immaturity Risk

Fall back to a less demanding design


Extend the time of the design phase to get it right
Rapid prototyping of the user interface

Design Uncertainty Risk

Parallel development
This is an expensive risk response strategy
2002 Hulett & Associates

79

Risk Mitigation (Handling)

Process Immaturity Risk

Test components
Test the integrated system at a scale factor
Semi-works tests at a larger scale
May need to test some components at full-scale

2002 Hulett & Associates

80

Some other Handling Options


Trade Studies: arrive at a balance of engineering
requirements
Incremental Development: design for later upgrade
Technology Maturation: plan to replace current
technology with preferred technology on Unit 2
Robust Design: use advanced design and
manufacturing techniques to promote quality; may be
costly

2002 Hulett & Associates

81

Some other Handling Options


Design of Experiments: identify critical design factors and
focus on those
Open Systems: carefully selected commercial
specifications and standards
Use Standard Items / Software reuse
Modeling / simulation of the system
Manufacturing screening to identify deficient
manufacturing processes

Source: Risk Management Guide for Acquisition, DAU 2000


2002 Hulett & Associates

82

Software Risks: Capers Jones

Creeping User Requirements

New and unanticipated user requirements added after the


project is initiated and estimated
User requirements creep at about 1% per month

Mitigation

Prototyping can reduce the severity and volume of this

2002 Hulett & Associates

83

Software Risks: Capers Jones

Poor Quality
Result of technical and cultural issues
Not using effective defect prevention and removal therapies
Corporate culture not committed to quality
Mitigation
Test planning
Requirements analysis up front
Formal inspection
Beta testing with many users
Formal defect prevention
2002 Hulett & Associates

84

Software Risks: Capers Jones

Inaccurate Metrics

Proven as early as 1978 that lines of code cannot safely


be used to size projects, especially when multiple
languages exist.
Impact is poor measurement of productivity, cost and
schedule estimation and project planning

Mitigation

Jones promotes the use of function points as accurate


metrics

2002 Hulett & Associates

85

Software Risks: Capers Jones

Management Malpractice

Managers not trained for their jobs, not rewarded for good project

management skills
Culture lacks awareness of good practices
No training or good curriculum in schools

Mitigation

Understand what is needed


Allocate training resources, acquire courses
Elevate project management status in company
Establish best in class culture of professionalism
2002 Hulett & Associates

86

Checklist of Software Risk Items: Barry Boehm

Developing the wrong software functions


Risk management techniques

Organization analysis
Mission analysis
Ops-concept formulation
User surveys
Prototyping
Early users' manuals
2002 Hulett & Associates

87

Checklist of Software Risk Items: Barry Boehm

Developing the wrong user interface


Risk management techniques

Prototyping
Scenarios
Task analysis
User characterization (functionality, style, workload)

2002 Hulett & Associates

88

Checklist of Software Risk Items: Barry Boehm

Gold plating
Risk management techniques

Requirements scrubbing
Prototyping
Cost-benefit analysis
Design to Cost

2002 Hulett & Associates

89

Risk Transfer

Key for large projects


Contract provisions include conditions for customer or
supplier paying
Type of contract may deflect from customer to contractor or
vice versa

Fixed price contracts may limit the customer's cost exposure
It may increase cost if there are engineering change orders

2002 Hulett & Associates

90

Risk Deflection Between Customer and Contractor / Supplier by Contract

[Figure: Contractor and Customer Risk - relative risk (0-100) across contract types from Cost Plus to Fixed Price; contractor risk rises and customer risk falls as the contract moves toward fixed price]

2002 Hulett & Associates

91

Contingency Planning
Plan B is the contingency if the baseline plan does not
work out
Plan B will be developed, then kept for emergency
Identify the events that might trigger the need for Plan B
Identify the value or characteristics of the trigger events
that would initiate Plan B
Monitor those trigger points in Risk Monitoring and
Control

2002 Hulett & Associates

92

Risk Acceptance:
Set a Dollar Contingency Reserve

The best way is to:
Quantify risks and perform a Monte Carlo simulation
Agree on the organization's risk tolerance and pick that
point on the cumulative distribution
A short cut that may be helpful: use the averages with the
Method of Moments

2002 Hulett & Associates

93
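As a minimal sketch of the Monte Carlo approach named above (not Hulett's tool; the three risk events are borrowed from the slides that follow, and the 80% tolerance point is an assumed example):

```python
import random

risks = [(0.40, 250), (0.30, 350), (0.20, 650)]  # (likelihood, impact)

def one_trial() -> float:
    """Total cost of the risk events that happen to occur in one trial."""
    return sum(impact for p, impact in risks if random.random() < p)

totals = sorted(one_trial() for _ in range(10_000))
print("Mean:", round(sum(totals) / len(totals)))           # ~335 (expected value)
print("Reserve at 80%:", totals[int(0.80 * len(totals))])  # tolerance point
```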

Risk Acceptance: Take the Right Risks

Accepting the Right Risks and Setting Contingency Amounts

                    Likelihood  Impact  Expected Value  Mitigation             Mitigation  Expected Savings
Risk Accepted       (a)         (b)     (c = a x b)     Strategy               Cost (d)    (d - c)
Subcontractor late  40%         250     100             Use Old Subcontractor  250         150

Expected value of the risk is less than the mitigation cost
Take the right risks

2002 Hulett & Associates

94

But Establish a Contingency Reserve

Risk Accepted             Likelihood (a)  Impact (b)  Expected Value (c = a x b)
Subcontractor late        40%             250         100
Part not pass the test    30%             350         105
Design assumption faulty  20%             650         130
Total                                                 335

Set the right contingency reserve
Set it at the expected value if there are enough risks


2002 Hulett & Associates

95
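The table's arithmetic, as a short sketch:

```python
risks = {
    "Subcontractor late":       (0.40, 250),
    "Part not pass the test":   (0.30, 350),
    "Design assumption faulty": (0.20, 650),
}

expected = {name: p * impact for name, (p, impact) in risks.items()}
print({name: round(v) for name, v in expected.items()})       # 100, 105, 130
print("Contingency reserve:", round(sum(expected.values())))  # -> 335
```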

Content of the Risk Register


Date the risk was last revised
Name
Description
Responsibility for monitoring, mitigating, reporting
Risk before further risk mitigation described in the Risk
Register

2002 Hulett & Associates

96

Content of the Risk Register (continued)


Risk mitigation options chosen
Timing of the action -- now, later, contingent
Fallback plans if action fails
Amount of risk expected to remain after risk
management

2002 Hulett & Associates

97

Risk Management Processes

Risk Management Planning


Risk Identification
Qualitative Risk Analysis
Quantitative Risk Analysis
Risk Response Planning
Risk Monitoring and Control
Source: Guide to the Project Management Body of Knowledge,
2000 Project Management Institute
2002 Hulett & Associates

98

Risk Monitoring and Control


Risk Analysis is a continuing requirement
Cycle through identification, assessment, quantification
and response planning
Keep track of the identified risks (watch list)
Identify residual risks
Assure the execution of risk plans, and determine whether
new plans should be developed

2002 Hulett & Associates

99

Risk Monitoring and Control (continued)

Test and Evaluation (T&E): monitor the performance of
selected risk handling options and develop new risk
assessments
Test-Analyze-and-Fix (TAAF): use a period of dedicated testing
to identify and correct deficiencies
Demonstration Events: points in the program where technology
and risk abatement are demonstrated
Program metrics: measure how well the system is achieving its
objectives; monitor corrective action
Source: Risk Management Guide for Acquisition, DAU 2000
2002 Hulett & Associates

100

Earned Value Analysis

Earned Value Management System (EVMS)

Helps forecast the budget and schedule at completion


Augments quantitative risk analysis
Based on data from the project itself, in an early stage

Earned Value has exhibited a reliable ability to predict


results at completion early in the project (e.g. at 20%
complete)
2002 Hulett & Associates

101
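The standard EVMS indices behind such forecasts, as a short sketch (these are textbook formulas, not specific to this deck; the sample values are illustrative):

```python
BAC = 100.0                     # budget at completion
PV, EV, AC = 40.0, 30.0, 45.0   # planned value, earned value, actual cost

CPI = EV / AC    # cost performance index: < 1 means over cost
SPI = EV / PV    # schedule performance index: < 1 means behind schedule
EAC = BAC / CPI  # estimate at completion, if current cost efficiency persists

print(round(CPI, 2), round(SPI, 2), round(EAC, 1))  # -> 0.67 0.75 150.0
```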

Picture of a Troubled Project

[Figure: Earned Value Management System chart - dollars (0-120) vs. project months (1-19), with Plan, Earned Value, and Actual curves; Earned Value runs below Plan (behind schedule) and Actual runs above Earned Value (over cost)]

2002 Hulett & Associates

102

Earned Value Forecast of a Troubled Project

[Figure: dollars (0-160) vs. project months (1-22), with Plan, Earned Value, and Actual curves up to Time Now, and Projected EV and Projected Actual curves extended to show status at planned completion]

2002 Hulett & Associates

103

Technical Performance Measurement

Technical Performance Measurement (TPM)
Identify key technical goals or targets to be met
through the project
Set the targets in the schedule, usually at key milestones
Measure technical achievements
Compare measured achievements to the technical
baseline
Variances between actual achievements and the
baseline give early warning of technical risk to satisfactory
completion

2002 Hulett & Associates

104

Technical Performance Measurement (continued)

Targets should have meaning for technical risks

Technical targets all support the successful completion


of the project
These targets should have predictive value
If they are not met, you are in trouble

Identifying and scheduling the technical performance is


key
2002 Hulett & Associates

105

Technical Performance Measurement (continued)

Variances from the technical baseline early in the


project are hard to make up later

Variances are accompanied by schedule and budget
implications

Some projects go on with only partial completion of


technical requirements at milestones

These projects may be in trouble


Catching up may be difficult and / or costly and / or take
more time

2002 Hulett & Associates

106

Technical Performance Starts Well, Falls Behind

[Figure: Technical Performance Measurement - technical metric (0-100) vs. project months (0-20), with the Technical Plan curve and the Technical Performance To-Date curve; the gap between them is the Technical Shortfall]

2002 Hulett & Associates

107

Technical Performance Metric of a Troubled Project

[Figure: Forecasting Technical Risk with TPM - performance metric vs. project months (1-25), with Technical Plan, Technical Performance To-Date, and Technical To-Go curves; status shown at revised completion relative to Time Now]

2002 Hulett & Associates

108

Project Closeout

Data gathering

Lessons Learned database


Help future projects

Make it possible for future project managers to access


the data easily, learn the lessons clearly

2002 Hulett & Associates

109

Project Schedule Risk Analysis


CPM-500 Principles of Technical Management
Presented by
David T. Hulett, Ph.D.
Hulett & Associates, LLC
Info@projectrisk.com
www.projectrisk.com

2002 Hulett & Associates

110
