Six Sigma Concepts
Dr Jane Marshall
Product Excellence using Six Sigma Module
PEUSS 2012/2013
Sigma level and corresponding defect rate (defects per million, dpm):
2 sigma: 308,537 dpm
3 sigma: 66,807 dpm
4 sigma: 6,210 dpm
5 sigma: 233 dpm
6 sigma: 3.4 dpm
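The dpm figures above follow from the normal distribution combined with the conventional 1.5-sigma long-term mean shift: dpm = 1,000,000 × P(Z > sigma level − 1.5). A minimal sketch of that calculation, assuming the usual one-sided tail convention used in Six Sigma tables:

```python
from math import erfc, sqrt

def dpm(sigma_level, shift=1.5):
    """Defects per million for a given sigma level, assuming the
    conventional 1.5-sigma long-term shift of the process mean."""
    z = sigma_level - shift
    tail = 0.5 * erfc(z / sqrt(2))  # one-sided standard normal tail P(Z > z)
    return tail * 1_000_000

for s in (2, 3, 4, 5, 6):
    print(f"{s} sigma -> {dpm(s):,.1f} dpm")
```

Running this reproduces the table values to rounding accuracy (e.g. about 66,807 dpm at 3 sigma and 3.4 dpm at 6 sigma).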
DMAIC
Define
Measure
Analyse: data analysis, root cause analysis, process analysis
Improve
Control
Product life-cycle stages: design and development, manufacturing, installation, continuous assessment.
Introduction to DFSS
A systematic methodology for designing or redesigning products or services according to customer requirements and expectations.
It optimises the design process to achieve Six Sigma performance.
Get it right first time.
Define Project, Identify Requirements, Select Concept, Develop Design, Implement Design
CTQ flow-up runs through these phases.
Chart: relative cost to implement a change (rising from 100 to 1000 on the vertical axis) across the product stages: research, design, development, production.
Chart: impact and cost of design change over time, across design, produce/build, deliver and service support. While impact > cost the potential is positive; once impact < cost the potential is negative.
DFSS shifts the organisation from quality tested in to predictive design quality.
DFSS Methodology
DMADV: Define, Measure, Analyse, Design and Verify
PIDOV: Plan, Identify, Design, Optimise and Validate
DFSS Process (PIDOV phases mapped to DMADV):
Plan: develop project plans (DMADV: Define)
Identify: VOC and CTQs; prioritise CTQs (DMADV: Measure)
Design: generate, evaluate, select and review concepts (DMADV: Analyse)
Optimise: detailed design and optimise performance (DMADV: Design)
Verify: demonstrate the design satisfies requirements (DMADV: Verify)
Checklists
Summary statements of the tools and best practices required to fulfil gate deliverables.
Scorecards
Summary statements from specific applications of tools and best practices.
DMADV
Define
DMADV - Define
Develop the Charter, Develop Project Plans, Develop Organizational Change Plan, Identify Risks, Review Tollgate Requirements
Elements of a Charter
Problem Statement
Opportunity Statement
Importance
Expectations/Deliverables
Scope
Schedule
Team Resources
Sources of risk
External: e.g. legislation, outwith the control of the project team.
Internal: within the control of the project, e.g. design, human factors and technology.
Handle risk by taking action to avoid it (mitigation) and by building up reserves (contingency).
Spheres of Risk
Risk Plan
Covers how the project team might have to
change the plan
Supports the Risk Management Process
Quantify/Classify
Analyse
Get OWNERSHIP
Provision
Monitor
Manage / Control
Risk Categorisation
Probability of occurrence:
  high: uncertainties remain
  medium: some uncertainties remain
  low: few uncertainties remain
Impact:
  high: major redesign and program delay
  medium: minor redesign and schedule readjustment
  low: requirements met within schedule
Categorizing Risks (rows: probability of occurrence; columns: impact on project)

Probability \ Impact: Low | Medium | High
High:   Yellow light, proceed with caution | Red light, address before proceeding | Red light, do not proceed
Medium: Yellow light, proceed with caution | Yellow light, proceed with caution | Red light, reassess project
Low:    Green light, proceed! | Yellow light, proceed with caution | Red light, address before proceeding
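The 3x3 traffic-light matrix on this slide can be captured as a simple lookup table. A minimal sketch; the function name and string labels here are illustrative, not part of the course material:

```python
# The slide's matrix: (probability of occurrence, impact on project) -> action.
ACTIONS = {
    ("high", "low"):      "Yellow light: proceed with caution",
    ("high", "medium"):   "Red light: address before proceeding",
    ("high", "high"):     "Red light: do not proceed",
    ("medium", "low"):    "Yellow light: proceed with caution",
    ("medium", "medium"): "Yellow light: proceed with caution",
    ("medium", "high"):   "Red light: reassess project",
    ("low", "low"):       "Green light: proceed",
    ("low", "medium"):    "Yellow light: proceed with caution",
    ("low", "high"):      "Red light: address before proceeding",
}

def risk_action(probability, impact):
    """Look up the recommended action for a categorised risk."""
    return ACTIONS[(probability.lower(), impact.lower())]
```

For example, `risk_action("High", "High")` returns the do-not-proceed action, while a low-probability, low-impact risk gets the green light.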
Mitigation involves buying off the problem in advance.
Contingency means being ready to manage crises pro-actively.
Contingency plans and contingent mitigation plans are held in the Risk Plan.
Mitigation activity is entered directly into the project plan.
Responsibility
The Technical Manager or the Lead Engineer of the project acts as, or alternatively nominates, a team leader.
The team leader is responsible for organising Technical Risk Assessment sessions, maintaining the information and selecting a team.
The team consists of at least one representative from: Project Management; Stress/Integrity; Design Engineering; Development Testing; ILS; Quality; Manufacturing; Sourcing; Systems/Requirements.
Example: Risk assessment/management process
Risk identification: identify the risk.
Risk prioritisation: assess the magnitude of the risk.
Risk management: develop plans to manage the risk.
Example
In order to ensure that all possible risks are identified,
the risk identification for a component/sub-assembly is
carried out in three stages:
Risk Identification
Example: brainstorming risks for the clamp plate, a part of the rotor wound main assembly. The function of the clamp plate is to axially clamp the rotor windings.

Prompts for brainstorming risks (columns: type of risk, description of risk, who):
Requirement: what does it have to cope with? (e.g. CF loading, vibration, temperature, oil) - Paul Harris
Manufacture: how do you make this, or where did you buy it (supply chain)? - Steve Robb
Magnitude of risk
After a risk has been identified, the next step is to assess its magnitude. This enables the prioritisation of all the risks identified and ensures that a concerted effort is made to mitigate the high-scoring risks.
The following factors are used to assess the magnitude of, and prioritise, technical risks: pedigree, testing, analysis, severity, probability.
Risk Prioritisation
Factors used for prioritising risks:
Pedigree, Testing and Analysis are based on evidence.
Testing: do we have any evidence from testing that can help us assess the magnitude of the risk?
Analysis: do we have any evidence from theoretical analysis that can help us assess the magnitude of the risk?
Severity and Probability:
Probability: what is the likelihood of the risk being realised?
Risk Prioritisation: Function/Requirement Risks

Pedigree:
1 - identical design in long field service
3 - identical design in development units / similar design in long-term service
9 - new design
(If pedigree = 1, stop scoring and move on to the next risk.)

Testing:
1 - full representative test
3 - read-across tests with good sample size / limited tests / verification
9 - zero testing / small sample size

Analysis:
1 - full capable analysis
3 - preliminary analysis
9 - no analysis / not capable of analysis

Example (clamp plate, rotor wound main assembly; type of risk: function, i.e. what does it do? e.g. insulating, protecting): preliminary analysis (Preliminary Analysis Report; evidence: Stress Report 1234) suggests the design meets requirements, therefore Analysis = 3.
Probability:
1 - low
3 - medium
9 - high

Example (part: clamp plate, rotor wound main assembly):
Description of risk: clamp plate is too weak to clamp.
P = 9, T = 9, A = 3; RPN1 = 9 x 9 x 3 = 243; PR = 3.
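The worked example multiplies the 1/3/9 scores for Pedigree (P), Testing (T) and Analysis (A) into a first risk priority number, RPN1 = P x T x A. A minimal sketch of that calculation, assuming the 1/3/9 scale shown on the slides:

```python
VALID_SCORES = {1, 3, 9}

def rpn1(pedigree, testing, analysis):
    """First risk priority number: product of the three evidence scores,
    each on the slides' 1/3/9 scale."""
    for score in (pedigree, testing, analysis):
        if score not in VALID_SCORES:
            raise ValueError("scores must be 1, 3 or 9")
    return pedigree * testing * analysis
```

For the clamp plate example, `rpn1(9, 9, 3)` gives 243, matching the slide.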
Example continued (part: clamp plate, rotor wound main assembly; type of risk: function, i.e. what does it do? e.g. insulating, protecting):
P = 9, T = 9, A = 3; RPN1 = 243; RPN2 = 6561; Project Risk Score = 27.
Risk plan tracking fields (filled in by the project manager): who, planned closure date, actual closure date, status.
Example entries:
David Bonnieman, planned closure 15/01/02, status: open.
Peter Dunkz, action: monitor performance of clamp plate when the next five development units are tested, planned closure 15/01/02, status: open.
Chart: RPN2 expected trend over time.
Risk Management
Project Reviews
Regular reviews are key to successful projects and should be included in the project schedule.
There are several levels of review: milestone or tollgate reviews, weekly reviews and daily reviews.
Project team
Project business case
Project objective
Project plan (Gantt chart)
Document control systems
Risk reduction plan
DMADV
Measure
DMADV - Measure
Goals:
Output:
Prioritized CTQs
Measure: Tools
Understand Voice of the Customer, Translate VOC needs into requirements (CTQs), Prioritize CTQs, Reassess risk
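The slides call for prioritising CTQs but do not show a specific method. One common, simple approach (an assumption here, not the course's own tool) is to rank CTQs by a weighted score, e.g. customer importance multiplied by expected impact. A sketch with hypothetical CTQ names and ratings:

```python
# Hypothetical CTQs: name -> (customer importance 1-5, expected impact 1-5).
ctqs = {
    "battery life": (5, 4),
    "start-up time": (3, 5),
    "case weight": (2, 4),
}

def prioritise(ctqs):
    """Return CTQ names sorted by importance * impact, highest first."""
    return sorted(ctqs, key=lambda c: ctqs[c][0] * ctqs[c][1], reverse=True)

print(prioritise(ctqs))
```

The output of the prioritisation then feeds the "reassess risk" step: high-priority CTQs with weak evidence are the ones to worry about first.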
DMADV
Analyse
DMADV - Analyse
Identify Key Functions, Prioritize Functions, Generate Concepts, Evaluate and Analyze, Concept Review
Generate Concepts
Concepts are generated using two approaches:
Creative idea-generation techniques that focus
on analogy, connections, extrapolations and
creative visualization to develop new ideas
Benchmarking techniques that study similar
designs in competing and non-competing
businesses
Design Review
Process for objectively evaluating the quality of a
design at various stages of the design process
Opportunity for voices external to the design team
to provide feedback on the design as the product
or service is being developed
Helps to ensure that the design will satisfy
customers, and that the design process will
function effectively to produce a high quality
product or service
Design for X
DMADV
Design
DMADV - Design
High Level Design, then Detailed Design
Concept design, high-level design, detailed design, with redesign loops back to earlier stages where needed.
Outputs:
Tested high level design
Tested detailed design
Plans for process control
Completed design reviews
Design: Tools
QFD
Simulation
Rapid prototyping
Weibull analysis
SPC and process capability
Detailed design scorecards
FMEA
Reliability testing and qualification testing
Design reviews
Tollgate review
The pre-pilot detailed design tollgate review
focuses on:
Developed design
Completed FMEA/simulation analysis
Design solutions for vulnerable elements
Organizational Change Plan updates
Process management system variables
Process management system details
DMADV
Verify
DMADV - Verify
Plan the Pilot, Conduct and Evaluate the Pilot, Implement, Close Project
Build a prototype
Pilot test the prototype
Conduct design reviews using design scorecards
Decide if the process is meeting business
objectives
Close DMADV project
Transfer lessons learned from the project
Outputs:
Working prototype with documentation
Plans for full implementation
Process owners using control plans to measure, monitor and
maintain process capability
Project closure and documentation completed
Ownership transition from sponsor to operations
management, and from design team to process management
team(s)
Lessons learned
Completion Checklist
Completed project documentation that
summarizes results and lessons learned
Recommendations (supported by updated
information, if possible) for the next generation of
this design
Plans for (or results from) communicating your
achievements to the rest of the organization
Plans for celebrating your success
Advantages of DFSS
DFSS moves the organisation from reactionary to proactive, preventing problems before they occur.
DFSS Summary