
Six Sigma
DMAIC Quick Guide

Six Sigma Team
3M Germany

All rights, including translation into foreign languages, reserved.


No part of this work may be reproduced in any form, even for teaching
purposes, or processed and duplicated by the use of electronic media,
without prior written permission.
Copyright 2003 3M Deutschland GmbH

Introduction
The systematic, targeted implementation of the Six Sigma corporate initiative is determined to a very large extent by mastery and project-related application of the tools and quality techniques of the DMAIC methodology. The appropriate training for Champions, MBBs, BBs and GBs provides the necessary knowledge platform for this.
In response to numerous requests, and on the basis of the knowledge acquired by the trainer teams, we have condensed the extensive training material down to its essential content and, by the use of examples, created understandable aids to interpretation.
We hope that this tool assists you within the scope of your project responsibility.
D. Kuncar

Dr. Ertürk

E. Spenhoff

DEFINE
1. Project Charter

MEASURE
2. Process Map
3. Cause & Effect Matrix
4. Measurement System Analysis
5. Process Capability Analysis

ANALYZE
6. Failure Mode and Effect Analysis
7. Multi-Vari Analysis

IMPROVE
8. Design of Experiment

CONTROL
9. Control Plan

User Instructions
How do I make use of this brief DMAIC manual?
As a clearly arranged reference work, it provides swift access to
terms and parameters used in the context of the DMAIC
methodology. In discussions and project reviews it can be very helpful
for recalling or checking on concepts and valuation limits in the course of
assessing test results.
The brief DMAIC manual is arranged in line with the structure and chronology of DMAIC. The stages of the DMAIC process and tools, as identified in the margins of the pages, quickly show where the explanatory notes
tie in. In addition, each stage of DMAIC is colour-coded.
The parameters and evaluation criteria serve to evaluate the results from each successive phase of projects. They are
supplemented with explanations of terms where necessary.
Concrete examples to clarify the subject in detail or a graphical presentation providing orientation in cases of complex
relationships (e.g. multi-vari analysis) are given under the heading
Example.
Practical knowledge that can be used to facilitate work is listed under
Useful Hints.
Under Lessons Learned you can make your own notes.
The Pitfalls are cautionary notes based on difficulties and
problems encountered during the course of projects.
The Review Check Questions are listed to enable the reader to conduct an independent check in advance of a review and to ensure that a
list of questions is readily available during the review.
A glossary of terms frequently used in Six Sigma is given at the end of
the book.

DEFINE
Project Charter

Project Charter
Purpose
The purpose of the Project Charter is to initiate a Six Sigma project by defining its scope and the project variables. The Project Charter is the basis from which Black Belts and Green Belts work, in that it clearly defines the project.

Measurements and Evaluation Criteria

Baseline:
The Baseline is the actual performance of the process that is to be improved.

Entitlement:
Entitlement is the best possible performance that has been observed recently or in a benchmarking study. It can also be the performance defined during development or by the equipment manufacturer.

Metrics:
COPQ (Cost of Poor Quality): COPQ measures the documented costs of mistakes or errors, such as complaint costs, rework costs, waste, etc.

Cpk (Process Capability Index): The Cpk measures a process variable's short-term performance. It requires specification limits to be defined; target values alone are not sufficient to calculate a Cpk. A Cpk of less than 1 indicates room for improvement, whereas a Cpk greater than 1.33 indicates only limited room for improvement.

Ppk (Process Performance Index): The Ppk measures a process variable's long-term performance. It requires specification limits to be defined; target values alone are not sufficient to calculate a Ppk. A Ppk of less than 1 indicates room for improvement, whereas a Ppk greater than 1.33 indicates only limited room for improvement. A Ppk greater than 1.5 fulfils the 6σ requirement.

DPU (Defects per Unit): The DPU is a general measure of the number of errors per unit. Here "error" is a synonym for any unwanted event (e.g. unanswered telephone calls per day).

DPMO (Defects per Million Opportunities): In contrast to the DPU, the DPMO provides comparable values for attributes of differing complexity. The different complexities are weighted via the number of Opportunities (types of errors).

RTY (Rolled Throughput Yield): The RTY gives a process's actual yield based on the yields of the individual process steps.

Conditions
The Project Ys should, wherever possible, be measured with one of these metrics.
The Entitlement must be verifiable; wishful thinking or estimates are not helpful.
Project goals should be defined so that the Gap (Entitlement minus Baseline) is closed by 75%.
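The metrics above can be computed with a few elementary formulas. The following Python sketch is purely illustrative (it is not part of the original guide, and the defect counts and opportunity figures are invented); it shows DPU, DPMO and RTY side by side, using the RTY example from the glossary (0.98 x 0.90 x 0.95 = 0.84).

```python
# Illustrative sketch (hypothetical numbers): DPU, DPMO and RTY.

def dpu(defects, units):
    """Defects per Unit: average number of defects observed per unit."""
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    """Defects per Million Opportunities: normalises by the number of defect
    opportunities, making products of different complexity comparable."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def rty(step_yields):
    """Rolled Throughput Yield: product of the first-pass yields of all steps."""
    result = 1.0
    for y in step_yields:
        result *= y
    return result

if __name__ == "__main__":
    print(f"DPU  = {dpu(defects=30, units=1000):.3f}")                              # 0.030
    print(f"DPMO = {dpmo(defects=30, units=1000, opportunities_per_unit=12):.0f}")  # 2500
    # RTY example from the glossary: 0.98 * 0.90 * 0.95 = 0.84
    print(f"RTY  = {rty([0.98, 0.90, 0.95]):.2f}")                                  # 0.84
```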

Example: Goal Tree

[Figure: Goal Tree linking the Corporate Ys (Growth, Cost, Cash, Customer Satisfaction) through Business Ys and Process Ys (e.g. Market Penetration of Product A, New Market Development, Manufacturing Cost, Days Sales Outstanding, Yield (RTY), Downtime, Energy Cost, Number of Dead SKUs, Storage Time, Inventory Level) down to the Project Ys.]

[Figure: Project selection. Potential projects from the Hopper are prioritised into a project list. If the solution is clear and simple, the topic is treated as a Quick Hit. Topics that are not easy and not clear (something is broken, insufficiently defined, or skills are missing) are routed either to other initiatives (driven by the initiative, techniques known, little engineering, change of culture or rules) or to Six Sigma projects (complex relationships, process oriented, reduce dispersion/mean error, remove errors, analysis required).]

Pitfalls
Projects with too many measurements become problematic.
Projects take too long; they should be able to be completed in 4 to 6 months.
Project complexity is too great because too many points need to be clarified (scope/boundaries definition).
Even when the solution seems apparent, the DMAIC method should still be used.
A project is only feasible if a measurable Project Y exists.
A goal should be expressed as "change the Project Y from __ to __"; general expressions such as "a reduction of __%" are not sufficiently precise.
The boundary to just-do-it projects is unclear (e.g. reduction of manufacturing cost through replacement of raw materials; better: reduction of raw material costs).
The process to be improved is unknown or cannot be influenced.
Projects which plan to incorporate new methods are unsuitable (e.g. introduction of new software).

Tips
Measurements can be primary, secondary or contrary.

Examples        Primary            Secondary                 Contrary
Manufacturing   RTY                waste disposal cost       productivity
Inventory       inventory level    rented warehouse space    ability to deliver

Include project goals for all measurements. These goals can be revised during the project.
The Entitlement for complaints should always be set to zero.
A Stakeholder Analysis helps to find the right Champion.
The Scope should be large enough to encompass all points that have a direct input/influence on the project.

Review Checklist
1. What is the expected business impact?
2. Has a Goal Tree been developed linking the Project Y to the Business Unit Y and the Corporate CTX?
3. Has the potential savings benefit been identified and validated?
4. What is the business impact to date?
5. What is the problem and where is it focused?
6. Has this been worked on before and how are you leveraging that effort?
7. What are the boundaries of the process?
8. Is the scope meaningful and manageable?
9. Is the defect definition clear and concise?
10. How do we currently measure the Project Y?
11. Do you have Baselines established?
12. How was Entitlement determined?
13. What percent of the Entitlement Gap is being targeted by this project?
14. Who is on the team and who is championing the effort?
15. How effective is the team and the facilitator (Project Leader)?
16. What actions were taken to correct for any issues uncovered?
17. What barriers to success exist?
18. Has a Stakeholder Analysis been done?
19. Have Influence Strategies been developed?

Lessons learned


MEASURE
Process Map
Cause & Effect Matrix
Measurement System Analysis
Process Capability Analysis


Process Map
Purpose
The Process Map is a simple process diagram used to identify the Input Xs (influences) and Output Ys (Process Ys, result variables). The process diagram serves as the guideline for the project.

Measurements and Evaluation Criteria
The measurements are the process steps, the Input Xs and the Output Ys.
Process Steps are separated into value-creating and non-value-creating steps (potential for re-engineering). The Process Steps are activities and should therefore be described by verbs. The process should be broken down into 4 to 8 process steps.
Controlled Inputs are controlled variables that can be used to change the Output Ys.
For example: temperature (X) changes the drying time (Y), or the number of service personnel (X) changes the response time (Y) for customer questions.
Uncontrolled Inputs are uncontrolled variables that have an effect on the Output Ys. They can be classified as Common Cause or Special Cause effects. These variables are measurable but difficult to set.
For example: room temperature (X) influences adhesion (Y), or illness (X) influences company results (Y).

Outputs are the responses or results of the process. Differentiate between the Project Ys (targets) and the Process Ys (responses that influence the Project Ys).
Critical Inputs are uncontrolled Input Xs that have a proven special-cause effect on the Output Ys. These inputs can also be non-quantifiable. They must be marked boldly on the Process Map.

Conditions
The Input Xs and Output Ys should contain nouns.
Input Xs can be Output Ys of previous Process Steps.
Attributes should be based on specific features (e.g. availability, speed, etc.) rather than the source of the feature (e.g. Maker G6) in order to qualify as Input Xs or Output Ys.
The Input Xs and Output Ys should be quantifiable wherever possible.
Include Output Ys for process, products and service.
Avoid monetary Output Ys wherever possible.

Example: Process Map

[Figure: Process Map of an order-fulfilment process with the process steps Accept order, Plan order, Set up machine and Produce products. Each step lists its Inputs/Controls, classified as C (controlled, e.g. process parameters, production rate, set-up time, settings, materials) or U (uncontrolled, e.g. order amount, delivery date, payment method, availability, lot number), and its Outputs (e.g. confirmed delivery time, production date, production date attainment, run time, product attributes).]

Tips
A cause-and-effect diagram (Fishbone, Ishikawa diagram) can be helpful for defining Inputs by separating them into the categories Person, Machine, Material, Measurement, Method, Management and Environment.
The Process Steps are activities, so use verbs.
Define the Process Steps first, followed by the Outputs, then define the necessary Inputs.
Use features (e.g. speed) rather than feature sources (e.g. Maker G6) for the Input Xs or Output Ys.
Examples of feature sources, features and feature attributes:

Feature source   Feature (variable X or Y)   Feature attribute (attainment x or y)
Person           Weight                      Measurement (e.g. 80 kg)
                 Height                      Measurement (e.g. 183 cm)
                 Eye colour                  Category (e.g. blue)
                 Sex                         Category (e.g. female)
                 Intelligence                Rating (e.g. IQ 120)
Coater           Speed                       Measurement (e.g. 2 m/s)
                 Availability                Measurement (e.g. 50%)
                 Downtime                    Value (e.g. 10 breakdowns/month)
                 Deviation                   Value (e.g. 10000 EUR)
Forecast         Availability                Category (e.g. no)
Packing list     Completeness                Category (e.g. no)
                 Machine readability         Category (e.g. yes)
Sales            Sales method                Category (e.g. distributor)
                 Customer type               Category (e.g. end user)

Pitfalls
The process has been defined in too much or too little detail.
Inputs are feature sources and not measurable features.
The Process Map depicts wishful thinking rather than reality.
Experienced team members are not available.


Review Check List
1. Who helped to develop the map and what organizations do they represent?
2. Does the Process Map reflect the current state or the desired process?
3. Are all non-value-added steps included?
4. What did you learn from the Process Map?
5. What Quick Hits did you find from this effort?
6. What Process Steps does the team feel can be eliminated or combined to reduce opportunities for scrap and increase rate?
7. How will you measure the Key Inputs and Key Outputs?
8. What characterizes an Uncontrolled and a Controlled variable?

Lessons learned


Cause & Effect Matrix

Purpose
The C&E Matrix helps to select the important Input Xs (controlled and uncontrolled inputs) from all of the Input Xs in the Process Map. Selection is based on the strength of the effect that the Input Xs are thought to have on the Output Ys.

Measurements and Evaluation Criteria
The weighting of the Output Ys (Rating of Importance to Project) should be in the range of 1 to 10. The Outputs of the C&E Matrix are used as selection criteria; a slight difference in weighting will not lead to a useful selection of the Inputs.
The evaluation of the effect of the Input Xs on the Output Ys is carried out according to the following table:

Evaluation   Comment
0            No effect
1            Weak effect on dispersion
3            Medium effect on dispersion and position
9            Strong effect on dispersion and position

The weighted sum of the evaluations serves as a filter based on the Pareto principle. A rule of thumb: the 5 to 20 largest weighted sums are used to choose the Input Xs that will be carried over into the FMEA.
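As an illustration of the weighted-sum filter described above (a minimal sketch, not from the original guide; the outputs, importance weights and 0/1/3/9 ratings below are invented), the weighted sum of an Input X is the sum over all Output Ys of the importance rating multiplied by the effect rating:

```python
# Hypothetical C&E Matrix: weighted sums used to rank the Input Xs.

output_weights = {"Density": 7, "Hardness": 5, "Particle size": 3}   # importance 1-10

# Effect of each Input X on each Output Y, rated 0 / 1 / 3 / 9.
ratings = {
    "Viscosity":      {"Density": 9, "Hardness": 3, "Particle size": 1},
    "Drying time":    {"Density": 3, "Hardness": 9, "Particle size": 3},
    "Granulate size": {"Density": 1, "Hardness": 1, "Particle size": 9},
}

def weighted_sum(input_ratings):
    """Sum over all Output Ys of importance weight x effect rating."""
    return sum(output_weights[y] * r for y, r in input_ratings.items())

ranked = sorted(ratings.items(), key=lambda kv: weighted_sum(kv[1]), reverse=True)
for name, r in ranked:
    print(f"{name:15s} {weighted_sum(r):4d}")
# The 5 to 20 Input Xs with the largest sums would be carried over into the FMEA.
```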

Conditions
All Input Xs from the Process Map are evaluated in the C&E Matrix.
No more than 10 Output Ys should be used.
The Output Ys should be Process Ys and Project Ys.

Example: C&E Matrix

[Figure: the Input Xs of the Process Map (AI content, viscosity, solvent content, duration, formulation, solvent/particles, granulate size, drying time, recycling content, vortex efficiency, air content, assigned to the process steps melting and grinding) are rated 0/1/3/9 against the weighted Output Ys (density, hardness, particle size, torsion). The resulting weighted sums (from 177 down to 87) are sorted in a Pareto diagram and used to select the Input Xs that are carried over into the FMEA.]

Tips


The ideal number of Output Ys is between 3 and 7.
In the selection of the Output Ys it may be helpful to look at Process Ys and other useful values (e.g. a counterbalance: if you want to reduce the changeover time in manufacturing, you must ensure that warehouse inventory does not increase).
If the C&E Matrix is used properly, the FMEA will be easier.
A Pareto diagram helps to visualise the priorities.
When there are more than 150 Input Xs, start by evaluating the Process Steps and then evaluate the Input Xs in more detail.



Pitfalls
It is wrong to ask whether the Input Xs and Output Ys correlate; it is better to ask how strong the effect of X is on Y. Correlation does not imply causality.
For selection purposes in the C&E Matrix, only Process Ys serve as the customer requirements.
If several Output Ys are listed, they can collectively carry a higher weighting than a single more important Output Y.
Do not force customer-related Output Ys into the C&E Matrix if they are not helpful for the project.

Review Check List
1. Who provided inputs to the Customer Requirements for this C&E Matrix?
2. What groups determined the relationship ratings for the Inputs and Outputs?
3. What method was used to determine the final relationship score?
4. What surfaced as the top Input Variables from the C&E Matrix? Do these make sense?
5. What actions are being taken on the top-ranked Input Variables?
6. Are there any Quick Hits that can be assigned to lower-ranking Input Variables?
7. Do the current Control Plans reflect the need to monitor these top Process Input Variables?

Lessons learned


Measurement System Analysis


Purpose
The MSA assesses the ability of a system to set up a process correctly (Input Xs) and to evaluate the process (Output Ys). The MSA is therefore an important prerequisite for setting up and evaluating processes.


Measurements and Evaluation Criteria
Validity is a check on whether the correct aspect of a process is being measured. The data may come from a trustworthy method or source and yet not relate to the project's goals.
Accuracy is based on the trueness and precision of the data.
In the transactional processes of business data, accuracy is evaluated by carrying out audits. If the data are generated by measurement systems, a Gage R&R analysis (Repeatability and Reproducibility) is carried out. If the data are generated by a qualitative evaluation, an attributive MSA is used. The Gage R&R analysis and the attributive MSA are assessed according to the following table:

Variable MSA    Attributive MSA   Classification
R&R < 10%       Kappa > 0.90      good
R&R < 30%       Kappa > 0.70      acceptable
R&R > 30%       Kappa < 0.70      unacceptable


Conditions
To audit business data, audit documents must be produced to show the relevance and correctness of the data. Relevance must always be checked, especially for relative values such as square metres per time unit or per material, etc.
In the Gage R&R analysis, repeated measurements on the same part must be possible.
When destructive test methods are used, homogeneous material must be used for the Gage R&R analysis; in this case the %R&R value may be up to 50%.
The attributive MSA can be used only on rated (assessed) data, not for measurements of defective parts or numbers of defects.


Example: Measurement System Analysis

[Gage R&R analysis diagram: the observed process dispersion is made up of the process dispersion (short-term and long-term) and the measurement dispersion (R&R).]

The measurement error is decomposed into a systematic part (trueness, linearity over the measurement range, stability over time) and a random part (repeatability: influence of the measurement equipment; homogeneity of the dispersion; reproducibility: influence of the operator). Trueness, linearity and stability form part of the regular calibration and can therefore be left out of the analysis. Homogeneity is covered by the different parts (different measurement points) and is part of the reproducibility study.

Source              StdDev (SD)   Study Var (5.15*SD)   %Study Var (%SV)
Total Gage R&R      0.10129       0.52163                 9.21
  Repeatability     0.03944       0.20312                 3.59
  Reproducibility   0.09329       0.48046                 8.49
    Tester          0.00000       0.00000                 0.00
    Tester*Parts    0.09329       0.48046                 8.49
Part to part        1.09463       5.63736                99.57
Total Variation     1.09931       5.66144               100.00

Difference between %P/T and %R&R
The %P/T quotient relates the measurement error to the tolerance, whereas the %R&R quotient relates the measurement error to the total observed dispersion.
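The %Study Var, %R&R and %P/T figures in the example can be reproduced with a short calculation. The sketch below is illustrative only: it uses the Gage R&R and Total Variation standard deviations from the table above, while the specification limits are assumed values (they are not given in the original).

```python
# Illustrative calculation of %Study Var (%R&R) and %P/T from standard deviations.

sd_gage  = 0.10129   # Total Gage R&R standard deviation (from the example table)
sd_total = 1.09931   # Total Variation standard deviation (from the example table)
k = 5.15             # study variation factor (covers 99% of a normal distribution)

study_var_gage  = k * sd_gage
study_var_total = k * sd_total

pct_rr = 100 * study_var_gage / study_var_total   # measurement error vs. total dispersion
print(f"%R&R (%Study Var) = {pct_rr:.2f}%")       # ~9.21%, i.e. < 10%: 'good'

# %P/T relates the measurement error to the tolerance instead of the total dispersion.
usl, lsl = 20.0, 10.0                             # assumed specification limits
pct_pt = 100 * study_var_gage / (usl - lsl)
print(f"%P/T              = {pct_pt:.2f}%")       # ~5.22% with the assumed tolerance
```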

Tips
To validate data from databases (e.g. EUROMS), use known test inputs or compare with the original data (e.g. delivery note).
Validate data from databases through the audit documents of continuously conducted audits.
A plausibility check is useful for large amounts of data, whereby the data for each variable are sorted by size. It is important to return the data to their original order after the check.
The Gage R&R analysis and the attributive MSA require precisely defined test conditions.


Pitfalls
It is difficult, but usually not impossible, to carry out a Gage R&R analysis in transactional projects.
The attributive MSA is used for any attributive tester data.
The chosen units in the Gage R&R analysis may not be representative of the true process variability because the choice of units was not random (e.g. 10 units from one jumbo rather than from several jumbos, 10 units from one run rather than from several runs).
The measuring equipment does not have a high enough resolution.

Review Check List
1. What are the major sources of measurement error?
2. Did you conduct an MSA on the Project Ys and all critical Input Xs?
3. How much measurement error exists in comparison to the process variation (%R&R)?
4. How much measurement error exists in comparison to the specifications (%P/T)?
5. Is the measurement system acceptable for the process improvement efforts?
6. If not, what actions do you suggest?
7. How were the samples chosen?
8. Was the team sensitive to sub-grouping for sample selection?

Process Capability Analysis

Purpose
The PCA evaluates the capability of the process to meet the customer requirements (Output Ys, CTQ).

Measurements and Evaluation Criteria
The process capability is described by six values (variable: Cp, Cpk, Pp, Ppk; attributive: DPU, DPMO).

Short term       Long term        Sigma    DPU          DPMO in ppm
Cp      Cpk      Pp      Ppk      Level
2.00    1.50     2.00    1.50     6.00     0.0000034    3.4
1.83    1.33     1.83    1.33     5.50     0.0000317    31.7
1.67    1.17     1.67    1.17     5.00     0.0002327    232.7
1.50    1.00     1.50    1.00     4.50     0.0013500    1350.0
1.33    0.83     1.33    0.83     4.00     0.0062097    6209.7
1.17    0.67     1.17    0.67     3.50     0.0227501    22750.1
1.00    0.50     1.00    0.50     3.00     0.0668072    66807.2
0.83    0.33     0.83    0.33     2.50     0.1586553    158655.3
0.67    0.17     0.67    0.17     2.00     0.3085375    308537.5
0.50    0.00     0.50    0.00     1.50     0.5000000    500000.0

The capability indices marked in red (in the colour-coded original table) can be improved with the tools of the DMAIC process. For the capability indices marked in yellow, the DFSS toolset may be necessary. No improvement is necessary for the capability indices marked in green.
If the Cp or Pp of a variable feature is higher than the Cpk or Ppk, the process is not centred (target deviation). If Cp, Cpk, Pp and Ppk are approximately equal but less than 1.33, the process is not mastered (common causes). If Cp and Pp, or Cpk and Ppk, differ greatly from each other, process failures are present (special causes).
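The columns of the table are linked through the normal distribution: the long-term Ppk corresponds to a one-sided distance of 3 x Ppk standard deviations to the nearest specification limit, the sigma level adds the conventional 1.5-sigma shift, and the DPMO is the corresponding tail probability. The following sketch is illustrative only (it assumes SciPy is available) and reproduces a few rows of the table.

```python
# Reproduce rows of the capability table: Ppk -> sigma level and DPMO.
from scipy.stats import norm

def sigma_level(ppk):
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    return 3.0 * ppk + 1.5

def dpmo(ppk):
    """Long-term one-sided defect rate in parts per million."""
    return norm.sf(3.0 * ppk) * 1_000_000

for ppk in (1.50, 1.00, 0.50, 0.00):
    print(f"Ppk = {ppk:4.2f}  sigma level = {sigma_level(ppk):4.2f}  "
          f"DPMO = {dpmo(ppk):9.1f}")
# Ppk = 1.50 -> sigma level 6.00, DPMO 3.4
# Ppk = 1.00 -> sigma level 4.50, DPMO 1350.0
# Ppk = 0.50 -> sigma level 3.00, DPMO 66807.2
# Ppk = 0.00 -> sigma level 1.50, DPMO 500000.0
```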

Conditions
The data must be independent to carry out a capability analysis.
The residuals of measurable variables must be approximately normally distributed.
If the data are not normally distributed because of instabilities such as exceptions (outliers), then the lack of stability is the problem and the non-normality is an aspect of this problem.
If the data are stable but not normally distributed, they need to be transformed (e.g. with a logarithmic transformation).

Tips
To calculate the capability indices (Cp, Cpk, Pp, Ppk), the specification limits are required. These can be derived from standard deviations and a target value.
MINITAB contains a Six Pack analysis package.
For I-MR charts: if all measurement values lie within the control limits (UCL and LCL), only common causes exist (the process is stable); otherwise special causes are present (the process is not mastered).

Pitfalls
The capability indices are always too small if the data are not independent (autocorrelation, autoregression). This can be checked with MINITAB run charts.
The order of the data has been changed in some way or sorted, in which case the calculated dispersion is nothing more than guesswork.
The analysis does not provide plausible control limits because the data are bounded on one or both sides. Use the p-chart for defective proportions, the c-chart for defect counts, and the logarithmic normal distribution for measurable data.


Example: PCA Six Pack for Delivery Time

[MINITAB Six Pack output: Individual and MR chart, last 25 observations, capability histogram, normal probability plot and capability plot.]

Individual and MR chart: check that the process is stable (no special causes, i.e. no points outside the UCL and LCL limits). This process is stable (I chart: LCL = 8.937, mean = 11.66, UCL = 14.39; MR chart: LCL = 0, R-bar = 1.025, UCL = 3.349).
Normal probability plot: check whether the process is normally distributed (the points lie on the reference line). This process tends to a normal distribution.
Capability plot: graphical comparison of the short- and long-term dispersion relative to the specification. Within: StDev = 0.908687, Cp = 0.55, Cpk = 0.31. Overall: StDev = 0.905299, Pp = 0.55, Ppk = 0.31.

Capability indices (Cp, Cpk, Pp, Ppk):
Fact 1: The process is not centred, since Cp (0.55) > Cpk (0.31). Target: Cp = Cpk.
Fact 2: The dispersion is about double the specified width, since Cp = 0.55. Target: Cp = 2.
Fact 3: There are no process breakdowns, since Cp = Pp and Cpk = Ppk.
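A minimal sketch of how the Six Pack indices are obtained from raw data (illustrative only; the delivery-time values and specification limits below are invented): Pp and Ppk use the overall standard deviation, while Cp and Cpk use the short-term "within" standard deviation, estimated for an I-MR chart from the mean moving range divided by d2 = 1.128.

```python
# Illustrative Cp/Cpk/Pp/Ppk calculation for individual measurements (I-MR logic).
import numpy as np

data = np.array([11.2, 12.1, 10.8, 11.9, 12.6, 11.4, 10.9, 12.3, 11.7, 11.1])  # invented
lsl, usl = 9.0, 14.0                                                            # invented specs

mean = data.mean()
sd_overall = data.std(ddof=1)            # long-term (overall) standard deviation
mr_bar = np.abs(np.diff(data)).mean()    # mean moving range
sd_within = mr_bar / 1.128               # short-term estimate (d2 = 1.128 for n = 2)

def indices(sd):
    """Potential index (spec width / 6 sd) and the centring-sensitive index."""
    potential = (usl - lsl) / (6 * sd)
    centred = min(usl - mean, mean - lsl) / (3 * sd)
    return potential, centred

cp, cpk = indices(sd_within)
pp, ppk = indices(sd_overall)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}  (short term)")
print(f"Pp = {pp:.2f}, Ppk = {ppk:.2f}  (long term)")
```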

Review Check List

Initial Capability
1. What are the long and short-term process capability values?
2. How does this compare to the customer's perspective of performance?
3. Is there a significant opportunity to improve beyond current levels (i.e. how large is the gap between Cp and Ppk)?
4. What are the definitions of defects and opportunities?
5. Are the capabilities established for the Process Input Variables and Process Output Variables from the C&E Matrix?

Baseline Data
6. How much data or what time period was used to determine the baseline capability?
7. What variables were evaluated?
8. Are they stable?
9. Are there explanations for out-of-control signals?
10. How long was the process monitored to determine stability?
11. Have you captured samples of data that truly reflect the normal process?
12. What are the largest types of variation over time (shift-to-shift, day-to-day, week-to-week, etc.)?

Lessons learned


ANALYSE
Failure Mode & Effect Analysis
Multi-Vari Analysis

Failure Mode and Effect Analysis

Purpose
Potential errors in the Input Xs that would cause a process breakdown in the Process Ys are identified and evaluated. The FMEA risk analysis takes the important Input Xs identified in the C&E Matrix and evaluates the critical and non-critical values of these points.

Measurements and Evaluation Criteria
The following three parts of the FMEA help to evaluate the risk of the Input Xs on the process:
Severity evaluates the severity of the error (deviation of the Input Xs). It is rated from 1 (meaningless) to 10 (life-endangering or high costs).
Occurrence evaluates the frequency of the error. It is rated from 1 (Ppk of 1.5, i.e. 6σ level) to 10 (Ppk less than or equal to zero).
Detection evaluates where and how the error will be recognised. It is rated from 1 (the error is always detected immediately) to 10 (the error is detected only by the customer or only with great effort).
The Risk Priority Number (RPN) is the product of the three values and expresses the risk of the error on a scale from 1 to 1000. Risks with an RPN < 125 can be ignored; all other RPNs must be taken into account.
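A minimal sketch of the RPN arithmetic (illustrative only; the failure modes and ratings are invented): the product of Severity, Occurrence and Detection is compared against the threshold of 125 mentioned above.

```python
# Hypothetical FMEA rows: RPN = Severity x Occurrence x Detection, threshold 125.

fmea_rows = [
    # (Input X / potential failure,        Severity, Occurrence, Detection)
    ("Drying time too short",                     7,          5,         6),
    ("Operator insufficiently qualified",         8,          3,         4),
    ("Viscosity out of range",                    4,          2,         3),
]

for failure, sev, occ, det in fmea_rows:
    rpn = sev * occ * det
    action = "define corrective measures" if rpn >= 125 else "can be ignored"
    print(f"{failure:38s} RPN = {rpn:4d} -> {action}")
```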

Conditions
The FMEA must be carried out by people with expertise in the process.
The effects of the error must relate back to the defined Process Ys.
All Input Xs carried over from the C&E Matrix are included in the FMEA table.
If there is a high RPN, define corrective measures and keep Quick Hits in mind.

[Diagram: the Input Xs selected via the weighted sums of the C&E Matrix are carried over into the FMEA table (Inputs, Failure, Effects, RPN, Measures).]

Example: FMEA

The columns of the FMEA table and the questions behind them:
Process Step / Input: What is the process step or Input under investigation?
Potential Failure Mode: In what ways does the Key Input go wrong?
Potential Failure Effects: What is the impact on the Key Output Variables (Customer Requirements) or on internal requirements?
Severity (1-10): How severe is the effect to the customer?
Potential Causes: What causes the Key Input to go wrong?
Occurrence (1-10): How often does the cause or failure mode occur?
Current Controls: What existing controls and procedures (inspection and test) prevent either the cause or the failure mode? The entry should include an SOP number.
Detection (1-10): How well can you detect the cause or the failure mode?
RPN = Severity x Occurrence x Detection.
Actions Recommended: What actions reduce the occurrence of the cause or improve detection? Actions should be defined only for high RPNs or easy fixes.

[Table: combinations of high/low ratings for Severity, Occurrence and Detection with the resulting situation and recommended action, ranging from the ideal situation / assured mastery (all ratings low: no action) and failures that do not reach the user (no action), through failures that reach the user (address controls) and frequent, detectable, costly failures (process improvement), up to frequent failures that reach the user (improve detection first, then process improvement) and frequent failures with major impact ("Big trouble!": all hands on deck).]

Useful Hints
For each Input X, several potential failures can be determined.
Potential failures should be defined in such a way that the measurable input value can be described as too small/low, wrong, incomplete, or too early/late.
List no more than two potential failures for each Input X; otherwise the Input X is not correctly defined. No additional Input Xs may result.
A failure tree is helpful for structuring and identifying cause-effect relationships.
Equivalent failure sequences have the same severity rating.
Corrective actions to minimise risks should be prioritised.
Rating scales help to assess Severity, Occurrence and Detection.
The FMEA analysis initiates further projects (e.g. GB projects).
The FMEA provides potential Input Xs for the Control Plan.

Pitfalls
Wrongly defined Input Xs, e.g. a non-quantified noun without features, lead to many useless potential failure modes. Example:
Wrong input: Operator. Potential failures: Operator is too small, too tall, demotivated, etc.
Correct input: Operator's qualification. Potential failure: insufficiently qualified.
The causality (observation window) between failure, cause of failure and failure effect must remain on one level and must not be changed.


Review Check List
1. Who helped develop the FMEA and what organizations do they represent?
2. Which items from your C&E Matrix did you evaluate in the FMEA tool?
3. Does the tool reflect the current state or the desired process?
4. What type of ranking system did you use?
5. What Quick Hits did you find from the FMEA?
6. Did you complete the actions recommended section of the tool?
7. Do all actions in your FMEA have responsibilities assigned and a completion date identified?
8. Have you updated your Control Plan with what you know so far?

Lessons learned


Multi-Vari Analysis

Purpose
For the purpose of screening the most important inputs, the correlations of the critical Input Xs derived from the FMEA with the Project and Process Ys are ascertained. Any correlation identified does not necessarily signify a cause-effect relationship (dependency). Special attention is paid to possible uncontrolled Input Xs (noise variables).
For the analysis, existing historic data are used if available (passive approach).

Parameters and Evaluation Criteria
The simple correlation coefficient (r) is a statistical value for quantifying the dependency between two variables. Negative correlation signifies that increasing the input variable reduces the output value; positive correlation causes the output value to rise.

|r|                 Remarks
0.00 to < 0.10      No or negligible correlation
0.10 to < 0.30      Weak correlation
0.30 to < 0.70      Medium correlation
0.70 to < 0.90      Strong correlation
0.90 to < 0.99      Very strong correlation
0.99 to < 1.00      Functional correlation

A null hypothesis is confirmed or rejected on the basis of the p-value. The p-value is the probability of obtaining the observed result (or a more extreme one) if the null hypothesis is true. Rejection of the null hypothesis indicates a significant finding.

p-value              Remarks
p > 0.10             No significance
0.10 > p > 0.05      Weak significance
0.05 > p > 0.01      Significant finding
0.01 > p             Highly significant finding
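As an illustration of the two tables (a sketch, not part of the original guide; the X and Y data are invented and SciPy is assumed to be available), the correlation coefficient r and its p-value can be computed and classified as follows.

```python
# Illustrative correlation analysis: r and p-value, classified per the tables above.
from scipy.stats import pearsonr

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]     # invented input data
y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.8, 8.2, 8.9]     # invented output data

r, p = pearsonr(x, y)

def classify_r(r_abs):
    limits = [(0.10, "no or negligible"), (0.30, "weak"), (0.70, "medium"),
              (0.90, "strong"), (0.99, "very strong"), (1.01, "functional")]
    return next(label for limit, label in limits if r_abs < limit)

def classify_p(p_value):
    if p_value > 0.10: return "no significance"
    if p_value > 0.05: return "weak significance"
    if p_value > 0.01: return "significant finding"
    return "highly significant finding"

print(f"r = {r:.3f} ({classify_r(abs(r))} correlation)")
print(f"p = {p:.4f} ({classify_p(p)})")
```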

Requirements
For the analysis, the data must be independent.
For measurable variables, the residuals must be more or less normally distributed and the spreads must be homogeneous.
The error probability (α) must be established before testing is carried out.
The data set should include no outliers.

Example: Multi-Vari Analysis

Data: discrete X, discrete Y (e.g. a 2x2 table; B1: 5, 10; B2: 80, 50)
Hypotheses: of dependence, H0: p11 = p1.·p.1; of homogeneity, H0: p1 = p2
Test: chi-square (χ²) test

Data: discrete X, continuous Y, X with 1 level
Hypotheses: of normality, H0: F(x) = F0(x); comparison with a target, H0: μ = μ0
Tests: Anderson-Darling test; 1-sample t-test (confidence interval)

Data: discrete X, continuous Y, X with 2 levels
Hypotheses: of homogeneity of variances, H0: σ1² = σ2²; of means, H0: μ1 = μ2
Tests: F-test; 2-sample t-test

Data: discrete X, continuous Y, X with 2 or more levels
Hypotheses: of homogeneity of variances, H0: σ1² = ... = σk²; of means, H0: μ1 = ... = μk
Tests: Levene test; one-way ANOVA

Data: continuous X, continuous Y
Hypotheses: of correlation, H0: r = 0; of regression, H0: b0 = 0 and H0: b1 = 0
Test: correlation and regression analysis
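For example, the row "discrete X, continuous Y, X with 2 levels" leads to a variance test and a 2-sample t-test. The sketch below is illustrative only (invented data, SciPy assumed); it uses the Levene test for the variances, as the table recommends for the multi-level case, followed by the t-test for the means.

```python
# Illustrative hypothesis tests for a discrete X with 2 levels and a continuous Y.
from scipy import stats

group_1 = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7]   # invented data
group_2 = [12.9, 13.1, 12.6, 13.4, 12.8, 13.0, 13.2, 12.7]   # invented data
alpha = 0.05                                                 # error probability, set up front

# H0: equal variances (homogeneity); the Levene test is robust to non-normality.
lev_stat, lev_p = stats.levene(group_1, group_2)
print(f"Levene: p = {lev_p:.3f} -> variances {'differ' if lev_p < alpha else 'homogeneous'}")

# H0: equal means; 2-sample t-test, pooled variances only if the Levene test passes.
t_stat, t_p = stats.ttest_ind(group_1, group_2, equal_var=(lev_p >= alpha))
print(f"t-test: p = {t_p:.4f} -> means {'differ' if t_p < alpha else 'equal (H0 retained)'}")
```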

Useful Hints
The tools of multi-vari analysis are presented above in brief tabular form: in addition to the description of the data, the null hypotheses to be tested and the planned tests are stated. Apart from these univariate tests, multiple or multivariate tests can be conducted; the methods of the general linear models (GLM) are very largely applicable.
Time effects (e.g. delayed effects, such as turns versus net manufacturing costs) must be taken into account and the data must be synchronised.
Very large partial populations (N > 1000) should be avoided because otherwise the slightest differences or correlations become statistically significant although they are not practically relevant. A sample size of n = 200 to 500 is adequate and still readily manageable.
The data should always be left in their original sequence; otherwise testing for independence is not possible.
The variables should be sorted into Input Xs and Output Ys.

Pitfalls
Correlations might also be due to common-cause correlations (body measurements; storks' nests versus births), inhomogeneous correlations (shoe size versus income) or formal correlations (solids content versus solvent content). These do not help in eliminating the causes of spread. Expert decisions have to be taken on whether correlations are causal. Pure correlations are over-interpreted as dependencies (cause-effect relationships).
Historic data are unreliable because of unknown background factors (e.g. changes in material or process).
Multi-collinearity (i.e. dependencies between the Input Xs) may seriously impair the results. If the correlations between the Input Xs are |r| < 0.3, they are deemed negligible.


Review Check List
1. How did you determine what Process Input Variables to investigate?
2. How did you statistically determine the contribution of each Process Input Variable to the overall variability observed?
3. What sample size did you use for each Process Input Variable to ensure statistical validity?
4. What was your null hypothesis?
5. What risk levels did you assume?
6. Is the project addressing a mean shift or variation reduction?
7. What type of data is the Process Output Variable? The Process Input Variables?
8. What statistical tools were used for evaluating the relationship between the Process Input Variables and the Process Output Variable?
9. What Process Input Variables contribute the most to the Process Output Variable variability?
10. Are the statistically significant variables practically significant?
11. Does your data confirm that your critical Xs from the C&E Matrix are causing the variation?

Lessons learned


IMPROVE
Design of Experiment

Design of Experiment

Purpose
The optimum settings of the selected process Input Xs are determined by experiment so that the Process and Project Ys achieve their target values. Moreover, the settings are chosen so that the process is robust against noise variables. In contrast to the multi-vari analysis, the inputs are varied in accordance with an experiment plan (active approach) and the results are recorded.

Parameters and Assessment Criteria
The parameter R² describes what percentage of the variation is explained by the function Y = f(X1, X2, ..., Xk); the rest of the variation is the residual error. A null hypothesis is confirmed or rejected using the p-value. An outcome is significant if the null hypothesis is rejected, i.e. the impact of the Input Xs is confirmed.
Estimated Effects and Coefficients for Yield

Term           Effect    Coef      SE Coef    T          P
Constant       85.507    85.507    0.05648    1514.04    0.000
Temp            3.919     1.960    0.05648      34.70    0.000
Time            4.885     2.442    0.05648      43.24    0.000
Pressure        2.975     1.488    0.05648      26.34    0.000
Temp*Time      13.840     6.920    0.05648     122.53    0.000
Temp*Press.     0.238     0.119    0.05648       2.10    0.065
Time*Press.    -7.219    -3.609    0.05648     -63.91    0.000

The Lack of Fit shows whether the regression model is adequate. It is the most important measure for testing how well the model fits the data. Rejection of its null hypothesis requires a higher-order model.

Analysis of Variance for Yield

Source            DF    Seq SS     Adj. SS    Adj. MS    F        P
Main Effects       3    192.29     192.286     64.095    1E+03    0.000
Interactions       3    974.85     974.853    324.951    6E+03    0.000
Residual Error     9      0.46       0.459      0.051
  Lack of Fit      1      0.00       0.004      0.004    0.07     0.803
  Pure Error       8      0.46       0.456      0.057
Total             15   1167.60

Conditions
It is important for the analysis that the data are independent. For measurable parameters, the residuals must be normally distributed. The experiment plan should be orthogonal, rotatable or D-optimal. Moreover, the trials must be performed in a randomised and replicated manner.

Example: Analysis & Optimization
After designing an orthogonal experiment plan, the trials are performed. The data are then analysed via regression. Contour-line graphics are used for setting the Input Xs so as to obtain an ideal value for the Output Ys.

[Contour plot of the fitted model over the Input Xs X1, X2 and X3.]

The 2³ full factorial design matrix (standard order; "+" and "-" denote the high and low factor levels, and the interaction columns are the products of the factor columns):

No.   X1   X2   X3   X1X2   X1X3   X2X3   X1X2X3
1     -    -    -     +      +      +       -
2     +    -    -     -      -      +       +
3     -    +    -     -      +      -       +
4     +    +    -     +      -      -       -
5     -    -    +     +      -      -       +
6     +    -    +     -      +      -       -
7     -    +    +     -      -      +       -
8     +    +    +     +      +      +       +

[Contour plot for the variable Yield (2^(3-0) plan, MS residuals = 0.051032): contour lines from 75 to 100 plotted over Temperature and Time.]
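A sketch of the active approach for this 2³ plan (illustrative only; the yield values are invented and NumPy is assumed): build the coded design matrix, collect the responses and estimate the coefficients and effects by regression.

```python
# Illustrative 2^3 full factorial: build the coded design matrix and estimate effects.
import itertools
import numpy as np

# Full factorial design: every combination of the coded levels -1 / +1 for X1, X2, X3.
runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Invented responses (Yield) for the eight runs, in the same order as 'runs'.
yield_ = np.array([77.1, 80.9, 81.8, 85.2, 82.4, 86.0, 87.1, 91.3])

# Model matrix: intercept, main effects, two-factor and three-factor interactions.
x1, x2, x3 = runs.T
X = np.column_stack([np.ones(8), x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3])
names = ["Const", "X1", "X2", "X3", "X1X2", "X1X3", "X2X3", "X1X2X3"]

coef, *_ = np.linalg.lstsq(X, yield_, rcond=None)
for name, c in zip(names, coef):
    effect = c if name == "Const" else 2 * c   # coded effect = 2 x regression coefficient
    print(f"{name:7s} coef = {c:8.3f}   effect = {effect:8.3f}")

# In practice the runs are randomised and replicated, and significance is judged
# from the ANOVA p-values (as in the 'Analysis of Variance for Yield' table above).
```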

Tips
Limit the experiment plan to a maximum of five Input Xs and three Process Ys.
If there is no possibility of adding trial factors later, consider the use of a Central Composite or Box-Behnken design.
Follow the rule of maximum comparability, i.e. create completely homogeneous test conditions, e.g. through block formation.
Use the Derringer & Suich procedure to optimise several Process Ys simultaneously.
Do not use experiment plans of resolution III for optimization. Use D-optimal experiment plans if the use of standard experiment plans is not possible for some reason.

Pitfalls
The defined scope of the experiment is wrong, too small, or too large.
The settings of the Input Xs (factors) were incorrectly performed or did not remain constant during the trials.
The trials were performed in standard rather than randomised order.
There were not enough repetitions; a residual degree of freedom exceeding 30 should be aimed at in every case.
The wrong model has been defined for the analysis, e.g. factors constituting percentages (which sum to 100%) cannot be analysed by means of regression constants.


Review Check List
1. What are your critical Ys?
2. What are your critical Xs?
3. How did you select the factors (Process Input Variables) to include?
4. How were the levels for each factor determined?
5. Did you complete a screening DOE?
6. Does the knowledge to be gained justify the expense of the experiments?
7. How did you determine the sample size requirements for each experimental run?
8. What risk levels did you assume when you designed your experiment?
9. What actions would you recommend based on the experimental results?
10. What did you learn from your Pilot Run?
11. Who was on your team and what were their roles?
12. How much variation in the response (Process Output Variable) was due to the factors (Process Input Variables) you investigated?
13. What significant factors and interactions exist?
14. How many full factorial designs were completed and why?
15. What methods have been utilized to optimize the process?
16. What actions have been taken to verify the results of the experimentation?
17. What changes have you made to the process?
18. How have the results been used to impact the Control Plan?
19. What is the underlying model/equation that relates the Process Input Variables to the Process Output Variable(s)?
20. Can you statistically demonstrate real change as a result of the improvements?
21. What percent of the Entitlement Gap has been closed with these improvements?
22. What next steps should be taken?

Lessons learned


CONTROL
Control Plan


Control Plan
Purpose
The documentation of the project results in the Control Plan is used for mastering the identified critical Input Xs and thus for controlling the affected processes, in order to maintain the improvement in the Project and Process Ys gained in the project over the long term.

Parameters and Assessment Criteria
The columns of the Control Plan contain the following data/information:
Process;
Process Step;
Output; Input;
Process Specifications (LSL, USL, Target);
Capability/Date;
Measurement Technique;
%R&R and/or %P/T;
Sample Size;
Sample Frequency;
Control Method;
Reaction Plan.
These data are derived from the outcomes of the Process Map, the Process Capability Analysis (Initial Capability), the C&E Matrix and the FMEA.

Documentation Package
The Control Plan is part of a process documentation package which may be rather extensive. This project documentation should be available from the Process Owner as an information base. Established process regulations (operating or manufacturing procedures, machine procedures, etc.) are updated with the project outcome and serve as a daily work resource.

[Figure: the documentation package grouped around the Control Plan: Process Maps, FMEAs, Capability Studies, SOPs, Mistake Proofing, Reaction Plans, DOEs, Audit Plan, Training, Customer Requirements, Preventive Maintenance and Parameter Cards.]

Process Owner
The person with major responsibility for the process outcome must have the authority and capability to control the process and thus influence the result. If the improved process is controlled by several persons in their respective partial areas, each of these persons is the process owner responsible for his or her partial area.
The established management systems (e.g. Controlling, Administration, HR Management, Environmental Management System, Quality Management System, etc.) must be considered during process restructuring or development in order to avoid any build-up of parallel systems and thus confusion of authorities. The new process regulations must become part of the established systems in order to ensure sustainable improvement through signature regulations, internal audits, etc.

Example: Control Plan

[Figure: flow from the Process Map (Inputs, Process Steps, Outputs) via the C&E Matrix (Inputs, Outputs, weighted sums) and the FMEA (Inputs, Fault, Effects, RPN, Actions) to the Capability Summary Sheet (overview of the process capability: Outputs, measurement technique, R&R, Cpk, LSL, USL) and finally to the Control Plan (Process, Inputs, Outputs, measurement system, %R&R, Cpk, LSL, USL, reaction plan). The key Output Ys and the key Input Xs are analysed at each step.]

Tips
The established Quality Management System can and must be used for developing and updating the documentation.
Develop a reaction plan.
Make use of mistake-proofing systems.
The sample, audit and test plans indicate how often, where and to whom results must be reported.
Develop a training plan. The training describes all aspects of the process and the responsibilities; process improvement activities must be fully documented.
The Control Plan is regularly reviewed and updated and is kept at the process site.

Pitfalls
There is not much time at the end of the project to consolidate data and information and to develop the documentation.
The established documentation system is not known.
Additional regulations are created instead of existing ones being adapted.


Review Check List

Control Methods
1. What control methods are in place for the Critical Process Input Variables and Process Output Variables?
2. How were the control methods determined? Are they statistically valid?
3. How are the control methods identified and documented?
4. What methods exist for updating and controlling the Control Plan?
5. How was the Reaction Plan for each Process Input Variable / Process Output Variable developed?
6. Who is the Process Owner?
7. Does the Process Owner accept the results of your project and support the implementation of the proposed process changes?
8. Who is responsible for maintaining the control plans and methods?
9. Who is responsible for implementing the control scheme?
10. What training is needed and for whom?
11. Who will conduct the necessary training?
12. What confidence do you have that the changes implemented will be effective once the project is completed?
13. Has an audit plan been developed and documented in the business unit's quality management system?

Long-Term Capability
14. How long will data be taken to verify that a successful process improvement has been accomplished before closing the project?
15. What data will be taken?
16. Who will take the data?
17. Who will analyze and track the data?
18. What statistical tools will be used to verify the improvement?


SIX SIGMA GLOSSARY
August 2002

A
Analyze
3rd phase in the DMAIC process: process data have been identified
and the process is now analysed with regard to causes for mistakes (see Root Cause Analysis) and non-value-adding activities.

B
Baseline
The performance of a process at the start of the improvement processes: How has the process performed up to now?
Black Belt
Six Sigma expert (full time) leading Six Sigma projects/teams and
trained in leadership, project management, statistical methods and
problem solving strategies.
Breakthrough performance
Improvement signifying a jump to a higher performance level.
Significant reduction of the gap between baseline and entitlement
(real possible maximum performance). Leads to substantial improvement in terms of the corporate objectives growth, cost and cash.
Business Critical Y
A business unit's concrete improvement target, related to the corporate objectives (Corporate Ys) Growth, Cost and Cash.


C
Champion
An upper level business leader who facilitates the implementation
of Six Sigma projects. The Champion identifies improvement possibilities in his/her responsibility area, initiates projects, makes the
necessary resources available and supports the Six Sigma teams.
Control
Last phase of the DMAIC process.
Development of standard measures for regular performance review:
How can we sustain the new, higher performance level?
Corporate Y
Target parameter for corporate success, e.g. growth, cost, cash.
Cost of Poor Quality
Costs associated with the delivery of poor quality products or services. Examples: complaint handling, replacements, inspections
and other activities that do not generate added value.
Critical to Quality (CTQ)
Part of a process that has direct impact on the perceived quality.
Critical Y
Parameter to be improved in the Six Sigma process: What exactly
do we want to improve?


D
Defect
Any product characteristic that deviates from customer requirements.
Defects per Million Opportunities
Quality measurement parameter that is often used in the Six Sigma
process: number of observed mistakes per million mistake opportunities.
Define
First phase in the DMAIC process: definition of the problem or the
process improvement possibilities or of customer requirements. Not
static, but reviewed again and again in the course of the DMAIC
process and adjusted according to the findings.
DMAIC Process
Key Six Sigma process: Define, Measure, Analyse, Improve and
Control. Its methodology is systematic, scientific and fact-based.
Design for Six Sigma (DFSS)
Methodology similar to DMAIC with the aim of deriving new products and services. Unlike DMAIC, the DFSS methodology consists
of a variety of techniques to be combined.
Design of Experiment (DOE)
Statistical experiment planning which identifies the factors and their
optimum settings that affect the process outcome.


E
Entitlement
Actual possible maximum performance (benchmark).

F
Failure Mode and Effect Analysis (FMEA)
Technique enabling systematic analysis of possible mistakes and
assessment of risks through products and process mistakes. Helps
to select high-risk inputs in the context of Six Sigma.
Funnel Effect
Filters the key input variables out of all possible impact variables of the Project Y.

G
Gap
Gap between entitlement and baseline.
Green Belt
Has a similar function to the Black Belt but is less comprehensively trained. Dedicates part of his/her working time to coaching
on smaller or partial projects.

H
Hopper
Collection of potential Six Sigma projects.

I
Improve
Fourth phase of the DMAIC process: problem solutions are
developed, tested and implemented and roots of mistakes are
eliminated.


M
Master Black Belt
Full-time Six Sigma expert. Is responsible for implementing the Six
Sigma strategy in a region (e.g. Germany) or in a business unit. He
leads and assists several Black Belts, manages a project, and is
responsible for training and communication.
Measure
Second phase in the DMAIC process: measurement, capture and
collection of all relevant process data.
Multi-Vari Analysis
Statistical method for analysing suspected process input variables
in relation to correlations and effects on the process outputs.

P
Process Capability Index
Parameter for identifying whether a process is able to meet the
process output requirements (e.g. customer requirements).
Process Map
A step-by-step pictorial sequence of a process showing process
inputs, outputs, cycle time, decision points and activities involved
at a glance.
Process Owner
The person responsible for the process output. This person has the
authority and ability to influence and control the process.
Project Y
Parameter to be improved in a Six Sigma project (e.g. inventory
reduction, output increase of a production line, market share increase in a certain segment). Helps to achieve higher corporate
objectives.


R
Rolled Throughput Yield (RTY)
Yield of a process expressed in percentage. It is calculated by multiplying the yields of each process step. Example: The yield of the
first step is 98% of input, in the second, 90%, and in the third, 95%.
The Rolled Throughput Yield (RTY) thus amounts to 84% (0.98 x
0.90 x 0.95 = 0.84).
Root Cause Analysis
Analysis of the exact cause for a process deviating from the desired result.

S
Six Sigma
Statistical measure for the extent to which a process deviates from
perfection. Six Sigma stands for 3.4 mistakes per million mistake
possibilities (see Defects per Million).
Super Y
Clustering similar Six Sigma projects into one topic with general
significance for the company (e.g. production, inventories) with the
aim of increasing transparency and efficiency by sharing information and experience.

V
Variation
Spread (result differences) within a process. Variation is a parameter for the stability or predictability of a process. Any process improvement (in the context of Six Sigma) aims at reducing variation as far as possible.


Analyze Roadmap: χ²-Test
Independence test of two variables at two or more levels.

Define hypotheses
H0: The variables are independent (H0: pij = pi.·p.j).
HA: The variables are dependent (HA: pij ≠ pi.·p.j).
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Collect data
Randomized data collection. Direct analysis of implausible results.

Examine the independence of the variables (Minitab: Chi-Square Test)
Use the χ² independence test to review the null hypothesis and reject it if the p-value < α; the variables are dependent in that case.
The expected cell counts must be ≥ 5 for 2×2 tables, ≥ 2 for r×2 tables and ≥ 1 for r×c tables. If these conditions are not met, columns (c) or rows (r) can be combined.

Evaluate the table
The differences between the observed and the expected values show which cells cause the deviations. The χ² values of the cells are the standardised sizes of the deviations.
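The same test can be run outside Minitab. A minimal sketch with SciPy, using the 2x2 counts from the multi-vari example (B1: 5, 10; B2: 80, 50); the error probability of 0.05 is an assumed choice.

```python
# Chi-square independence test for a 2x2 table (counts from the multi-vari example).
from scipy.stats import chi2_contingency

table = [[5, 10],      # B1: counts for A1, A2
         [80, 50]]     # B2: counts for A1, A2
alpha = 0.05           # error probability, defined before testing

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
print("Expected counts:", expected.round(1))   # check the minimum-count conditions
if p < alpha:
    print("Reject H0: the variables are dependent.")
else:
    print("Retain H0: no dependence demonstrated.")
```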

Analyze Roadmap: One-Way ANOVA
Comparison of the mean values of two or more samples.

Define hypotheses
H0: All mean values are equal. HA: At least one mean value is different.
H0: All variances are equal. HA: At least one variance is different.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Collect data
Randomized data collection. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Study the homogeneity of the variances (Minitab: Test for Equal Variances)
Use the Levene test to study the null hypothesis and reject it if the p-value < α; the samples then come from different populations. If the variances and the sample sizes differ widely, the mean value test is probably not correct.

Compare the mean values (Minitab: One-Way ANOVA)
Use the one-way ANOVA to check the null hypothesis and reject it if the p-value < α; the samples then come from different populations. If the differences between the mean values are not significant, yet substantial, increase the sample size. Store the residuals.

Check the normal distribution fit (Minitab: Normality Test)
Use the Normality Test to check the normality of the residuals and reject it if the p-value < α; the residuals are then not normally distributed. Minor deviations in the tails are negligible. In the event of major deviations, the data must be transformed.

Study autocorrelation (Minitab: I-MR Chart)
Use the I-MR chart to study autocorrelation and stability. The process is disturbed if special causes or patterns exist; this result may distort the hypothesis test.
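A minimal sketch of the same roadmap in Python (illustrative only; the three groups of data are invented, SciPy is assumed, and the Shapiro-Wilk test is used in place of Minitab's normality test).

```python
# Illustrative one-way ANOVA roadmap: equal variances, ANOVA, residual normality.
import numpy as np
from scipy import stats

groups = {                                     # invented data for three settings
    "A": [10.2, 10.5, 9.9, 10.3, 10.4, 10.1],
    "B": [10.9, 11.2, 10.8, 11.0, 11.3, 10.7],
    "C": [10.4, 10.6, 10.2, 10.5, 10.7, 10.3],
}
alpha = 0.05
samples = list(groups.values())

# Step 1: homogeneity of variances (Levene test).
_, p_lev = stats.levene(*samples)
print(f"Levene p = {p_lev:.3f}")

# Step 2: one-way ANOVA, H0: all means equal.
_, p_anova = stats.f_oneway(*samples)
print(f"ANOVA  p = {p_anova:.4f} -> "
      + ("at least one mean differs" if p_anova < alpha else "H0 retained"))

# Step 3: residuals (value minus group mean) should be roughly normally distributed.
residuals = np.concatenate([np.array(v) - np.mean(v) for v in samples])
_, p_norm = stats.shapiro(residuals)
print(f"Shapiro-Wilk p = {p_norm:.3f} (normality of residuals)")
```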

Analyze Roadmap: Regression
Simple and multiple regression and correlation.

Define model and hypotheses
Define the model Y = f(X), keeping in mind that only causal inputs may be used.
H0: All regression coefficients equal zero. HA: At least one regression coefficient does not equal zero.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Collect data
Randomized data collection. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Check the correlation and regression of the variables (Minitab: Regression and Fitted Line Plot)
Check the null hypothesis and reject it if the p-value < α; the variables then impact the target value, i.e. there is a correlation between X and Y. The measures for the correlation are the correlation coefficient r and its square r². High values of r² signify high dependence but not necessarily an adequate regression model; therefore the Lack of Fit must also be checked. For this check, activate Pure Error and evaluate the ANOVA. Use the Fitted Line Plot for a first graphical analysis. Reduce the model to the significant variables and store the residuals.

Check the normal distribution fit (Minitab: Normality Test)
Use the Normality Test to check the normality of the residuals and reject the null hypothesis if the p-value < α; the residuals are then not normally distributed. Minor deviations in the tails are negligible.

Study the model and autocorrelation (Minitab: I-MR Chart and Residual Plots)
Use the I-MR chart and the residual plots to study autocorrelation and stability. The process is disturbed, or the model is not adequate, if special causes or patterns exist.
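A minimal sketch of the regression roadmap (illustrative only; the data are invented and the statsmodels package is assumed): fit the model, inspect the coefficient p-values and R², then store and examine the residuals.

```python
# Illustrative regression roadmap: fit Y = b0 + b1*X, check p-values, R^2 and residuals.
import numpy as np
import statsmodels.api as sm

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])   # invented input
y = np.array([2.3, 2.8, 3.9, 4.3, 5.2, 5.8, 6.9, 7.1, 8.2, 8.8])    # invented output

model = sm.OLS(y, sm.add_constant(x)).fit()
print(model.summary())            # coefficients, p-values, R-squared, F-statistic

# Reject H0 (coefficient = 0) if its p-value < alpha; then reduce the model to the
# significant terms and examine the stored residuals for normality and patterns.
residuals = model.resid
print("Residual mean:", round(float(residuals.mean()), 6))
```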

Analyze Roadmap: DoE
Factorial or central composite experiments.

Define model and hypotheses (Minitab: Create Factorial or Response Surface Design)
Define the factors, the experiment scope and the model type (first or second order).
H0: All regression coefficients equal zero. HA: At least one regression coefficient does not equal zero.
Evaluate the consequences of wrong decisions. Define the error probability α and the necessary sample sizes.

Collect data
Randomized trials. Direct analysis of implausible results. Secure the time sequence of the data. Do not round off the measured values.

Determine the effects of the inputs (Minitab: Analyse Factorial or Response Surface Design)
Check the null hypothesis and reject it if the p-value < α; the inputs then impact the target value. Use the Lack of Fit to check the model: activate Pure Error and evaluate the ANOVA. Reduce the model to the significant variables and store the residuals.

Graphical presentation of the model (Minitab: Factorial or Contour Plot)
Choose the factors for graphical presentation. Choose optimum, robust factor settings in accordance with the graphics; use modified settings to find optimum, robust settings for the determined factors as well.

Check the normal distribution fit (Minitab: Normality Test)
Use the Normality Test to check the normality of the residuals and reject the null hypothesis if the p-value < α; the residuals are then not normally distributed. Minor deviations in the tails are negligible.

Study the model and autocorrelation (Minitab: I-MR Chart and Residual Plots)
Use the I-MR chart and the residual plots to check autocorrelation and stability. The process is disturbed, or the model is not adequate, if special causes or patterns exist.

3M Deutschland GmbH
Carl-Schurz-Strasse 1
41453 Neuss
Phone 0 21 31/14 0
Fax 0 21 31/14 32 00

DW-0001-1187-5
