
IE 3265

R. Lindeke, Ph.D.
Quality Management in POM
Part 2

Topics
Managing a Quality System
  Total Quality Management (TQM)
Achieving Quality in a System
  Look early and often
  6 Sigma: an approach & a technique
  Make it a part of the process
The Customer's Voice in Total Quality Management
  QFD and the House of Quality
Quality Engineering
  Loss Function
  Quality Studies
  Experimental Approaches: T.M.; FMEA; Shainin

Taguchi's Loss Function

Taguchi defines the Quality Level of a product as the Total Loss incurred by society due to the failure of a product to perform as desired when it deviates from the delivered target performance levels.
This includes costs associated with poor performance, operating costs (which change as a product ages) and any added expenses due to harmful side effects of the product in use.

Exploring the Taguchi Method

Considering the Loss Function, it is quantifiable:

  Larger is Better:   L(y) = k·(1/y²)

  Smaller is Better:  L(y) = k·y²

  Nominal is Best:    L(y) = k·(y - m)²

where m is the target of the process specification
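A minimal Python sketch of the three loss-function forms above; k and m would come from the product's A₀, Δ₀ and target (defined on the following slides), and the function names are only illustrative.

```python
def loss_nominal_is_best(y, k, m):
    """L(y) = k * (y - m)^2: loss grows as y deviates from the target m."""
    return k * (y - m) ** 2

def loss_smaller_is_better(y, k):
    """L(y) = k * y^2: e.g. impurity or shrinkage, where 0 is ideal."""
    return k * y ** 2

def loss_larger_is_better(y, k):
    """L(y) = k / y^2: e.g. strength, where bigger is always better."""
    return k / y ** 2

# Example: nominal-is-best with k = 800 and target m = 8.5 (the values used in
# the worked example below); a part at 8.46 incurs a loss of 800 * 0.04^2 = $1.28
print(loss_nominal_is_best(8.46, k=800, m=8.5))
```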

Considering the Cost of Loss

k in the L(y) equation is found from:

  k = A₀ / Δ₀²

A₀ is the cost to repair or replace a product and must include the loss due to unavailability during repair
Δ₀ is the functional limit on y of a product, where it would fail to perform its function half the time

Loss Function Example (nominal is best)

We can define a process's average loss as:

  L = k·[s² + (ȳ - m)²]

s is the process (product) standard deviation
ȳ is the process (product) mean

Example cont.
A₀ is $2 (a very low number of this type!), found by estimating that the loss is 10% of the $20 product cost when a part is at exactly 8.55 or 8.45 units
Process specification is: 8.5 ± 0.05 units
Historically: ȳ = 8.492 and s = 0.016

Example cont.
Average Loss:

  L = (2 / 0.05²)·[0.016² + (8.492 - 8.500)²]
  L = 800 × 0.00032 = $0.256 per unit

If we make 250,000 units a year, the Annual Loss is $64,000

Fixing it
Shift the Mean to nominal:
  L = 800·[0.016² + 0²] = $0.2048
  Annual Loss is $51,200, about a 20% reduction
Reduce variation (s = 0.01):
  L = 800·[0.010² + 0.008²] = $0.1312
  Annual Loss is $32,800, about a 50% reduction
Fix Both!
  L = 800·[0.010² + 0²] = $0.08
  Annual Loss is $20,000, about a two-thirds reduction
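As a quick check of the arithmetic above, here is a minimal Python sketch of the nominal-is-best average loss; the numbers (k = 800, target 8.5, the three improvement scenarios) come from the example, while the function and variable names are just illustrative.

```python
def average_loss(k, s, ybar, target):
    """Taguchi nominal-is-best average loss: L = k * (s^2 + (ybar - target)^2)."""
    return k * (s**2 + (ybar - target)**2)

k = 2 / 0.05**2          # k = A0 / delta0^2 = 800
target = 8.500
units_per_year = 250_000

scenarios = {
    "current (ybar=8.492, s=0.016)": (0.016, 8.492),
    "mean shifted to nominal":       (0.016, 8.500),
    "variation reduced (s=0.01)":    (0.010, 8.492),
    "both fixed":                    (0.010, 8.500),
}

for name, (s, ybar) in scenarios.items():
    L = average_loss(k, s, ybar, target)
    print(f"{name}: ${L:.4f}/unit, ${L * units_per_year:,.0f}/year")
```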

Taguchi Methods
Help companies to perform the Quality Fix!
  Quality problems are due to Noises in the product or process system
  Noise is any undesirable effect that increases variability
Conduct extensive Problem Analyses
  Employ inter-disciplinary teams
  Perform designed experimental analyses
  Evaluate experiments using ANOVA and signal-to-noise techniques

Defining the Taguchi Approach

The Point Then Is To Produce Processes Or Products That Are ROBUST AGAINST NOISES
Don't spend the money to eliminate all noise; build designs (product and process) that can perform as desired, with low variability, in the presence of noise!

WE SAY:
ROBUSTNESS = HIGH QUALITY

Defining the Taguchi Approach

Noise Factors Cause Functional Variation. They Fall Into Three Classes:
1. Outer Noise: Environmental Conditions
2. Inner Noise: Lifetime Deterioration
3. Between-Product Noise: Piece-To-Piece Variation

Taguchi Method is Step-by-Step:

Defining the Taguchi Approach

TO RELIABLY MEET OUR DESIGN GOALS MEANS DESIGNING QUALITY IN!

We find that Taguchi considered THREE LEVELS OF DESIGN:
  Level 1: SYSTEM DESIGN
  Level 2: PARAMETER DESIGN
  Level 3: TOLERANCE DESIGN

Defining the Taguchi Approach

SYSTEM DESIGN:
  All About Innovation: New Ideas, Techniques, Philosophies
  Application Of Science And Engineering Knowledge
  Includes Selection Of:
    Materials
    Processes
    Tentative Parameter Values

Defining the Taguchi Approach


Parameter Design:
Tests For Levels Of Parameter
Values
Selects "Best Levels" For Operating
Parameters to be Least Sensitive to
Noises
Develops Processes Or Products
That Are Robust
A Key Step To Increasing Quality
Without Increased Cost

Defining the Taguchi Approach


Tolerance Design:
A "Last Resort" Improvement Step
Identifies Parameters Having the
greatest Influence On Output
Variation
Tightens Tolerances On These
Parameters
Typically Means Increases In
Cost

Selecting Parameters for Study and Control

Select The Quality Characteristic
Define The Measurement Technique
Enumerate, Consider, And Select The Independent Variables And Interactions, via:
  Brainstorming
  Shainin's technique, where they are determined by looking at the products
  FMEA: failure mode and effects analysis

Preliminary Steps in Improvement Studies

To Adequately Address The Problem At Hand We Must:
1. Understand Its Relationship With The Goals We Are Trying To Achieve
2. Explore/Review Past Performance and compare to desired Solutions
3. Prepare An 80/20 Or Pareto Chart Of These Past Events (see the sketch below)
4. Develop A "Process Control" Chart -- This Helps To Better See The Relationship between Potential Control And Noise Factors
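To make step 3 concrete, here is a minimal Python sketch of building a Pareto (80/20) table from historical defect counts; the defect categories and counts are invented purely for illustration.

```python
# Minimal Pareto (80/20) table: sort defect categories by count and
# accumulate their percentage contribution. Categories/counts are illustrative.
defect_counts = {
    "surface scratch": 120,
    "wrong dimension": 45,
    "porosity": 230,
    "misalignment": 30,
    "burrs": 75,
}

total = sum(defect_counts.values())
cumulative = 0.0
print(f"{'Category':<18}{'Count':>7}{'% of total':>12}{'Cumulative %':>14}")
for category, count in sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True):
    share = 100.0 * count / total
    cumulative += share
    print(f"{category:<18}{count:>7}{share:>11.1f}%{cumulative:>13.1f}%")
```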

A Wise Person Can Say: "A Problem Well Defined Is Already Nearly Solved!!"

Going Down the Improvement Road

Start By Generating The Problem Candidates List:
  Brainstorm The Product Or Process
  Develop Cause-And-Effect (Ishikawa) Diagrams
  Use Process Flow Charts To Stimulate Ideas
  Develop Pareto Charts For Quality Problems

Developing A Cause-and-Effect Diagram:
1. Construct A Straight Horizontal Line (Right Facing)
2. Write The Quality Characteristic At The Right
3. Draw 45° Lines From The Main Horizontal (4 Or 5) For The Major Categories: Manpower, Materials, Machines, Methods And Environment
4. Add Possible Causes By Connecting Horizontal Lines To The 45° "Main Cause" Rays
5. Add More Detailed Potential Causes Using Angled Rays To The Horizontal Possible-Cause Lines

Generic Fishbone C&E Diagram

Building the Experiment: Working From a Cause & Effect Diagram

Designing A Useful Experiment

Taguchi methods use a "cookbook" approach!! Experiments are built for selected factors on the C&E Diagram
Selection is from a discrete set of Orthogonal Arrays
Note: an orthogonal array (OA) is a special fractional factorial design that allows study of main factors and 2-way interactions
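For a sense of what these cookbook arrays look like, here is a small Python sketch of the standard L4(2³) orthogonal array with three illustrative factors assigned to its columns; the factor names and levels are placeholders, not taken from any study in these slides.

```python
# The standard L4 orthogonal array: 4 runs, three 2-level columns.
# Every pair of columns contains each (level, level) combination exactly once.
L4 = [
    [1, 1, 1],
    [1, 2, 2],
    [2, 1, 2],
    [2, 2, 1],
]

# Illustrative factor-to-column assignment (placeholder names and levels).
factors = {
    "temperature": {1: "low", 2: "high"},
    "pressure":    {1: "low", 2: "high"},
    "time":        {1: "short", 2: "long"},
}

for run, row in enumerate(L4, start=1):
    settings = {name: levels[code]
                for (name, levels), code in zip(factors.items(), row)}
    print(f"Run {run}: {settings}")
```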

T.M. Summary
Taguchi methods (TM) are product or process improvement techniques that use DOE methods for improvements
A set of "cookbook" designs is available, and they can be modified to build a rich set of studies (beyond what we have seen in MP labs!)
TM requires a commitment to complete studies and the discipline to continue in the face of setbacks (as do all quality improvement methods!)

Simplified DOE
Shainin Tools: these are a series of steps to logically identify the root causes of variation
These tools are simple to implement, statistically powerful and practical
The initial step is to sample product (over time) and examine the sample lots for variability to identify causative factors; this step is called the multi-vari chart approach
Shainin refers to root cause factors as the Red X, Pink X, and Pink-Pink X causes

Shainin's Experimental Approaches to Quality Variability Control:

Shainin Ideas: exploring further

Red X: the primary cause of variation
Pink X: the secondary causes of variation
Pink-Pink X: significant but minor causes of variation (a factor that still must be controlled!)
Any other factors should be substituted by lower cost solutions (wider tolerance, cheaper material, etc.)

Basis of Shainin's Quality Improvement Approaches

As Shainin said: "Don't ask the engineers, they don't know; ask the parts"
  Contrast with the Brainstorming approach of the Taguchi Method
Multi-Vari is designed to identify the likely home of the Red X factors, not necessarily the factors themselves
Shainin suggests that we look into three source-of-variation regimes:
  Positional
  Cyclical
  Temporal

Does the mean shift in time or between products, or is the product (alone) showing the variability?

Positional Variations:
These are variations within a given unit (of production)
  Like porosity in castings or cracks
  Or across a unit with many parts, like a transmission, turbine or circuit board
Could be variations by location in batch loading processes
  Cavity-to-cavity variation in plastic injection molding, etc.
  Various tele-marketers at a fund raiser
Variation from machine-to-machine, person-to-person or plant-to-plant

Cyclical Variation
Variation between consecutive
units drawn from a process
(consider calls on a software
help line)
Variation AMONG groups of
units
Batch-to-batch variations
Lot-to-lot variations

Temporal Variations

Variations from hour-to-hour


Variation shift-to-shift
Variations from day-to-day
Variation from week-to-week

Components Search: the prerequisites

The technique is applicable (primarily) in assembly operations where good units and bad units are found
Performance (output) must be measurable and repeatable
Units must be capable of disassembly and reassembly without significant change in original performance
There must be at least 2 assemblies or units: one good, one bad

The procedure:
Select the good and bad unit
Determine the quantitative parameter by which to measure the units
Disassemble the good unit, then reassemble and measure it again. Disassemble, reassemble and then measure the bad unit again. If the difference D between good and bad exceeds the d difference (within units) by 5:1, a significant and repeatable difference between good and bad units is established
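A minimal sketch of the D:d ≥ 5:1 check described above, assuming one disassembly/reassembly repeat for each unit; the function name and the sample readings are illustrative only.

```python
def significant_difference(good_readings, bad_readings, ratio=5.0):
    """Shainin-style D:d test sketch.

    D = difference between the good and bad unit medians;
    d = average within-unit spread across disassembly/reassembly repeats.
    Returns True when D exceeds d by at least `ratio` (5:1 by default).
    """
    from statistics import median
    D = abs(median(good_readings) - median(bad_readings))
    d = ((max(good_readings) - min(good_readings)) +
         (max(bad_readings) - min(bad_readings))) / 2
    return d == 0 or D / d >= ratio

# Illustrative readings: original measurement plus one reassembly repeat.
good = [10.2, 10.4]
bad = [14.8, 15.1]
print(significant_difference(good, bad))   # True: D = 4.65, d = 0.25
```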

Procedure (cont.)
Based on engineering judgment, rank the likely component problems, within a unit, in descending order of perceived importance.
Switch the top-ranked component from the good unit to the bad unit or assembly, with the corresponding component in the bad assembly going to the good assembly. Measure the 2 (reassembled) units.
  If there is no change (the good unit stays good, the bad stays bad), the top guessed component (A) is unimportant; go on to component B
  If there is a partial change in the two measurements, A is not the only important variable. A could be in a Pink X family. Go on to component B
  If there is a complete reversal in the outputs of the assemblies, A could be in the Red X family. There is no further need for components search.

Procedure (cont.)
Regardless of which of the three outcomes above is observed, restore component A to the original units to assure original conditions are repeated. Then, repeat the previous 2 steps for the next most important components: B, C, D, etc., if each swap leads to no or only partial change
Ultimately, the Red X family will be identified (on complete reversal), or two or more Pink X or pale Pink X families if only partial reversals are observed

Procedure (cont.)
With the important variables identified, a capping run, with the variables banded together as good or bad assemblies, must be used to verify their importance
Finally, a factorial matrix, using data generated during the search, is drawn to determine, quantitatively, the main effects and interaction effects.

Paired Comparisons
This is a technique like components search, but used when products do not lend themselves to disassembly (perhaps it is a component in a component search!)
Requires that there be several Good and Bad units that can be compared
Requires that a suitable parameter can be identified to distinguish Good from Bad

Steps in Paired Comparison

1. Randomly select one Good and one Bad unit; call it pair one
2. Observe the differences between the 2 units; these can be visual, dimensional, electrical, mechanical, chemical, etc. Observe using appropriate means (eye, optical or electron microscope, X-ray, spectrographic, tests-to-failure, etc.)
3. Select a 2nd pair; observe and note as with pair 1.
4. Repeat with additional pairs until a pattern of repeatability is observed between Goods & Bads

Reviewing:
The previous three methods are ones that follow directly from Shainin's "talk to the animals" (products) approach
In each, before we begin actively specifying the DOE parameters, we collect as much information as we can from good or bad products
As stated by one user: "The product solution was sought for over 18 months; we talked to engineers & designers; we talked to engineering managers, even product suppliers, all without a successful solution, but we never talked to the parts. With the component search technique we identified the problem in just 3 days."

Taking the Next Step: Variables Search

The objective is to:
  Pinpoint the Red X, Pink X and one to three (more) critical interacting variables
  Recognize that it's possible the Red X is due to strong interactions between two or more variables
  Finally, we are still trying to separate the important variables from the unimportant ones
Variables search is a way to get statistically significant results without executing a large number of experimental runs (achieving knowledge at reduced cost)
It has been shown that this binary comparison technique (on 5 to 15 variables) can be successful in 20, 22, 24 or 26 runs vs. 256, 512, 1024, etc. runs using traditional DOE
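One illustrative way to account for those run counts, as a small Python sketch: it assumes Stage 1 takes four runs (the all-best and all-worst combinations, each replicated), Stage 2 takes two swap runs per factor, and two capping runs follow. This accounting is an assumption for illustration, not a rule stated in the slides.

```python
# Rough run-count comparison: an illustrative accounting of variables search
# (4 Stage 1 runs + 2 runs per factor + 2 capping runs) vs. a full factorial.
for k in range(7, 11):                      # 7 to 10 candidate factors
    variables_search = 4 + 2 * k + 2
    full_factorial = 2 ** k
    print(f"{k} factors: ~{variables_search} runs vs. {full_factorial} full-factorial runs")
```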

Variables Search is a 2-stage process:

STAGE 1:
1. List the important input variables as chosen by engineering judgment (in descending order of ability to influence output)
2. Assign 2 levels to each factor: a best and a worst level (within reasonable bounds)
3. Run 2 experiments, one with all factors at best levels, the second with all factors at worst levels. Run two replication sets
4. Apply the D:d ≥ 5:1 rule (as above)
5. If the 5:1 ratio is exceeded, the Red X is captured in the factor set tested.

Stage 1 (cont.):
6. If the ratio is less than 5:1, the right factors were not chosen, or 1 or more factors have been reversed between best & worst levels. Disappointing, but not fatal!
   a. If the wrong factors were chosen, in the opinion of the design team, decide on new factors and rerun Stage 1
   b. If the team believes it has the correct factors included, but some have reversed levels, run B vs. C tests on each suspicious factor to see if factor levels are in fact reversed
   c. One could try the selected factors (4 at a time) using full factorial experiments; this could be prone to failure too if interacting factors are separated during testing!

Moving on to Stage 2:
1. Run an experiment with A_W (A at its worst level) and the rest of the factors at best levels (R_B)
   a) If there is no change from the best results of Stage 1, step 3, factor A is in fact unimportant
   b) If there is a partial change from the best results toward the worst results, A is not the only important factor. A could be a Pink X
   c) If there is a complete reversal of the best results of Stage 1, step 3, toward the worst, A is the Red X
2. Run a second test with A_B and R_W
   a) If there is no change from the worst results of Stage 1, the top factor A is further confirmed as unimportant
   b) If there is a partial change in the worst results of Stage 1 toward the best results, A is further confirmed as a possible Pink X factor
   c) If there is a complete reversal (the best results of Stage 1 are approximated), A is reconfirmed as the Red X

Continuing Stage 2:
3. Perform the same component-search-style swap of steps 1 & 2 for the rest of the factors to separate important from unimportant factors
4. If no single Red X factor, but two or three Pink X factors, are found, perform a capping or validation experiment with the Pink Xs at their best levels (remaining factors at their worst levels). The results should approximate the best results of Step 3, Stage 1.
5. Run a second capping experiment with the Pinks at worst level, the rest at best level; this should approximate the worst results in Step 3, Stage 1.

Variables Search Example: Press Brake Operation

A press brake was showing high variability with poor Cpk
The press brake was viewed as a "black magic" operation that worked sometimes, then went bad for no reason
Causes of the operational variability were hotly debated; issues included:
  Raw sheet metal
    Thickness
    Hardness

Press Brake Factors (some of which are difficult or impossible to control)

The company investigated new press brakes but observed no realistic and reliable improvements
  Even high-cost automated brakes sometimes produced poor results!

A Variables Search was Performed

The goal was to consistently achieve a ±0.005 tolerance (or closer!)
6 factors were chosen (B = best level, W = worst level):
  A. Punch/Die Alignment: B: Aligned, W: Not Specially Aligned
  B. Metal Thickness: B: Thick, W: Thin
  C. Metal Hardness: B: Hard, W: Soft
  D. Metal Bow: B: Flat, W: Bowed
  E. Ram Stroke: B: Coin Form, W: Air Form
  F. Holding Material: B: Level, W: At an Angle
Results are reported in Process Widths, which is twice the tolerance, in 0.001 units

Results:
STAGE 1 (Process Width, ×0.001)

            All Best   All Worst
  Initial       4          47
  Rep 1         4          61

D = 50; d = 7; D:d is about 7:1 (> 5:1), so a significant, repeatable difference exists; the Red X (or Pink Xs) is captured in the factor set

Continuing to Stage 2

Test   Combination    Result (×0.001)   Conclusion
  1    A_W R_B
  2    A_B R_W             102          A: not important
  3    B_W R_B
  4    B_B R_W              47          B: not important
  5    C_W R_B
  6    C_B R_W              72          C: not important
  7    D_W R_B              23
  8    D_B R_W              30          D: Pink X, interaction w/ other factor(s)
  9    E_W R_B
 10    E_B R_W              20          E: ???
 11    F_W R_B              73
 12    F_B R_W              18          F: probably the Red X + interaction
Cap    D_W F_W R_B          70

Factorial Analysis: D & F (Process Widths, ×0.001)

  D Best,  F Best:   4, 4, 3, 5, 7, 7, 4           Avg: 4.9
  D Worst, F Best:   23, 18                        Avg: 20.5
  D Best,  F Worst:  73, 30                        Avg: 51.5
  D Worst, F Worst:  47, 102, 61, 47, 72, 70, 20   Avg: 57.8

  Row Sums:      F Best: 4.9 + 20.5 = 25.4;   F Worst: 51.5 + 57.8 = 109.3
  Column Sums:   D Best: 4.9 + 51.5 = 56.4;   D Worst: 20.5 + 57.8 = 78.3
  Diagonal Sums: 20.5 + 51.5 = 72;   4.9 + 57.8 = 62.7

Factorial Analysis:

  D main effect  = [(20.5 + 57.8) - (4.9 + 51.5)] / 2 = (78.3 - 56.4) / 2 = 10.95

  F main effect  = [(51.5 + 57.8) - (4.9 + 20.5)] / 2 = (109.3 - 25.4) / 2 = 41.95

  DF interaction = (Diagonal Sum 1 - Diagonal Sum 2) / 2 = (72 - 62.7) / 2 = 4.65 ≈ 4.7
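As a cross-check of the arithmetic above, here is a small Python sketch computing the two main effects and the interaction from the four cell averages; the function is a generic 2x2 effects calculation, not a Shainin-specific routine.

```python
def two_by_two_effects(bb, wb, bw, ww):
    """Main effects and interaction from a 2x2 table of cell averages.

    bb = (D best, F best), wb = (D worst, F best),
    bw = (D best, F worst), ww = (D worst, F worst).
    """
    d_effect = ((wb + ww) - (bb + bw)) / 2        # D worst column minus D best column
    f_effect = ((bw + ww) - (bb + wb)) / 2        # F worst row minus F best row
    interaction = ((wb + bw) - (bb + ww)) / 2     # difference of the diagonals
    return d_effect, f_effect, interaction

# Cell averages from the press brake study (process width, x0.001)
d_eff, f_eff, df_int = two_by_two_effects(bb=4.9, wb=20.5, bw=51.5, ww=57.8)
print(d_eff, f_eff, df_int)   # about 10.95, 41.95, 4.65
```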

Factorial Analysis:
Factor F is the Red X: it has a 41.9 main effect on the process spread
Factor D is a Pink X, with a 10.9 main effect on the process spread
Their interaction is minor, with a contribution of about 4.7 to the process spread
With D & F controlled, using a holding fixture to assure level holding and a reduction in bowing (but with the hardness and thickness tolerances opened up, leading to reduced raw metal costs), the process spread was reduced to 0.004 (±0.002), much better than the original target of ±0.005, with an observed Cpk of 2.5!

Introduction to Failure Mode and Effects Analysis (FMEA)

A tool used to systematically evaluate a product, process, or system
Developed in the 1950s by the US Navy, for use with flight control systems
Today it's used in several industries, in many applications:
  products
  processes
  equipment
  software
  service
Conducted on new or existing products/processes
This presentation focuses on FMEA for an existing process

Benefits of FMEA
Collects all potential issues into one document
  Can serve as a troubleshooting guide
  Is a valuable resource for new employees at the process
Provides an analytical assessment of process risk
  Prioritizes potential problems at the process
  Total process risk can be summarized, and compared to other processes, to better allocate resources
Serves as a baseline for future improvement at the process
  Actions resulting in improvements can be documented
  Personnel responsible for improvements can gain recognition
  Controls can be effectively implemented
  Example: Horizontal Bond Process: FMs (failure modes) improved by 40%; causes improved by 37%. Overall risk was cut in half in about 3 months.

FMEA Development
Assemble a team of people familiar with the process
Brainstorm process/product related defects (Failure Modes)
List Effects, Causes, and Current Controls for each failure mode
Assign ratings (1-10) for Severity, Occurrence, and Detection for each failure mode
  1 is best, 10 is worst
Determine the Risk Priority Number (RPN) for each failure mode
  Calculated as Severity x Occurrence x Detection
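A minimal sketch of the RPN calculation and prioritization described above; the failure modes and ratings are invented purely for illustration.

```python
# Compute Risk Priority Numbers (Severity x Occurrence x Detection, each 1-10)
# and rank failure modes; the entries below are illustrative only.
failure_modes = [
    {"mode": "cold solder joint", "severity": 7, "occurrence": 5, "detection": 6},
    {"mode": "missing fastener",  "severity": 9, "occurrence": 2, "detection": 3},
    {"mode": "scratched housing", "severity": 3, "occurrence": 6, "detection": 2},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Work the highest-RPN failure modes first
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]}: RPN = {fm["rpn"]}')
```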

Typical FMEA Evaluation Sheet

Capturing The Essence of FMEA

The FMEA is a tool to systematically evaluate a process or product
Use this methodology to:
  Prioritize which processes/parameters/characteristics to work on (Plan)
  Take action to improve the process (Do)
  Implement controls to verify/validate the process (Check)
  Update FMEA scores, and start focusing on the next highest FM or cause (Act, then Plan again)
