
About the Authors

Daniel Sloan is an internationally recognized Six Sigma Master Black Belt and an ASQ certified Black Belt. His 16 years of experience have been distinguished by Six Sigma seminars in Mexico, Uruguay, Brazil, Australia, and 47 of the United States. McGraw Hill and Quality Press published five of his 7 books. As a Senior Vice President of Applied Business Science for a $500 million company, he led their Six Sigma initiative. With “factory floor” Six Sigma successes ranging from non-woven fabrics, extruded products, medical equipment, aerospace engineering, automotive parts, to Internet router production and health care, Daniel has a proven track record in helping companies produce bottom line results.

Russell Boyles earned his PhD in Statistics at the University of California, Davis. He subsequently spent two years in the Applied Mathematics Group at Lawrence Livermore National Laboratory, two years as Director of Statistical Analysis for NERCO Minerals Company, and eight years as Statistical Process Control Manager at Precision Castparts Corporation. As a trainer and a consultant, Russell specializes in Six Sigma Master Black Belt and Black Belt certification courses, Design of Experiments, Gage Studies, Reliability and Statistical Process Control. A few of his recent papers have appeared in the ASQ publications Technometrics and Journal of Quality Technology.

Evidence-based Decision Services and Products


We are the first and best provider of evidence-based decision services in the world.
We help clients rapidly use the evidence in their raw data to dramatically improve bottom line business results.

Six Sigma Services

Master Black Belt, Black Belt, Green Belt, Champion, and Senior Executive certification training for all industries including manufacturing, financial services, and health care.

Consortium Six Sigma events for small companies who wish to pool resources.

Custom designed training events and multi-media, evidence-based Six Sigma materials.

Evidence-based Decision Support

Data mining, strategic Information Systems design.

Bottom-line business results project coaching.

Consulting support to private industry, government and academic institutions that are implementing evidence-based decision systems.

Custom designed training events and multi-media, evidence-based education and training materials.

For more information, visit http://www.danielsloan.com or call:

Sloan Consulting, Seattle, WA (206)-525-7858


M. Daniel Sloan, author and owner of the copyright for this work, has
licensed it under the Creative Commons Attribution Non-Commercial Non-
Derivative (by-nc-nd) License. http://www.danielsloan.com is the legal file
download location. To view this license visit:

http://creativecommons.org/licenses/by-nc-nd/2.5/
http://creativecommons.org/licenses/by-nc-nd/2.5/legalcode

Or send a letter to:

Corporate Headquarters
Creative Commons
543 Howard Street
5th Floor
San Francisco, CA 94105-3013
United States
Profit Signals
How Evidence-based Decisions
Power Six Sigma Breakthroughs

By

M. Daniel Sloan and Russell A. Boyles, PhD

Sloan Consulting, LLC


Seattle, Washington

© M. Daniel Sloan and Russell A. Boyles, All Rights Reserved, 2003


Profit Signals, How Evidence-Based Decisions Power Six Sigma Breakthroughs.
M. Daniel Sloan and Russell A. Boyles

Library of Congress, Cataloging-in-Publication Data


Sloan, M. Daniel, 1950-
Boyles, Russell A., 1951-
Profit signals : how evidence-based decisions power six sigma breakthroughs / M. Daniel
Sloan and Russell A. Boyles.
Includes bibliographical references and index.
1. Six Sigma—Quality control—Statistical models. 2. Medical care—Quality assurance—
Statistical models. 3. Cost control—Statistical models—Mathematical models.

© 2003 by Evidence-Based Decisions, Inc. http://www.evidence-based-decisions.com

All rights reserved. No part of this book may be reproduced in any form or by any means,
electronic, mechanical; photocopying, recording, or otherwise, without the prior written
permission of the publisher. Your support of author’s rights is appreciated. For permissions the
authors can be contacted directly.

Sloan Consulting http://www.danielsloan.com/


206-525-7858
10035 46th AVE NE, Seattle WA 98125

Trademark Acknowledgements

Profit Signals® and the phrase “Vector Analysis Applied to a Data Matrix®” and the Profit
Signals tetrahedron on the book’s cover are registered trademarks of Sloan Consulting, LLC.
Six Sigma® is a registered trademark and service mark of Motorola, Incorporated. Sculpey
Clay® is a registered trademark of Polyform Products Co. Excel® is a registered trademark of
Microsoft. Other copyright notices are listed in the production notes at the end of the book.

Illustrations: Cover, Robin Hing. Tables and illustrations, Robin Hing, Russell A. Boyles, M.
Daniel Sloan, John Pendleton, Austin Sloan, and Alan Tomko. Netter illustrations used with
permission from Icon Learning Systems, a division of MediMedia USA, Inc. All rights reserved.

The book’s design and layout, using Adobe InDesign 2.0.2, were completed by M. Daniel
Sloan. Printed in the United States of America.



Many of the most useful designs are extremely simple.
Ronald Aylmer Fisher

How much variation should we leave to chance?


Walter A. Shewhart

Table of Contents

Premise ................................................................... 9
The Parable of the Paper Bags ........................... 14
The Dollar Value of Evidence............................ 16
Six Sigma ......................................................... 18
How to Read This Book .................................. 20
Endnotes .................................................................24

Chapter 1—The Five-Minute PhD ............... 27


Start Your Stopwatch Now ............................... 28
Business Art and Science ................................... 30
Profit Signals ..................................................... 37
Data Recycling.................................................. 43
The Full Circle of Data Discovery..................... 44
The New Management Equation ...................... 44
Closing Arguments ........................................... 46
Endnotes ........................................................... 47

Chapter 2—Standards of Evidence ............... 49


Poetry versus Science......................................... 50
“Scientific” Management................................... 51
Cost Accounting Variance Analysis ................... 53
Accounting versus Science ................................. 55
Delusions and Bamboozles ................................ 56
Vector Analysis 101 ........................................... 57
Degrees of Freedom........................................... 65
Bar Chart Bamboozles ..................................... 67
The Game is Afoot............................................ 74
Spreadsheet versus Data Matrix......................... 79

P-values, Profit Signals, Confidence Levels and Standards of Evidence ................ 81
Closing Arguments ........................................... 83
Endnotes ........................................................... 84

Chapter 3—Evidence-based Six Sigma ........ 87


Six Sigma (6σ) Basics ....................................... 89
The Six Sigma Profit Strategy .......................... 90
Lucrative Project Results Map ...........................94
Define, Measure, Analyze, Improve, Control ... 96
Lucrative Project Selection ............................... 98
Financial Modeling and Simulation ............... 100
Compare and Contrast Analysis ..................... 104
Process Maps ................................................. 106
The Costs of Poor Quality ...........................110
Process Capability ..............................................113
Endnotes ..............................................................115

Chapter 4—Case Studies ...............................117


Customer Service – Governmental Agency.......119
Days in Accounts Receivable ........................... 122
Breaking the Time Barrier .............................. 128
“Beating Heart” Bypass Grafts ...................... 135
The Daily Grind ........................................... 142
“Die Tuning” for Vinyl Extrusion ....................145
Endnotes ..........................................................149

Chapter 5—Using Profit Signals .................151


A Better Way to Look At Numbers ..................152
Corrugated Copters..........................................153
Testing the Current Way of Doing Things .......156
Overcoming Obstacles ................................... 162
Comparing Two Ways of Doing Things.......... 163
Comparing Three Ways of Doing Things ........167
Comparing Eight Ways of Doing Things .........169
Comparing 256 Ways of Doing Things............172
Chapter Homework..........................................174
Closing Arguments ..........................................175
Endnotes ..........................................................175


Chapter 6—Predicting Profits ..................... 177


Fingerprint Evidence ........................................178
Three Wishes ...................................................179
Prediction Practice ...........................................183
Predicting Real Flight Times........................... 188
Closing Arguments ..........................................191
Endnotes ..........................................................191

Chapter 7—Sustaining Results .....................193


Evaluating Practices and Profits....................... 194
Process Improvement Simulation..................... 199
Monitoring Practices and Profits ..................... 205
Taking Action ..................................................211
Closing Arguments ..........................................214
Endnotes ..........................................................214

Chapter 8 —The Three Rs ............................217


Six Sigma’s Hidden Factory ..............................218
Our Proposal................................................... 222
Endnotes ......................................................... 224

Appendices ....................................................... 225


I. Glossary of Terms: Data Matrix,
Vector Analysis And
Evidence-based Decisions ......................... 225
II. The Business Bookshelf ............................. 227
III. Evidence-Based Decisions, Inc.
Six Sigma Black Belt/ Expert 16 Class
Curriculum Outline ................................. 231
IV. Profit Signals Production Notes ................. 252

Index ...................................................................255



Premise

Profit Signals is a guide for using evidence to make
better, more profitable business decisions. Face
value judgments, opinions, gut feelings, suspicions,
circumstance, and superstitions are not pathways to evidence.
Measurements are. This book will show you how to turn
measurements into evidence and evidence into profit.

Measurements become evidence when they are analyzed
correctly. Since 1920, the correct analysis has consisted of
a vector analysis applied to a data matrix. Very few people
know this. With this book, we aim to transform the arcane
mysteries of vector analysis into common knowledge. Vector
analysis is a must-have, fundamental job skill. Every person in
every organization can use this tool to make more money.

Vector analysis is a vast, audacious and empirically true
Generalization. Sir Ronald Fisher first explained it at the
beginning of the 1920s.1 Evidence is the foundation of Profit
Signals and vector analysis is the foundation for evidence.

The word ‘generalization’ usually denotes a thoughtless, broad
assertion with no basis in fact. By contrast, a Generalization
is a verifiable law of the universe. Thus one word has two,
opposite meanings.2 Unlike a generalization, a Generalization
delivers valid conclusions and accurate predictions.

The laws of gravity are a Generalization. Gravity is a
physical constant of our universe. The laws of motion are a
Generalization. The laws of Variation, the chance fluctuations
or Noise that attend every measurement, are a Generalization.
They are Law. Just as you can measure gravity (Newton and
the apple), Generalizations like statistical variation can be
tested and validated through a process of experimentation,
observation, and analysis.

Evidence-based decisions focus on the three vectors on the
right side of the following vector analysis equation:

Raw Data = Data Average + Profit Signal + Noise.

A vector is a set of numbers that is treated as a single entity.
A vector defines magnitude and direction. A vector is best
visualized as an arrow connecting one point in space to
another. The vector analysis equation is much easier to
understand when it is presented as a picture. The six edges of
the tetrahedron shown in Figure 1 represent the six different
ways of combining the three vectors on the right side of the
equation. We call this stable geometric figure “the cornerstone
of evidence.”

Figure 1 A vector analysis requires
a minimum of three Generalized
dimensions. An evidence-based
decision evaluates three key vectors:
1) a Data Average, 2) a Profit Signal,
and 3) Noise.
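
To make the decomposition concrete, here is a minimal sketch in Python, using made-up numbers rather than any data from this book, showing how a raw-data vector for two groups of measurements splits into a Data Average vector, a Profit Signal vector and a Noise vector that add back to the raw data exactly.

```python
import numpy as np

# Hypothetical raw data: two measurements made the "old way" and
# two made a "new way" (illustrative numbers only, not book data).
raw = np.array([3.0, 5.0, 7.0, 9.0])
group = np.array(["old", "old", "new", "new"])

grand_mean = raw.mean()                       # 6.0
data_average = np.full_like(raw, grand_mean)  # the Data Average vector

# Profit Signal: how far each group's mean sits from the grand mean
group_means = np.array([raw[group == g].mean() for g in group])
profit_signal = group_means - grand_mean      # [-2, -2,  2,  2]

# Noise: whatever is left over within each group
noise = raw - data_average - profit_signal    # [-1,  1, -1,  1]

# The three vectors add back to the raw data exactly
assert np.allclose(data_average + profit_signal + noise, raw)
```

The squared lengths of the Profit Signal and Noise vectors are the quantities compared later in this Premise when the strength of evidence is discussed.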

The cornerstone of evidence is even easier to grasp in
its physical form. In Profit Signals you will learn how to
build one with bamboo skewers and Sculpey Clay.® The
construction process is fun and informative.

The keys to making better, more profitable business decisions
are (1) identifying and (2) interpreting the profit signals in
your raw data. Profit signals are the most important element
in any data-driven business decision. Vector analysis is the
only way to identify profit signals. Knowing how to find and

graph your profit signals are extraordinary money-making
skills.

Relatively few people are aware of vector analysis and its
universal relevance. Many do not understand the fundamental
difference between a data matrix and a spreadsheet. Profit
Signals fills in those educational gaps. With every chapter you
will understand more clearly what profit signals are. You will
soon know why they are invaluable.

The break-even school of thought has dominated business
decisions since 1918. That was the year G. Charter Harrison,
a London accountant employed by Price, Waterhouse &
Company, published “Principles of a Cost System Based on
Standards” in Industrial Engineering magazine. 3, 4, 5 Since
then Harrison’s accounting principles and procedures have
become universally accepted. They are known today as cost-
accounting variance analysis.6

Figure 2 The break-even school of thought was founded by G. Charter Harrison in 1918. It is inherently one-dimensional. [Chart: averaged expenses and averaged income in dollars plotted against product or service volume.]

For example, Figure 2 illustrates traditional break-even point
analysis. It assumes that average expenses and average income
are perfect linear functions of volume. The lines cross at the
“break-even point.”
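
For readers who want to see the arithmetic behind the crossing lines, here is a minimal sketch in Python with hypothetical cost and price figures (not taken from the book); the break-even volume is simply the fixed cost divided by the margin each unit contributes.

```python
# Hypothetical break-even calculation (illustrative figures only).
fixed_cost = 50_000.0         # averaged expenses that do not vary with volume
variable_cost_per_unit = 8.0  # averaged expense added by each unit sold
price_per_unit = 12.0         # averaged income from each unit sold

# Volume at which the averaged income and expense lines cross
break_even_volume = fixed_cost / (price_per_unit - variable_cost_per_unit)
print(break_even_volume)      # 12500.0 units
```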

Break-even analysis, a form of “cost-accounting variance analysis,” is
inherently one-dimensional. It is based on the difference
between the two lines—the differences between average
expense and average income at various volumes. Technically
speaking, these differences are called predicted values. As
shown in Figure 3, these differences form the vector at the
back of the tetrahedron. It is but one of the six vectors required
for a complete, three-dimensional vector analysis.

Figure 3 The differences between
average income and average
expense form one vector in the set
of six required for a complete vector
analysis.

Establishing performance standards and evaluating actual
results in relation to them were important steps forward.
Unfortunately, there was no cross-pollination between
Harrison’s work in London and Sir Ronald Fisher’s
simultaneous 1918 development of vector analysis in rural
England. Instead, cost-accounting variance analysis evolved
as a collection of one-dimensional methods. In break-even
analysis, predicted values are used in isolation. In other
accounting cases, the analysis is based solely on the raw data.

Because the methods of cost-accounting variance analysis are
inherently one-dimensional, it is impossible for any of them
to produce a correct analysis. For example, the accuracy of
any predicted value depends on the length of the noise vector.
Equally important, the strength of evidence supporting any
conclusion depends on a ratio involving the profit signal and
noise vectors. This F ratio, as it is called, compares the length
of the profit signal vector to the length of the noise vector.
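
Here is a minimal sketch of that comparison, assuming the standard analysis-of-variance convention in which each squared vector length is divided by its degrees of freedom before the ratio is taken; the example vectors are the hypothetical ones from the earlier sketch, not data from the book.

```python
import numpy as np

def f_ratio(profit_signal, noise, signal_df, noise_df):
    """Compare the squared length of the Profit Signal vector to the
    squared length of the Noise vector, each scaled by its degrees of
    freedom, as in a standard analysis of variance."""
    signal_length_sq = np.sum(np.asarray(profit_signal, dtype=float) ** 2)
    noise_length_sq = np.sum(np.asarray(noise, dtype=float) ** 2)
    return (signal_length_sq / signal_df) / (noise_length_sq / noise_df)

# Hypothetical two-group vectors: signal [-2, -2, 2, 2], noise [-1, 1, -1, 1]
print(f_ratio([-2, -2, 2, 2], [-1, 1, -1, 1], signal_df=1, noise_df=2))  # 8.0
```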

Modern textbooks teach cost-accounting variance analysis as a
way to identify causes of profits and losses.7 According to one
Top-10 business-school accounting text, a cost accounting
variance analysis can “…decompose the total difference
between planned and actual performance into elements that
can be assigned to individual responsibility centers.”8

This sounds like a vector analysis. It is not. Cost-accounting
variance analysis is just arithmetic. It is a set of one-
dimensional analysis methods incapable of distinguishing
profit signals from noise.

21st Century teaching methods and computer graphics place
vector analysis in its rightful position in business decision-
making. In the past, we had to wade through volumes of
bewildering algebra to analyze data. Few of us ever saw the
cornerstone of evidence. Few of us were able to take evidence to
our bottom line: “How can I personally use vector analysis to
solve my problems and make my business more profitable?”

The one formula we will use in this book is the Pythagorean
Theorem. (If you now have a frown on your face, you
probably learned about this idea in your favorite high school
class, geometry.) This equation defines the right triangles
comprising the cornerstone of evidence:

The square of the long side of a right triangle equals
the sum of the squares of the other two sides.

In other words, c² = a² + b². The vast majority of evidence-
based decisions are based on this simple formula. We call it the
New Management Equation. We trust you will too.
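
In vector terms, the same right-triangle relationship says that the squared length of the variation vector (raw data minus data average) equals the squared length of the profit signal plus the squared length of the noise. With the made-up numbers from the earlier sketch:

(−3)² + (−1)² + 1² + 3² = 20  (variation, c²)
(−2)² + (−2)² + 2² + 2² = 16  (profit signal, a²)
(−1)² + 1² + (−1)² + 1² = 4  (noise, b²)

and indeed 20 = 16 + 4, so c² = a² + b².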

The president of a $500 million company put it this way: “In
old-school cost accounting we determine the variance and
there the analysis stops. With the New Management Equation
we determine the variance and there the analysis begins.” This
vector analysis is represented by the forward right triangle in
Figure 4 .

Vector analysis is transparent. Transparency implies the full
disclosure of all elements. Transparency is indispensable to
evidence-based decisions. It is a desirable accounting quality.

Cost accounting variance analysis lacks transparency. It hasn’t
changed one whit since the day it was born during the 20th
Century’s “scientific management” craze. Like the whalebone
corsets of that era, it is a constricting artifact. It creates an
outward appearance of propriety while it conceals covert
improprieties. It suppresses five-sixths—83 percent—of the
accounting and analysis information that is contained in raw
data.

Figure 4 Cost accounting variance
analysis ends with variations from
standard values. A vector analysis
begins with variations around the
data average. The variations vector is
then broken up into two components:
1) the Profit Signal vector and 2) the
Noise vector.

The differences between a vector analysis applied to a data
matrix and a spreadsheet analysis applied to arbitrary clusters
of numbers are irreconcilable. The more you know about the
cornerstone of evidence, the more you will understand how
using just one of six possible vectors can misrepresent evidence
and damage profitability.

Our indictment is harsh. We will help you challenge it.
Then we will ask that you vote in favor of full disclosure and
transparency.

The Parable of the Paper Bags

We wrote Profit Signals for business leaders who are resolute
competitors. Competitors, and we both belong in this class,
are human. Therefore, all of us face the same challenges when
we tackle the Six Sigma body of knowledge for evidence-based
decisions. One of our novice students shared a personal story
on a first day of training. We use her parable of the paper bags
in our Six Sigma decision courses.

“What you are teaching us is a new skill that is hard to grasp.
This process reminds me of the way my grandfather ran his
business.

“I loved my grandparents very much. I was very close to
them. My grandfather was born in Sylva, North Carolina in
1893. He had to leave school in the second grade to go work
on a tobacco farm. During the Great Depression, he and his
wife couldn’t earn a living. So they packed up their car and
traveled across country to Darrington, Washington. With his
brother’s family, there were 13 of them altogether.

“He originally worked for the Sauk Logging Company.
But, he preferred to work for himself. When the Federal
government bought his parcel of land in North Carolina for
an addition to Smokey Mountain National Park, he took the
money and bought property here in Arlington.

“My grandparents planted 80 cherry trees. They had 10 acres
of raspberries and a five-acre garden. They sold produce.
There were milk cows and always some beef. Grandpa bought
and sold heifers. He split cedar shakes in his spare time to
supplement the family income.

“There were all kinds of transactions. But, my grandfather
didn’t know how to multiply. Instead, he kept all his receipts
in different brown paper bags. Once a month he would
arrange these bags in the living room. Then he would add
up columns of numbers so he would know what to charge
people.

“I learned how to multiply in third grade. I got pretty good
at it by the fourth grade. I have always found math to be
difficult. I still do. I have to work at it and I really would
rather do something that comes easy.

“One day, I think it was in 1959, I came home and proudly
told him I could teach him to multiply. By multiplying he
wouldn’t have to spend so much time with his paper bags.
He listened and he learned how to do a few simple problems
correctly. After he picked it up he went right back to using
his paper bags. He never trusted multiplication. He couldn’t
bring himself to believe in this new-fangled way of doing
things.

“His system worked, but gosh, what he sacrificed.”

There is no doubt about it. The break-even thinking of
cost-accounting variance analysis works. But, the testimony
of Arthur Andersen, Cendant, Coca Cola, Enron, Rite
Aid, WorldCom, and other companies of former greatness,
suggests that the way it works is costly.

It would be a generalization of the non-mathematical, non-
scientific variety to claim that vector analysis is the solution
to problems of this magnitude. Nevertheless, we propose that
vector analysis, and the vector analysis mind set, are in fact
important parts of business decision solutions. We ask you to
critically evaluate this proposal.

Vector analysis theory and tools are to multiplication as
multiplication is to addition. They are valuable time savers.
Power and beauty are their strengths, but also their Achilles
heel. Though these ideas do not intimidate children, they can
threaten adults.

As you and your colleagues work to master evidence-based
decisions, do not be surprised if you observe anger, denial,
bargaining, and depression. Anticipate this roller coaster. At
the end, we hope you will arrive at acceptance. This cycle
accompanies any and every substantive life change.

Negotiate and get to “Yes” with your peers. Get to yes with
your executives and those you lead so that your company
is not using brown paper bags to compete against a more
powerful, efficient, effective, and profitable way of doing
work.9 That way of doing work is a vector analysis applied to a
data matrix.

The Dollar Value of Evidence

The quality of a manager’s decisions and consequent actions
determine profit and loss. Since large sums of money are
often at risk, managers weigh their “evidence”.

They watch production. They pore over monthly spreadsheet
reports. Usually, the most valuable information a manager
has remains buried in the spreadsheets. It is further obscured
by arithmetic totals, averages, differences, bar graphs and pie
charts.

As senior executives, and as consultants to senior executives,
we have seen this process repeated month after month, year
after year, decade after decade. With the pressure to produce
profits, it is easy to understand why many managers resort to
an expedient device: appearances. The privileges of position—
title, clothing, automobiles, office location and furnishings,
social networks, and financial reward—can and often do
persuade others that appearance is evidence. Evidence-based
decisions call for a higher standard.

Whenever a manager looks at the numbers used to measure
the performance of an organization, certain questions ought
to begin to perk:10

1. Should I believe these numbers?
2. What is the evidence in these numbers, and how
strong is it?
3. What actions should I take based on this evidence?
4. What evidence will confirm that management actions
produced the desired results?

Because we are only human, these questions are accompanied
by unsettling feelings and thoughts that nurture anxiety:

a) I am comfortable with the way things are.
b) This new knowledge puts my previous decisions in a
bad light.
c) I don’t want to lose my job.

You are probably reading this book because you have made
business decisions. Some of those decisions were good. Some
were bad. Some were based on evidence; others were not. We
ask you to contrast the profit related to good decisions with
the loss related to bad ones. The difference between these two
numbers forecasts the initial Return on Investment (ROI) you
can expect from reading this book.

Because we are all human, evidence and ROI may not be
enough. We must handle fear. We know this from serving
individual and corporate customers in virtually every
industry, in Australia, Brazil, England, Mexico, New Zealand,
Singapore, Uruguay, and 44 of the United States. It may help
you to know that, in every case where our students have used
the information we present, ROI is at least 10:1. A more
typical result is 50:1. In most cases, bottom line business
results like these eventually break through the barriers of fear.


Our clients welcome the opportunity to improve on
their current methods of analysis and decision-making.
Nevertheless, we all have a natural aversion to change. The
greater the change is, the greater our aversion. Six Sigma
companies have made a conscious decision to conquer their
reluctance. Experience, evidence and most of all, competition
are forcing all of us to improve.11

The process of making an evidence-based decision is elegantly
simple. It is not new, yet it is profound. It has demonstrated
its ability to improve productivity and profitability in every
industry. The one-, two-, three- and n-dimensional profit
signals waiting to be discovered in your raw data can provide
practical, profitable solutions to even the most complex,
confounding and challenging business problems you face.

Six Sigma

In today’s popular press, evidence-based decisions are known
as Six Sigma (6σ).12 Bill Smith, an engineer at Motorola,
conceived Six Sigma in 1986. This major step forward has
produced trillions of dollars in profit. Each breakthrough
spurs demand for further, more dramatic breakthroughs. This
is natural and good.

The iterative nature of the Six Sigma project cycle has taught
us which parts of Six Sigma are essential. We, and other
experienced professionals in the field, also have learned
which parts are extraneous. The demand for additional,
rapid, dramatic breakthroughs can be satisfied only if we trim
fat from Six Sigma’s middle-aged spread. Six Sigma made
Profit Signals possible. We return the favor by showing how
to flex the evidence muscle without carrying the weight of
bureaucracy.

Two fundamental Six Sigma concepts, the data matrix
and vector analysis, have never been explained to anyone’s
satisfaction in any previous publication. Since 1986,
Six Sigma has been based on two seemingly reasonable
assumptions:

1. It is impossible to teach Six Sigma theory to everyone
in a company.
2. Six Sigma tools are too difficult for most people to
use.

We have discovered these assumptions are no longer valid.
Personal computers and software have changed the world.
Today’s requirements for Six Sigma leadership are simply
these:

a) A passionate aptitude for pursuing the truth in a
system.
b) An understanding of the nature of a physical law or
Generalization.
c) The ability to operate carefully chosen statistical
software.

Anyone and everyone can learn this unifying theory. They can
learn it quickly. Anyone can master what is called the Black
Belt Body of Knowledge (BOK). Based on our experience,
and with the support of senior management leadership, this
process can be accomplished in 10 to 16 days.

This is the path we take and the case we make in Profit
Signals. You will learn fundamentals quickly. You will
immediately be able to use what you learn to make evidence-
based decisions. Improved decisions can lead you and your
company to Six Sigma profits.

If your enterprise is to succeed, its products and services must
exceed the great expectations of fickle customers. Goods and
services must be able to withstand the scrutiny of the free
press, and even an investigative Senate sub-committee. If you
expect your business to meet these objectives, your enterprise
must embrace and leverage the power of evidence-based
decisions.

Profits are always in fashion. Six Sigma is a very classy way
to earn them. The underlying principles of a data matrix and
vector analysis are timeless style. For over 2,000 years, the
New Management Equation (c² = a² + b²) has helped people
make money from measurements. The supporting evidence
for this claim is overwhelming. It is well beyond any shadow
of doubt.

Pick a profitable 21st Century product, service, or sport. Any
one will do. The qualities of almonds, aviation, agriculture,
beer, computers, electricity, electrocardiograms, fast food,
global navigation systems, gourmet ice cream, magnetic
information media, movies, music, oil, Olympic gold medal
speed skating blades, pharmaceuticals, roller ball pens,
surgery, skiing, scuba diving, telecommunication, textiles,
windows, and X-treme competition all share a common
bond. Breakthroughs in every one are driven by disciplined
observation, measurement, the recording of data in an orderly
data matrix fashion, and analysis.

The cornerstone of evidence has stood the test of time. Its
principles run deep and far beyond rote, routine, mainstream
business thinking.13 We welcome you to the world of the
data matrix, vector analysis, standards of evidence, improved
business decisions, and the New Management Equation.
Welcome to the universe of Profit Signals.

How to Read This Book

You can speed read this book in about a week. To get the “big
picture” quickly, skim the illustrations. Read the captions to
these exhibits. “Closing arguments” at the end of each chapter
summarize the key content.

After this initial overview, you may want to read it again at
a more leisurely pace. The ideas, analogies and activities are
presented in a particular sequence for good reason. So it is
best to read the book front to back. If possible, complete the
suggested experiments as you go. Feel free to collaborate on
these with colleagues, friends, family members, neighbors—
even your old high school teachers.

Chapter 1: The Five-Minute PhD – The opening chapter
lets you earn your PhD in evidence-based decisions. It takes
only five minutes. Your Five-Minute PhD grants you the
power of vector analysis. Call it vector power if you will.
Once you get a handle on profit signals, you will be able to
systematically quantify and prioritize the effects of multiple
factors on any manufacturing, health care, service or financial
process.

Chapter 2: Standards of Evidence – We review the
distinction between story telling and evidence. You will learn
the difference between vector analysis applied to a data matrix
and spreadsheet calculations applied to arbitrary clusters of
numbers.

This chapter’s inside joke and secret handshake are that a
Greek named Pythagoras invented the “New Management
Equation” 2500 years ago. (The New Management Equation
is easier to say and spell than Pythagorean Theorem. It is
also sweeter to the ear.) In the early 1920s a genius named
Ronald Fisher discovered how to apply the New Management
Equation to identify profit signals in raw data and quantify
strength of evidence.

Fisher’s method is the international standard for quantitative
analysis in all professions save two: accounting and business
management. In this chapter we trace the history of the cost-
accounting variance analysis and show how to improve it with
vector analysis.

Chapter 3: Evidence-based Six Sigma – If you are new
to Six Sigma, this chapter has all the basics you need to
know. It reviews the traditional Six Sigma tool set. We cover
organizational guidelines, project selection criteria, process
maps and financial model graphs. We review the traditional
Six Sigma breakthrough project cycle: Define, Measure,
Analyze, Improve, Control (DMAIC).

Chapter 4: Case Studies – We each have more than 20
years of consulting experience in the field of evidence-based
decisions. Our results have been published in peer-reviewed
textbooks. CEOs, middle managers and line workers have
signed affidavits and testified to the value of our work.14 Dr.
George E.P. Box, a Fellow of the Royal Society and elected
member of the Academy of Arts and Sciences, endorsed one
of our three-dimensional analysis books in 1997. 15
In this chapter, we tell a few of our favorite breakthrough
project stories. These include:

• Improving the quality of state government customer
services with a $525,000 pay off.


• Reducing the days in accounts receivable by 30 days
with a 14-day project and a $425,000 bottom line
impact.

• Improving the operations of a hospital’s Emergency
Department (ED) with a gross margin of $18
million for a 38.2 percent gain over the prior year’s
performance.

• Dramatically improving the patient outcomes in
cardiovascular surgery while putting $1 million in
additional profits on a hospital’s bottom line.

• Tool grinding breakthroughs worth $900,000.

• Doubling the productivity in a vinyl extrusion process
while reducing the product material costs by 50%
in three months time. Bottom line value for this
company for each of the next three years is $1 million,
equaling a grand total of $3 million.

Chapter 5: Using Profit Signals – This chapter presents
the fundamentals of vector analysis with a few pages of
reading and a physical model. You will learn how to expose
the profit signals in your own data and represent them with
bamboo skewers and Sculpey Clay®. You will tackle the
following challenges facing the Corrugated Copter Company:

• Establishing baseline performance metrics.
• Comparing two ways of doing things.
• Comparing three ways of doing things.
• Comparing 256 different ways of doing things.

Chapter 6: Predicting Profits – Corrugated Copter
managers want to be able to accurately predict flight times,
profits, costs, inventory, and other important things. If they
could improve the quality of their predictions, they could
confidently take better advantage of market dynamics.
Fortunately, this management team is up to speed on
regression analysis.

Is this something new and different? No, it is just another way
to use the New Management Equation. It is vector analysis
applied to a data matrix.

Chapter 7: Sustaining Results – At Corrugated Copters, best business results always mean earning the greatest revenue
with the least expense. This is more than a politically correct
platitude. It is responsible stewardship. The team learns to
monitor and perfect their production processes through
process capability studies and process control charts.

Are these new and different? No. They are just other ways
of using the New Management Equation. They are vector
analysis applied to a data matrix.

Chapter 8: The Three Rs – In its time, the Six Sigma
business initiative created new breakthroughs in quality,
productivity and profitability. Corrugated Copters now
believes traditional Six Sigma organizational ideas are
outdated.

A team of Corrugated Copters leaders has proposed
an education system that would render their Six Sigma
bureaucracy obsolete.

Appendices – Here you will find a glossary of Profit Signal
terms that will help you learn the language of evidence-based
decisions. There is a complete bibliography of the essential
evidence-based decision bookshelf. We have also included
a Profit Signals Black Belt Curriculum and production
information on the Six Sigma tools we used to write and
produce this book. We trust this information will serve as
your outline for future study.


Endnotes
1. Box, Joan Fisher. R.A. Fisher, Life of a Scientist. New York: John Wiley & Sons, 1978.

2. Box, Joan Fisher. R.A. Fisher, Life of a Scientist. New York: John Wiley & Sons, 1978.

3. Harrison, G. Charter. "Cost Accounting to Aid Production – I, Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, No. 5, October 1918.

4. Harrison, G. Charter. "Cost Accounting to Aid Production – II, Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918.

5. Harrison, G. Charter. "Cost Accounting to Aid Production – II, Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, No. 5, December 1918.

6. Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Boston: Harvard Business School Press, 1987.

7. Garrison, Ray H., and Noreen, Eric W. Managerial Accounting, 10th Edition. Boston: McGraw-Hill Irwin, 2003. Page 431.

8. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 941.

9. Fisher, Roger, and Ury, William. Getting to Yes: Negotiating Agreement Without Giving In. New York: Penguin Books, 1981.

10. Royall, Richard. Statistical Evidence: A Likelihood Paradigm. New York: Chapman & Hall, 1997.

11. Shewhart, Walter A. "Nature and Origin of Standards of Quality." The Bell System Technical Journal, Volume XXXVII, Number 1, January 1958.

12. Six Sigma is a registered trademark and service mark of Motorola Incorporated. The Motorola web site is a recommended resource for researching the history of Six Sigma. For a summary overview please read: Barney, Matt, "Motorola's Second Generation," Six Sigma Forum Magazine, May 23, 2002, pages 13-16. http://mu.motorola.com/pdfs/Mot_Six_Sigma.pdf

13. Fuller, R. Buckminster. Critical Path. New York: St. Martin's Press. Page 8.

14. Sloan, M. Daniel, and Torpey, Jodi B. Success Stories on Lowering Health Care Costs by Improving Health Care Quality. Milwaukee: ASQ Quality Press, 1995.

15. Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. Milwaukee: ASQ Quality Press, 1997.



Chapter 1

The Five-Minute PhD

PhDs, medical doctors, scientists, engineers,
mathematicians, statisticians, economists, managers,
and executives don’t own the lock and key to data
analysis. Anyone can learn to do a vector analysis. What was
once the high water mark of postgraduate study is now as
simple as a Google web search. You don’t need a certificate on
your wall to analyze data.

Knowledge and its application are the taproots for
professional stature and income. Few are eager to debunk the
presumption of specialized knowledge that justifies position
and paycheck. So long as information and knowledge remain
shrouded by jargon, professions and authority remain secure.

Knowledge and information challenge authority. They can be
disrespectful of bureaucracy and hierarchy. They applaud the
pointed question. They reward the cross-examination of high
priests and presidents. The Five-Minute PhD is a democratic
degree that exemplifies our age. It can be earned by anyone
who is willing to work at it.

With knowledge and information, people can and do
effectively solve more of their own problems. Solving one’s
own problems saves time and money. This is fun. Companies
that know how to solve problems quickly make more money
than those that don’t.

We all acquire memories through the experience of our
everyday lives. Memories make life rich and rewarding.
Memories can teach, but they rarely bring innovation to our
work places. Except for an occasional stroke of dumb luck, we
get no new knowledge from casual experience. With the sole
exception of pure mathematics, we obtain new knowledge
only by applying the basic disciplines of experimentation,
observation, and analysis. Companies that apply knowledge
and intelligence make more money than those that depend on
memories.

You will now learn how these three basic disciplines—
experimentation, observation, and analysis—work, and prove
that they do work, in less than five minutes. Neither previous
experience nor training nor calculations are required.

Start Your Stopwatch Now

Raw data contain information. In our digital world all
information can be, and is, turned into numbers. Good
information leads to reliable predictions. Telephones work,
airplanes fly, music is played, products are manufactured,
medical treatments are rendered and services are delivered, all
through the use of numbers.

Experimental data, measurements, come from disciplined
observation by design. When time and money are valued,
experiments must be sized with economy in mind. Table 1
arrays the eight observations in an economical experiment.
The first column in the array labeled “Experiments Called
Runs,” establishes the order in which eight observations were
made.

The factors1 in this three-dimensional experiment are gender
(x column), backpack weight bearing (y column), and activity
(z column). Heartbeats is the measured response (dependent
variable) in the experiment.

For convenience, each factor was set at only two levels. The
high setting is coded +1. The low setting is coded -1. Table
1 contains all eight possible combinations of a three-factor
experiment with two levels for each factor. This is called a 2³
(two raised to the third power) experimental design. “Two
raised to the third power” is a mouthful, so it is usually
pronounced, “two cubed.” Think of tea with two cubes of ice,
rather than an equation, and the idea will be more refreshing.
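
For readers who want to reproduce this layout in software, the following minimal Python sketch (our own illustration, not a tool used in the book) generates the eight coded runs of a 2³ design; the factor names are placeholders standing in for gender, backpack weight and activity.

```python
from itertools import product

# All eight combinations of three two-level factors, coded -1 / +1
factors = ["x_gender", "y_backpack_weight", "z_activity"]  # placeholder names
design = list(product([-1, +1], repeat=3))

for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```

Statistical packages produce the same eight rows, usually presented in a randomized run order.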


Table 1 The cube or “design of experiments” (DOE) array is an ideal data matrix.

As you can see in Figure 1, this array does in fact create a
cube. Come back to this illustration after you complete your
PhD. The clock is ticking.

Figure 1 The ideal data matrix forms a cube.

For each of the eight experiments, or runs, the resulting
number of radial artery heartbeats was measured and
recorded. (You can feel your own radial artery pulse by
touching the inner aspect of one of your wrists with the
index and middle fingers on the opposite hand.) These

measurements are arrayed in the far right column of this data
matrix. Each of the eight response measures is the output
from a unique combination of the three factors. For example,
the measurement 70 for Run #1 was for a sitting man
who had no weight in his backpack. This is an example of
“disciplined observation.”

Please turn your attention to the pattern of the response
heartbeat measurements in the far right column. Consider the
following questions:

• Which combination of variables produced the two
highest heartbeats rates?
• Which combination produced the two lowest
heartbeats rates?
• Which variable appears to have the least effect on
heartbeats?
• How would you predict future outcomes from similar
experiments?

Please pause now and stop reading. Go back. Take time to
look at the evidence patterns in the data matrix. When you
have answered all four questions in the list, continue.

Are you finished? Check your watch. We predicted that you
could successfully complete a three-dimensional, doctoral-
level vector analysis in less than five minutes. We bet we were
right. If you concluded that aerobic exercise has the strongest
effect on heartbeats, you have earned your Five-Minute PhD!

If you noticed that carrying a 50-pound weight in a backpack
increases the number of heartbeats for aerobic activity, but
not for sitting, you are at the top of your class. If, in addition,
you concluded that gender doesn’t really make much of a
difference when it comes to the number of heartbeats, you
graduated Cum Laude.

Business Art and Science

By using a special kind of row and column array called a data
matrix, you were able to correctly identify one main effect
(activity), one inert factor (gender) and a two-factor interactive
effect (the combination of activity and backpack weight). Your
eyeball vector analysis of eight numbers was accurate.
Consider the economies of using this technique to solve
business problems. Imagine the possibilities. Yes. You are
absolutely right. With relatively small amounts of data
framed in a data matrix, you can simultaneously quantify and
prioritize the effects of several process variables. This applies
to any manufacturing, business, service or health care process.
This is the vast Generalization of the mathematical/scientific
variety we mentioned. In other words, it is true.

The eight numbers in the far right hand column of the 2³
data matrix in Table 1 actually form a single entity. This
entity is an eight-dimensional vector! You have now entered
hyperspace.

Science-fiction writers use the mathematical term hyperspace
when they need a word to describe faster-than-light travel.
Hyperspace is actually the mathematical term for a space with
four or more dimensions. So, fasten your seat belts. Please
keep your hands and arms inside the analysis rocket.

You succeeded in analyzing the three-dimensional experiment
in Table 1 because you were able to visually compare
the eight-dimensional vector for heartbeats to the eight-
dimensional vectors for activity, gender, and backpack weight.
These vectors are the basis for profit signals. We explain the
details in Chapter 5.

Hyperspace is not as easy to accept as a free ride on Disney’s
Space Mountain. For most of us, visualizing more than three
dimensions is out of the question. The hallmark of Ronald
Fisher’s genius was his ability to visualize n dimensions.2 This
was Imagineering at its very finest. The evidence we now
have about our universe confirms that Fisher’s vision of n-
dimensional hyperspace was correct.

Fisher’s vector analysis is the elegant simplicity that underlies
myriad, seemingly unrelated analysis techniques. Nowadays,
inexpensive software effortlessly applies vector analysis
to any data matrix. Software automatically calculates the
profit signals. It ranks them by importance and determines
the strength of evidence. Then, with the grace of a high
technology thrill ride, software applications create three-
dimensional graphs annotated with accurate predictions.

For example, repeated experiments with the setting of (-1,
-1, -1) or (Male, No Weight, Sitting) will produce an average
heartbeat of about 71.25. This predicted value labels the
lower, left, front corner of the cube in Figure 2.

Figure 2 Vector analysis applied to a data matrix gives analysts the power of three-dimensional graphics. The differences in appearance between this illustration and richer ones elsewhere in our book can be explained: the superior tables and illustrations were created using vectors. [Cube diagram with an Activity axis and the predicted heartbeat values at the corners.]

Repeated experiments at the setting of (+1, +1, +1) or
(Female, 50-pound weight, Aerobics) will produce an average
heartbeat of about 188.75. This predicted value labels the
upper, right, back corner of the cube in Figure 2.

New vector analysis users are often amazed at the accuracy
of predictions based on cubic and higher-dimensional
experiments. You will discover this for yourself as you
complete the exercises in this book.

You intuitively used data matrix and vector analysis principles
to interpret the data in the Five-Minute PhD experiment.
Working at the English Rothamsted Experimental Station,
Ronald Fisher conducted the first cubic and higher-
dimensional experiments in 1919. He applied these principles
to solve difficult, important problems using small, economical
sets of data.

William Gosset, who published under the pen name “Student” and first conceived the
theory of statistical inference for small samples in 1907, used
these statistics at the Guinness Brewery. Those who enjoy a
stout beer now and again have been thankful ever since.


Eighty years of revolutionary advances in agriculture,
biotechnology, communications, computing, finance,
information technology, manufacturing, medicine, space
technology, and transportation support Fisher’s mathematical/
scientific Generalization. Vector analysis applies to everything.

Vector analysis is used to coordinate all commercial jet
landings at Orlando International. It is used to describe
Einstein’s special and general theories of relativity. It is used to
graph voltage variations resulting from the depolarization and
repolarization of the cardiac muscle. 3 It ought to be used to
create financial statements. It is proven. It is practical.4
Frank Netter, the Norman Rockwell of medical illustration,
drew a beautiful picture of a vector analysis (Figure 3). In
Netter’s drawing, the x-, y-, and z-axes are labeled using
medical terminology.

Figure 3 The 2³ cube you used to
analyze heartbeats is identical to EKG
theory and Rh+/Rh- blood groups. The
plus (+) and minus (–) signs for Rhesus
blood groups symbolize vector analysis
reference points.

The x-axis refers to the sagittal, or side planes, of the body.
The lower, horizontal y-axis is illustrated while the upper
y-axis plane is implied. The back “frontal” plane is the
illustrated z-axis plane. This physiological phenomenon is
called a spatial vectorcardiographic loop. It was used in 1908
by Willem Einthoven to create the electrocardiogram (EKG).5


The EKG made it possible to observe, measure, and graph the
heart’s electrical impulses over time. The patterns that emerge
from the EKG vector analysis are critical to the prediction
of a beating heart’s behavior. Knowledge produced by this
Nobel Prize-winning achievement led to the creation of the
most profitable niche in American medicine—cardiovascular
care. Revisit this illustration when you read the Six Sigma
case study on “beating heart” Coronary Artery Bypass Graft
(CABG) surgeries in Chapter 4.

Table 2 lists a few of the thousands of proven, profitable
applications of vector analysis to a 2³ data matrix. Consider
the vast Generality, and the enormous profit potential of this
single tool. The only limitation is imagination.

So imagine. Take time to write down factors (inputs) and
responses (outputs) that could help you make more money.
Once you have performed the disciplined observations and
recorded the measurements demanded by the cube’s data
matrix, the eight-dimensional vectors—especially the all-
important profit signals—will lead you directly to the most
profitable solution.

Table 2 The cube experiment works for any process in any system.

Profit, loss, productivity, inventory turns, taxes, time, and
sales volume—any response you can measure—depends on
many factors and the interaction of those factors in your
business. Some of these are under your control; some are not.
Complexity is the rule, not the exception.

The only way to distinguish profit signals from noise is to
apply vector analysis to a data matrix. The ratio of the length
of the profit signal vector to the length of the noise vector
quantifies the strength of evidence in the data. A “long,
strong” profit signal vector and a “short, weak” noise vector
indicate large, statistically significant effects and reliable
predictions (Figure 5).

Figure 5 A long, strong Profit Signal with a short, weak Noise vector means there is a statistically significant effect.

A “short, weak” profit signal vector and a “long, strong” noise
vector indicate small, statistically insignificant effects and
unreliable predictions (Figure 6).

Figure 6 A long, strong Noise vector and a short, weak Profit Signal indicate no statistically significant effect. The variation is most likely due to Chance. [Diagram of the Raw Data vector decomposed into the Data Average, Variation, Profit Signal and Noise vectors.]


Data in the matrix must be obtained through a process of
disciplined observation. Trial and error is expensive, time-
consuming, crude, and ineffective. It is not a viable business
strategy for the 21st Century.

The cube is three-dimensional; it has three factors. It
has served as a keystone of professional knowledge and
profitability since 1630 when Rene Descartes introduced the
method for three-dimensional thinking.6,7

Despite the fact that we inhabit a world of three physical
dimensions, there is no reason to limit ourselves to three-
dimensional experiments. To illustrate, Table 3 lists a few n-
dimensional experiments with n ranging from 2 to 6.

Table 3 Multi-dimensional experiments improve profits in every industry.

Disciplined, multi-dimensional observations and vector
analysis reduce the financial risk associated with every
important business decision. Business leaders must not excuse
themselves from mastering this knowledge or the skills to go
with it. To do so is to gamble the future of their companies on
needlessly risky decisions.

Senior managers and corporate directors are knowledge


workers. They, more than any other members of an
organization, need to know how these tools work. Typically,
they can acquire this knowledge in just four days of
accelerated, hands-on training.

Profit Signals

Only a few years before Fisher used the cube and higher-
dimensional experiments to dramatically increase profitable
crop yields in England, Pablo Picasso and Georges Braque
created a new art form called Analytic Cubism. The analogies
between Picasso’s and Fisher’s cubes are intriguing.

Picasso and Braque aimed at presenting data as perceived


by the mind rather than the eye. “Every aspect of the whole
subject is seen in a single dimension.”8 In Picasso’s original
Analytic Cubism, “objects were deconstructed into their
components…. it was used more as a method of visually
laying out the FACTS…”9

Fisher explained his model and methods using virtually


identical words. He referred to vector analysis as the Analysis
of Variance. In Fisher’s terminology, Variance is a statistical
measure based on the squared length of the variation vector.
In cost accounting, a variance is a difference between an
actual value and a standard or budgeted value. Fisher’s
definition pre-dates the accounting definition by forty-some
years. We will discuss this further in Chapter 2.

Six Sigma Black Belts, high school teachers, college professors,


statisticians, spreadsheets and statistical programs often
employ the hideous acronym ANOVA for Analysis of
Variance. We are stuck with it, so we use it. An ANOVA
“deconstructs” a data vector into the basic pieces essential for
evidence-based decisions:

Raw Data = Data Average + Profit Signal(s) + Noise

We will discuss and illustrate various aspects of this vector


equation in subsequent chapters. For now, we focus on the
component of greatest immediate interest, the Profit Signal.

Table 4 contains a coded version of the Five-Minute PhD


cube experiment. Actual factor names are represented as
generic X, Y and Z variables. The numbers –1 and +1 are
traditionally used to designate low and high levels of each
factor. Actual heartbeat data are used in the response column.
As a Generalization, these could be any measurements of
interest to you.

Table 4 The coded version of the Five-Minute PhD cube experiment.

As we saw in Figure 1, the three factors in a cube experiment
form the edges of a three-dimensional solid, a cube. In this
sense, a cube experiment is three-dimensional. However, the
data matrix for a cube experiment has eight combinations.

The response column contains eight measurements, one


for each combination. These factor combinations and
measurements correspond to the eight corners of the cube. In
other words, here comes hyperspace again.
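For readers who like to verify such things with software, the eight coded factor combinations can be generated in a few lines. This is a minimal Python sketch of our own; the column names X, Y and Z simply follow Table 4.

    from itertools import product

    # Build the eight factor combinations of a 2x2x2 (cube) experiment,
    # coding the low level of each factor as -1 and the high level as +1.
    factor_names = ("X", "Y", "Z")
    design = [dict(zip(factor_names, combo))
              for combo in product((-1, +1), repeat=3)]

    for run, row in enumerate(design, start=1):
        print(run, row)  # eight rows, one per corner of the cube

Each printed row corresponds to one corner of the cube, and therefore to one of the eight response measurements.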


The corners of the cube give a three-dimensional


representation of an eight-dimensional data vector.

Acquiring the knowledge and mastering hyperspace analysis


skills is well within everyone’s intellectual reach. Accepting
this beyond-belief reality is easier said than done.

Consider what Fisher’s ingenious, genius-level model suggests.


Hyperspace—the real one rather than the realm of Luke
Skywalker—is a very big idea. It is beautiful art and art is the
dream of a life of knowledge.10

Figure 7 The opposing planes


represent the eight-dimensional profit
signal vector for the overall effect of
factor Z. These planes correspond
to the grouping of the eight response
measurements shown in Table 5.
This is a case of a “long/strong” Profit
Signal and “weak/short” Noise.

We can use the cube to create three-dimensional


representations of the eight-dimensional profit signal vectors
in a cube experiment. For example, Figure 7 shows shaded,
opposing planes corresponding to the –1 and +1 levels of the
most important factor Z, Activity.

These two opposing planes represent the eight-dimensional


profit signal vector for the overall effect of factor Z, Activity.
This main effect is defined as the difference in the average
response values for Z = -1 (Sitting) and Z = +1 (Aerobics).

Overall Z effect = (Average of 140, 136, 180, 190) minus


(Average of 70, 68, 86, 88) = 161.5 - 78.0 = 83.5
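The same arithmetic can be written as a short Python sketch. The two groups of heartbeat counts are the ones quoted above; the variable names are ours.

    # Responses on the Z = +1 (Aerobics) plane and the Z = -1 (Sitting) plane.
    aerobics = [140, 136, 180, 190]  # back plane of the cube
    sitting = [70, 68, 86, 88]       # front plane of the cube

    # The overall (main) effect of Z is the difference between the plane averages.
    z_effect = sum(aerobics) / len(aerobics) - sum(sitting) / len(sitting)
    print(z_effect)  # 161.5 - 78.0 = 83.5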


Table 5 This grouping of the


eight response measurements
corresponds to the opposing
planes in Figure 7. This is a case
of a “long/strong” Profit Signal and
“weak/short” Noise.

Table 5 displays in a spreadsheet format the two groups of


measurements from the original Five Minute PhD experiment
corresponding to the opposing planes in Figure 7.

If Z (activity) had no effect, the average response on the front


plane of the cube would roughly equal the average response
on the back plane. This is not the case. Z had a large effect.
The average of the back plane is 83.5 heartbeats larger than
the average of the front plane.

In the Five-Minute PhD experiment, activity and backpack


weight had noticeable effects on heartbeats. We reached
these conclusions by comparing the column of response
measurements in the data matrix to the columns representing
the factors. In actual practice, we must also quantify the
strength of evidence for each profit signal. We will discuss
this, and the related concept of statistical significance, in
Chapter 2.

Figure 8 Opposing planes


representing the eight-dimensional
profit signal vector for the main
effect of factor Y.


Figures 8 and 9 show the shaded, opposing planes


representing the profit signal vectors for the main effects
of factors Y and X. If you think about Star Wars while you
mull over these images, they will be more entertaining. Plus,
believe it or not, people do get up and cheer at the end of a
multimillion-dollar breakthrough Six Sigma project, just like
they do when good guys finally win on the big screen.

Figure 9 Opposing planes


representing the eight-dimensional
profit signal vector for the main effect
of factor X.

When the effect of one factor depends on the level of another,


they are said to have an interactive effect. For example, in the
Five-Minute PhD experiment, activity and backpack weight
had an interactive effect. Increasing the weight affected the
number of aerobic activity heartbeats, but not the number of
sitting heartbeats.

Pairs of planes on the cube can also represent interactive


effects, but they are perpendicular rather than parallel.

For example, one of the perpendicular planes in Figure


10 contains the four corners where X and Z have opposite
signs (X × Z = -1). The other plane contains the four corners
where X and Z have the same sign (X × Z = +1). These planes
represent the eight-dimensional profit-signal vector for the
interactive effect of X and Z, defined as the difference in the
average response values for X × Z = -1 and X × Z = +1.
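In software the same definition applies to any pair of coded columns. The sketch below is ours; the coded columns and the pairing of responses with runs are hypothetical, used only to show how the X × Z grouping is formed.

    def interaction_effect(x, z, y):
        """Average response where x*z = +1 minus average response where x*z = -1."""
        same = [yi for xi, zi, yi in zip(x, z, y) if xi * zi == +1]
        opposite = [yi for xi, zi, yi in zip(x, z, y) if xi * zi == -1]
        return sum(same) / len(same) - sum(opposite) / len(opposite)

    # Hypothetical coded columns and responses for an eight-run cube experiment.
    x = [-1, +1, -1, +1, -1, +1, -1, +1]
    z = [-1, -1, -1, -1, +1, +1, +1, +1]
    y = [70, 68, 86, 88, 140, 136, 180, 190]
    print(interaction_effect(x, z, y))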

If X and Z had no interactive effect, the average response


on each plane would be the same. If X and Z had a large
interactive effect, one plane would have a much larger average
than the other.

Figure 10 Perpendicular
planes representing the eight-
dimensional profit signal vector for
the interactive effect of factors X
and Z.

Figure 11 shows the shaded, perpendicular planes


representing the profit signal vector for the interactive
effect of factors X and Y. Figure 12 shows the shaded,
perpendicular planes representing the profit signal vector
for the interactive effect of factors Y and Z.

Figure 11 Perpendicular
planes representing the eight-
dimensional profit signal
vector for the interactive
effect of factors X and Y.


Figure 12 Perpendicular
planes representing the eight-
dimensional profit signal vector
for the interactive effect of
factors Y and Z.

These familiar geometric models convey esthetic beauty and


analytical power. This is not the 10th grade geometry taught
by Mr. Greismeyer at Centerville High. This is the geometry
of Michelangelo, da Vinci, Galileo, Einstein, Guglielmo
Marconi, Orville and Wilbur Wright, Alexander Calder, and a
Fellow of the Royal Society named Sir Ronald Fisher.

Data Recycling

The three-dimensional cube diagrams provide a looking glass


into eight-dimensional hyperspace. They allow our three-
dimensional eyes to make some interesting observations.

For example, have you noticed that all the corner values are
used repeatedly? Every data point appears six different times,
once in each of the six profit signal vectors shown above! This
is a lot of work for only eight little numbers to do.

This data-recycling phenomenon is a characteristic of all cubic


and higher-dimensional experiments. For most companies,
profit signals mean they can eliminate at least 83% of the data
collection and storage costs incurred with primitive trial and
error methods. Eighty-three percent is not a typographical
error. It is simply a fact.

The larger the number of factors, the greater the savings. This
bottom-line result is enhanced by the fact that vector analysis
gives you the right answers to your most pressing business
problems.


The Full Circle of Data Discovery

We are back to where we started in the Premise. The data


matrix and the New Management Equation, c² = a² + b², form
the backbone of vector analysis. Once data are entered into
a data matrix, a computer automatically calculates the profit
signals, determines the strength of evidence and graphs the
predicted values. It even explains the results.

Since 1920, vector analysis has been the path to credible,


quantitative evidence. Vector analysis points to the entire
family of common statistical distributions. It was the
foundation of science in the 17th Century when Galileo
proved that the earth revolved around the sun. It was the
foundation of science at the turn of the 20th Century when
Einstein created his special and general theories of relativity.11
It is the foundation of science at the turn of the 21st Century.

Since 1935 the application of vector analysis to data matrices,


better known to college students as ANOVA, has produced
huge financial returns in agriculture, manufacturing,
engineering, health care and process industries.12 ,13 ,14 ,15,16 In
those days this tool set was called the Design of Experiments.
In the 1980s it was repackaged as one of many Six Sigma
tools.

In 2003, it is simply Profit Signals that come from the New


Management Equation.

The New Management Equation

Use your own imagination to conduct experiments to verify


the insights you gained from your Five-Minute PhD. We
encourage you to use analogies to accelerate your learning. An
analogy illustrates similarities between ideas or things
that are often thought to be dissimilar. To quote one of our
past students, “An analogy is like a comparison.”

We have found that analogies, parables and old-fashioned


story telling are the most effective tools for teaching people

the principles of evidence-based decisions. Yes. This is yet
another paradox in evidence-based decisions.

Table 6 Like any true Generalization, the cube experiment is a
Law of the Universe. Have some fun practicing with your new
PhD in universes of your own.

After you have completed your experiments with family,
friends, and colleagues at work, discuss the implications
of these analogies. Since you are now a PhD, feel free to
throw around phrases like “Hegelian Dialectic” during your
conversations. This crowd-pleaser will let other doctors of
philosophy know that you know what you are talking about.

Table 6 lays out two proven favorites. Specifically, how can


you use what you now know about a data matrix and vector
analysis to save time and/or make more money?


Closing Arguments

The following testimonies were transcribed in various


historical hearings and trials about the Five-Minute PhD.

Archimedes: “Eureka. I have found it. I have found it. By


comparing the weights of solids with the weights of equal
quantities of water, I solved the mystery of specific gravity.
By using Euclid’s magical formula, c² = a² + b², one can
transform the earth and the heavens into a vast design of
intricate configurations.

“Euclid, you made the impossible possible by the simplest


of methods. But please isn’t there a shorter way of learning
geometry than through your method?”17

Euclid: “Sire, in the country there are two kinds of roads—


the hard road for the common people and the easy road for
the royal family. But, in geometry all must go the same way.
There is no royal road for learning. Now if I were to speculate
a bit, and if there were a computing machine 2500 years from
now that ran a vector analysis program with a data matrix, all
might be able to travel an easier road.”

Galileo Galilei: “Philosophy is written in this grand book


the universe which stands continually open to our gaze. But
this book cannot be understood unless one first learns to
comprehend the language and read the letters in which it is
composed. It is written in the language of mathematics, and
its characters are triangles, circles, and other geometric figures
without which it is humanly impossible to understand a
single word of it.”

Albert Einstein: “The Gaussian coordinate system of


Chance variation is a logical generalization of the X, Y, Z
Cartesian coordinate system. I used the data matrix cube
with a vector to suggest the passage of time in my 1916 best
seller, Relativity, The Special and General Theory, A Simple
Explanation that Anyone Can Understand.”18

James Turrell is a hyperspace sculptor of international
reputation. Mr. Turrell uses light in his search for mankind’s
place in the Universe.

James Turrell: “I want to create an atmosphere that can be


consciously plumbed with seeing like the wordless thought
that comes from looking in a fire. I use the X, Y, and Z axes
of light to achieve my objectives. I use the same Cartesian
coordinate system to pilot my aircraft.”19

Walt Disney: “I only hope that we never lose sight of one


thing—that it all started with a mouse. Born of necessity,
the little fellow literally freed us of immediate worry. He
provided the means for expanding our organization. He
spelled production liberation for us. Disneyland will never
be completed. It will continue to grow as long as there is
imagination left in the world.”20

Endnotes

1. These factors are also commonly known as independent variables.

2. Box, Joan Fisher. R.A. Fisher, Life of a Scientist. New York: John Wiley and Sons, 1978.

3. Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969.

4. Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969.

5. Dubin, Dale. Rapid Interpretation of EKG’s, Edition V. Tampa: Cover Inc., 1996. Page 4.

6. http://www.rothamsted.bbsrc.ac.uk/pie/sadie/reprints/perry_97b_greenwich.pdf

7. http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Descartes.html

8. http://www.ibiblio.org/wm/paint/tl/20th/cubism.html

9. http://www.artchive.com/artchive/P/picasso_analyticalcubism.html

10. Inscription on the southern ceiling of the rotunda leading to a James Turrell Skyspace installation at the Henry Art Gallery on the University of Washington campus.

11. Einstein, Albert. Relativity, The Special and General Theory, A Clear Explanation that Anyone Can Understand. New York: Crown Publishers, 1952. Page 32.

12. Fisher, R.A. Statistical Methods for Research Workers, Thirteenth Edition. New York: Hafner Publishing Company, 1967.

13. Fisher, Ronald A. The Design of Experiments. New York: Hafner Press, 1935.

14. Fisher, R.A. “Frequency Distribution of the Values of the Correlation Coefficient in Samples from an Indefinitely Large Population,” Biometrika, 10: 507-521, 1915.

15. Fisher, R.A. “On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample,” Metron 1: 3-32, 1921.

16. Box, George E.P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building. New York: John Wiley and Sons, 1978.

17. Thomas, Henry, and Thomas, Dana Lee. Living Biographies of Great Scientists. Garden City: Garden City Books, 1941. Pages 4-5.

18. Einstein, Albert. Relativity, The Special and General Theory, A Simple Explanation that Anyone Can Understand. New York: Crown Publishers, 1952. Page 90.

19. http://www.pbs.org/art21/artists/turrell/

20. http://goflorida.about.com/library/bls/bl_wdw_waltdisney_quotes.htm



Chapter 2

Standards of
Evidence

What are the objective standards of evidence your
business uses to make decisions? We ask all new
clients this question. Too often the answer is an
uncomfortable silence or, “We’ve never asked ourselves that
question before.”

Evidence is the foundation for making better, more profitable


business decisions. But evidence provides an operational basis
for making decisions only if we have standards by which to
judge the strength of evidence.

Demands for improved financial performance put old-


school managers in a bind. For the first time in history, they
are competing head-to-head with managers in developing
nations like Brazil, China, India, Mexico, and Malaysia—take
your pick. Labor is a tiny fraction of the total cost of doing
business in these newly emerging competitive economies.

As a result, North American businesses, large and small,


must find ways to reduce production and delivery costs by
at least 30 percent. Many must achieve this within the next
five years or go out of business. If you doubt this possibility,
visit Seattle, Washington. Since September 11, 2001 a good
portion of our world-famous gridlock traffic jams vanished
along with more than 30,000 jobs.

For many old-school managers, staying inside their corporate


cultural comfort zone, with an atmospheric vacuum of
evidence standards, is more important than achieving any
business goal, including profitability. This is a counterproductive
and, given the painfully apparent need for jobs, socially
irresponsible attitude.

Comfortable or not, spirited capitalism has put evidence-


based decisions on the map. Whether they know it or
not, vector analysis and standards of evidence are now on
every manager’s radar screen. The only question is who will
recognize and respond to the signals.

Poetry versus Science

Efforts to understand the world we live in began with story


telling. Stories thrive in many forms, oral and written, poetry
and prose. Stories convey values. They define and maintain
cultures, including corporate cultures. Stories evoke fear,
hope, joy, anger, sympathy, humility, respect, wonder and
awe. Stories build like pearls around grains of historical fact.
They tell us much, mostly about ourselves.

Stories are not laws. They do not, and are not intended to,
reliably describe historical facts or physical realities. Story
telling does have its place, but it can be at odds with science.
Story telling often involves tales of trial and error.

Scientific discoveries inspire as much wonder and awe as


any Paul Bunyan tale. But, the driving force behind science
is disciplined observation, experimentation and analysis.
The scientific method, which can be equated with Six
Sigma, embraces affirmative skepticism. This skepticism is
diametrically opposed to the innocence of credulity. Credulity,
or as some prefer to say naïveté, is the suspension of critical
thinking. Credulity allows us to experience the emotional
impact of a good story. Credulity makes Disneyland,
Disneyworld and the Epcot Center fun.

The tension between story telling and science dates to poet


John Keats’ criticism of scientist Isaac Newton’s prism.
Newton discovered that “white light” contains an invisible
spectrum of colored light. He made this spectrum visible by
shining ordinary light through a prism. The process Newton
used is called refraction. Refraction comes from a Latin word
which means to break up.1 If you have ever had an eye exam
for corrective lenses, your ophthalmologist or optometrist
used refraction to determine your prescription.
Newton used his prism to create rainbows (Figure 1). Keats
was appalled. Newton ruined rainbows by explaining them.2

Glasses, contact lenses, fiber-optic cables, lasers, big-screen


TV and digital cameras work because Newton stuck to his
intellectual guns. We are glad he did.

The process of refracting white light into a visible spectrum


of colors is a form of vector analysis. We are not being overly
lyrical when we say that Ronald Fisher’s vector analysis
“refracts” data. Refraction makes profit signals visible. This is
essentially what you did to earn your Five-Minute PhD.

Figure 1 A diamond sparkles with


colorful data vectors refracted from
ordinary light.

For poets, this perspective is unwelcome. They are not alone


in this feeling. Again and again, we hear Keats’ critique
of Newton echoed in the protests of old-school managers
who reject profit signals as well as the process of disciplined
observation, experimentation and analysis.

“Scientific” Management

Managing any business is a challenge. Complexity arises


from materials, work methods, machinery, products,
communication systems, customer requirements, social
interactions, cultures and languages. The first step in solving
complex business problems is to frame them in terms of a
manageable number of key variables.

Bottom-line profitability is the ultimate objective, but other


metrics must also be considered. Sales, earnings per share,
cost and time to develop and market new products, operating
costs, inventory turnover, capital investments and days in
accounts receivable are just a few. Profit signals from one
or more of these variables often demand timely, reasoned
responses.

Frederick W. Taylor mesmerized the business community


of his day with the 1911 publication of The Principles
of Scientific Management. Taylor aimed to explain how
any business problem could be solved “scientifically.” As
an engineer for a steel company, Taylor had conducted a
26-year sequence of “experiments” to determine the best
way of performing each operation. He studied 12 factors,
encompassing materials, tools and work sequence. He
summarized this massive investigation with a series of multi-
factor predictive equations.

This certainly sounds like science. Unfortunately, trying to


solve complex business problems with Taylor’s methods is
akin to surfing the Internet with a rotary phone. In his 1954
classic How to Lie with Statistics, Darrell Huff characterized
Taylor-style science as follows: “If you can’t prove what you
want to prove, demonstrate something else and pretend that
they are the same thing.”3

Taylor studied his 12 factors one at a time, holding the


other 11 constant in each case.4 This invalidates his multi-
factor equations. One-factor-at-a-time experiments are so
thoroughly discredited, that they have their own acronym,
OFAT. It is physically impossible for OFAT experiments to
characterize multi-factor processes. OFAT experiments are
also notoriously time consuming. This is probably why it took
Taylor 26 years to complete his study.

Cost Accounting Variance Analysis

Businessmen in the early twentieth century enjoyed


comparing themselves to Einstein, Marconi, Edison, the
Wright Brothers and other celebrity scientists of the day. G.
Charter Harrison, an accountant with Price, Waterhouse
and Company in London, chose Taylor as the celebrity
“scientist” he wanted to emulate. Harrison published a
series of articles in 1918 in support of his assertion that,
“The present generally accepted methods of cost accounting
are in as retarded a state of development as were those of
manufacturing previous to the introduction by Frederick W.
Taylor of the idea of scientific management.”

A tidal wave of popularity was carrying Taylor’s book to


best seller status. Harrison rode this wave. He advanced
“scientific” principles for cost accounting. He proposed that
“standard costs” be established for various tasks, and that
actual costs be analyzed as deviations from the standard
costs. This was an advance over previous methods. Harrison
went on to describe an assortment of things that could be
calculated from such differences, including “productivity
ratios.”

A 1964 Times Review of Industry article first used the term


variance to describe Harrison’s difference between actual
and standard costs.5 Perhaps old-school accountants and
managers thought “variance” sounded more scientific than
“difference.” They had good reason to do so. By 1964 Ronald
Fisher’s vector analysis solution to a wide variety of statistical
problems were widely known under his general term for
them, Analysis of Variance.

Analysis of Variance is the international gold standard for


quantitative work in virtually every profession. Prior to the
invention of Six Sigma in 1986, two notable professions were
the only exceptions to this rule: accounting and business
management.

By 1978, business journalists were using the phrase “variance


analysis” to refer to the examination of differences between
planned and actual performance. The expression persists in
today’s accounting textbooks: “The act of computing and
interpreting variances is called variance analysis.”6

Needless to say, the cost accounting variance analysis of 1978
bore no relation to the Analysis of Variance invented by
Fisher some 58 years earlier. The elements of a standard cost
accounting variance analysis are shown in Table 1.
Table 1 Cost-accounting variance report formats vary. The key
element is a column labeled “Variance Ratio.” It is the signed
difference between an actual value and a standard, budgeted
or forecast value, expressed as a percentage.7, 8, 9

It is unfortunate that the word “variance” was redefined


in 1964 to mean a difference between actual and standard
values. There is nothing inherently wrong with analyzing such
differences. In fact, it is a good idea. The problem comes in
the type of “analysis” that is done with such differences, and
the actions “variance analysis” conclusions can lead to.

For example, the manager who is responsible for the $1,000


revenue “variance” in Table 1 will be asked to explain
himself or herself. After all, the result is 20% under forecast!
The explaining of this unacceptable negative variance occurs
at the monthly meeting of the Executive Committee.

This monthly ritual creates tremendous pressure to conform.


It subverts critical thinking. Managers are forced to develop
story-telling skills. A plausible explanation is produced. The
manager vows not to let this bad thing happen again. After
a month or two or three, the offending “variance” happens
again. A plausible explanation is produced. The manager
swears never to let this new bad thing happen again. And so
on.

The highest-paid employees in the company waste hours,


days and even weeks every month, grilling each other over
G. Charter Harrison’s 1918 productivity ratios. Objective
standards of evidence are nowhere to be found in their
discussions.

The Executive Committee may as well try to produce


rainbows without a prism. The monthly cross-examination
over variance analysis rather than Analysis of Variance
(ANOVA) is an indefensible waste of time and money. It is
every company’s greatest obstacle to evidence-based decisions
and Six Sigma breakthroughs.

Accounting versus Science

Today’s Generally Accepted Accounting Principles (GAAP)


are loose guidelines. Little has changed from those submitted
to the United States government by a committee of Certified
Public Accountants in 1932. The authors of the book
Relevance Lost: The Rise and Fall of Management Accounting,
winner of the 1989 American Accounting Association’s
award for Notable Contribution to Management Accounting
Literature, judged the accounting profession to be functioning
70 years behind the times.10 In 2003, that makes it 84 years
behind the times!

How could this happen?

Perhaps it is a function of what the customers want. The


assistant dean of a leading business graduate school recently
told us that her university continued to teach a core
curriculum subject—cost-accounting variance analysis—that
she personally knew to be false. She reasoned, “Businesses
in this region hire graduates who know how to use cost-
accounting variance analysis.”

Another, more revealing explanation runs deeper. Empirical


laws of science are forced to evolve. Over time, an inexorable
process of disciplined observation, experimentation and
analysis leads to improvements. Occasionally, a body
of evidence becomes so compelling it becomes a new
Generalization or Law. New laws force old ones to be revised
or scrapped.

By contrast, in the words of a Harvard Graduate School of


Business Administration textbook, “Accounting principles
are man-made. Unlike principles of physics, chemistry, and

the other natural sciences, accounting principles were not
deduced from basic axioms, nor can they be verified by
observation and experimentation.”11

In other words, cost accounting principles cannot be


tested for validity. They have no objective standards of
validity. There is no process of disciplined observation,
experimentation and analysis to force improvements.

Until one-dimensional, cost-accounting arithmetic is


upgraded to vector analysis applied to a data matrix, GAAP
will remain wide enough to drive a Six Sigma tractor-trailer
rig through.

Delusions and Bamboozles

In his 1841 classic Memoirs of Extraordinary Popular


Delusions,12 Charles Mackay described massive losses related
to business practices just like today’s cost-accounting variance
analysis. He could well have been writing about 20th and 21st
Century popular culture when he penned the chapter “The
Love of the Marvelous and Disbelief of the True.”

“In reading the history of nations, we find that, like individuals,


they have their whims and their peculiarities; their seasons of
excitement and recklessness, when they care not what they do. We
find that whole communities suddenly fix their minds upon one
object, and go mad in its pursuit; that millions of people become
simultaneously impressed with one delusion, and run after it, till
their attention is caught by some new folly more captivating than
the first.” 13

This passage is eerily familiar to those of us who have watched


businesses become captivated by one management fad after
another: “Excellence”, “Re-Engineering”, “Zero-Based
Budgeting”, “Zero Defects”, “Total Quality Management”,
“Activity-based Cost Accounting”, “Management by
Objective” and “Balanced Scorecards” are a few of the greatest
hits.

None of these well-intentioned initiatives were, or are,


particularly bad in and of themselves. They simply lack the
firm foundation and objective standards of evidence sound
theory provides.


Occasionally, a superstition, fad or fallacy—astrology,


homeopathy, phrenology, cost-accounting variance analysis,
take your pick—manages to survive for a few years or
decades. Eventually, a kind of critical mass of delusion is
established. The capacity for critical thinking erodes.14 Carl
Sagan explained it this way:

“One of the saddest lessons in history is this: If we have been


bamboozled long enough, we tend to reject any evidence of
the bamboozle. We’re no longer interested in finding out the
truth. The bamboozle has captured us. It’s simply too painful to
acknowledge, even to ourselves that we have been taken. Once
you give a charlatan power over you, you almost never get it back.
So the old bamboozles tend to persist as new ones rise.” 15

All business leaders—plant managers, doctors, billionaire


CEOs—face this dilemma when they try to bring evidence-
based decisions into their organizations. Generally Accepted
Accounting Principles place no premium on truth or even
facts. They prize only internal consistency. Once you have
bamboozled the public, the stockholders, or the employees
of your company, the path of least resistance is to keep on
bamboozling.

Our proposal? Replace cost-accounting variance analysis with


a reliable, transparent, proven method of analysis based on
objective, quantitative standards of evidence. That proven
method is profit signal vector analysis applied to a data
matrix.

Vector Analysis 101

John Keats, a poet of the Romantic Period, lived in a world


of pure expression. If he were an accountant today, he would
demand a certain freedom of expression. He would present
his data in free verse or any other format he desired. He
would analyze data however he wanted, according to his
prevailing emotions. He would assign to his analysis whatever
weight of evidence felt right.

Amazingly, he would be granted all these freedoms today


without so much as a raised eyebrow. According to a 1990s
Professor Emeritus at the Harvard Graduate School of
Business Administration, “There are no prescribed criteria
[for variance analysis] beyond the general rule that any
technique should provide information worth more than the
costs involved in developing it.”16

The lack of any prescribed criteria for financial analysis


explains why spreadsheets are so popular. (See Table 2.) Just
like Keats, most people like being free to arrange their data in
any way that suits their fancy. What better way to spice up the
workday?

The situation is quite different for a Law or a Generalization


like Fisher’s Analysis of Variance. Unlike Taylor and Harrison,
Fisher actually was a scientist. In sharp contrast with the
artificial guidelines of cost accounting, Fisher’s work was
grounded in rigorous mathematics and physical reality:
Cartesian coordinates, right triangles, Pythagoras’ Theorem,
plane and solid geometry and trigonometry, multi-variable
calculus and vector analysis.

Table 2 A spreadsheet consists of rows and columns. The cells
can contain text, numbers, graphics, symbols or formulas.
There are no rules governing the interpretation of rows and
columns. There are no laws for arranging or analyzing data.


Ironically, Fisher developed the Analysis of Variance on a farm
near London right around the same time Taylor and Harrison
were promoting their “scientific” management and cost-
accounting principles.

Managers would do well to follow his lead; with statistical


software they can do so immediately at virtually no cost.
Evidence-based decision companies repeatedly demonstrate
why this is a profitable choice.

Fisher coined the term Variance in 1918. This was 46 years


before its debut in the Times Review of Industry.17 It is an
understatement to say the two definitions are significantly
different. They are as different as Generalization and
generalization. To quote Mark Twain, “It is the difference
between lightning and a lightning bug.”

Instead of a difference between an actual value and a standard


value, Fisher’s Variance measures the degree of variability of
a set of values around their average. It is based on the length
of the variation vector. Fisher called his method “Analysis
of Variance” because its purpose is to break up the variation
vector into profit signal and noise components. Fisher’s work
defines today’s international standard for analyzing components
of variation.

In 1919, at age 29, Fisher was hired to “examine data and


elicit further information” from his employer’s database.
According to his employer, “It took me a very short time to
realize that he was more than a man of great ability, he was in
fact a genius who must be retained.”18

Fisher’s job was to re-evaluate a business report identical


to the ones managers use for decisions today.19 There were
measurements recorded in rows and columns. Fisher’s boss
subtracted average annual production numbers from each
other to “determine” which years were most productive. The
boss wanted to increase the annual yield in bushels of wheat.
Like most people, he wanted to make more money while
working shorter hours and using fewer resources.

Fisher knew exactly how to help his boss achieve these


objectives: apply a vector analysis to a data matrix. Like a
spreadsheet, a data matrix consists of rows and columns.
There the similarity ends.

The rows of a data matrix represent records—the individual


objects or events on which we have data. The number of rows
is called the sample size. The columns represent fields—the
variables measured or inspected for each object.

Table 3 shows a simple data matrix. There are two variables


measured on two objects. Each column in a data matrix
contains the measurements which are the data vector for
the variable associated with the column. Each object is
represented by a particular coordinate or position in the
vector.


Table 3 Like a spreadsheet, a data matrix consists of rows and
columns. The rows of a data matrix represent records—the
individual objects or events we have data on. The columns
represent fields—the variables for which we have data. Each
stack of numbers in a data matrix column is a vector.

                Variable 1    Variable 2
    Object 1         3             5
    Object 2         4             2


For example, the data vectors in Table 3 are (3, 4) for Variable
1 and (5, 2) for Variable 2. These vectors are plotted in Figure
2. The two coordinate axes correspond to the two objects.

Fisher’s innovation was to think of the data matrix in a


geometric framework.

In this example the vectors are two-dimensional because


there are two objects in the data matrix. In general, vectors
are n-dimensional, where n is the sample size. We are back
into hyperspace. Like the inside of a black hole, hyperspace
will remain forever beyond our three-dimensional vision.
Nevertheless, it is there. It is real. Evidence-based decision
companies use hyperspace to make more money in less time
while using fewer resources.

Figure 2 The two columns of Table 3,


plotted as vectors.


Figure 3 illustrates the first basic rule of vector analysis:

The shortest distance between a point and a line


is along a path perpendicular to the line.

Figure 3 The shortest distance


between a point and a line is along a
path perpendicular to the line.

This is no arbitrary accounting rule; it is a property of


the physical universe. It is a law, a mathematical/scientific
Generalization.

The first step in a vector analysis is to find the constant vector


closest to the data vector. Examples of two-dimensional
constant vectors are (1, 1), (2, 2) and (0.5, 0.5). The
dotted line in Figure 4 locates the set of all possible two-
dimensional constant vectors. Figure 4’s center vector masks
a long segment of this dotted line, moving from the lower
left point of origin to the upper right. Only a portion of the
dotted line is visible at the upper right hand portion of the
illustration. For our data vectors, D1 and D2, the closest point
on this line is (3.5, 3.5).

It is not a coincidence that 3.5 is the average of 3 and 4. It is


not a coincidence that 3.5 is the average of 5 and 2. The closest
constant vector is always the vector of averages.
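A brute-force numerical check is easy to run. The Python sketch below is ours; it searches a fine grid of candidate constants and reports the one closest to each data vector from Table 3.

    import numpy as np

    def closest_constant(data):
        """Constant c minimizing the distance from `data` to the vector (c, c, ..., c)."""
        data = np.asarray(data, dtype=float)
        candidates = np.linspace(data.min(), data.max(), 10001)
        distances = [np.linalg.norm(data - c) for c in candidates]
        return candidates[int(np.argmin(distances))]

    print(closest_constant([3, 4]))  # 3.5, the average of 3 and 4
    print(closest_constant([5, 2]))  # 3.5, the average of 5 and 2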

It does seem coincidental that (3, 4) and (5, 2) have the same
average, but we did this on purpose. So, how do these vectors
differ?

Well, (3, 4) is closer to the vector of averages than (5, 2)


is (Figure 4). A data vector close to its vector of averages
has less variability than a data vector far from its vector
of averages. This means Variable 1 has less variability

than Variable 2. You can also tell this just by looking at
the numbers in Table 3. This “eyeball” analysis is just for
illustration; it is not recommended for your real data sets.

Figure 4 The dotted line is the


set of all constant vectors. The
constant vector closest to any data
vector is the vector of averages,
shown here in bold.

Figure 5 identifies the variation vectors, V1 and V2 , for


Variables 1 and 2. The length of the variation vector is
directly related to the degree of variability in the data vector.

Figure 5 A is the vector of averages


for both Variables 1 and 2. V1 and
V2 are the corresponding variation
vectors.

How do we calculate the length of a vector? For this we


need the second basic rule of vector analysis: The New

Management Equation (a.k.a. the Pythagorean Theorem).

The square of the length of the long side of a right triangle is


equal to the sum of the squares of the lengths of the other two
sides. (c² = a² + b²)

Once again, this is no arbitrary accounting rule; it is a


property of the physical universe. The New Management
Equation is so well known in professional financial and
investment analysis circles that a bi-monthly newspaper,
Financial Engineering News was founded in 1997 to
disseminate case studies.

In Figure 6 we use the New Management Equation to


calculate the lengths of data vectors D1 and D2.

Figure 6 The length of a vector is the


square root of the sum of the squares
of its coordinates.
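As a quick check of Figure 6, here is a minimal Python sketch of the same rule:

    import math

    def length(vector):
        """Length of a vector: square root of the sum of its squared coordinates."""
        return math.sqrt(sum(coord * coord for coord in vector))

    print(length((3, 4)))  # 5.0, the length of D1
    print(length((5, 2)))  # about 5.39, the length of D2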

Now we can figure out the lengths of the variation vectors in


Figure 5. Only the alphabetic notation differs from the New
Management Equation. Using the letters in Figure 5, the
New Management Equation for Variable 1 is:

(D1)² = A² + (V1)²

We can see in Figure 6 that (D1)² = 25. Also, the squared


length of the data average vector A is:


A² = 3.5² + 3.5² = 24.5


We can now plug these two numbers, 25 and 24.5, into the
New Management Equation for data vector D1:

25 = 24.5 + (V1)²

25 - 24.5 = 0.5 = (V1)²

V1 = square root of 0.5 = 0.71.

This final number, 0.71, the length of the variation vector for
Variable 1, is called the sample standard deviation for Variable
1.

A sample standard deviation is symbolized in technical


writing by the letter s. The Greek letter sigma (σ) refers to the
standard deviation of a population. This is where Six Sigma
gets its name.

In Six Sigma practice, the sample standard deviation, s, is


often casually referred to as “sigma” or σ. This substitution is
a grievous breach of statistical theory, but everyone who uses
statistics does it.

The New Management Equation for Variable 2 works the


same way:
(D2)² = A² + (V2)²

We know from Figure 6 that D2 = 5.39. Please do keep your
eyes on the right triangles in the illustrations. We already know
that A is the square root of 24.5, which equals 4.95. We can
now plug these into the New Management Equation:

5.39² = 4.95² + (V2)²

29.05 = 24.5 + (V2)²

4.55 = (V2)²

V2 = square root of 4.55 = 2.13.
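Both calculations can be verified in one short Python loop (a minimal sketch; the names are ours):

    import math

    for name, data in (("Variable 1", (3, 4)), ("Variable 2", (5, 2))):
        data_sq = sum(x * x for x in data)           # squared length of the data vector
        average = sum(data) / len(data)              # 3.5 in both cases
        average_sq = len(data) * average ** 2        # squared length of the average vector A
        variation = math.sqrt(data_sq - average_sq)  # length of the variation vector
        print(name, round(variation, 2))             # 0.71 and 2.13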


The sample standard deviation for Variable 2 is 3 times larger


than that for Variable 1! Variable 2 is 3 times more variable
than Variable 1.

Six Sigma values smaller variation because outcomes are more


predictable. Predictions are more accurate. There is less waste
and rework. Everything just works better when the profit
signals are large/strong and the noise is small/weak.

Degrees of Freedom

Don’t panic. Think of what follows as a mandatory Federal


Communications Commission announcement on your
National Public Radio station. It has to be here to ensure we
are not breaking any Laws of the Universe. You can skip this
section if you want, or you can stay tuned. In either case,
software takes care of all this stuff. This is just background
information.

Sometimes an analyst might want or need to know the


actual coordinates of a variation vector. (Whenever our
airplane takes off or lands, we certainly hope our pilot and
co-pilot have this information at their fingertips.) We get the
coordinates of the variation vector by subtracting the data
average vector from the data vector.

The clearest way to explain the subtraction of vectors is to


give the vectors a vertical orientation, the way they appear
in a data matrix. The coordinates of the variation vector for
Variable 1 are given by:

    ( 3 )   ( 3.5 )   ( -0.5 )
    ( 4 ) - ( 3.5 ) = ( +0.5 )

For Variable 2, they are given by:

    ( 5 )   ( 3.5 )   ( +1.5 )
    ( 2 ) - ( 3.5 ) = ( -1.5 )


So far, so good. Now, here is an important Law of the


Universe:

The coordinates of a variation vector always add up to


zero.

Because of this, the second coordinate in a two-dimensional


variation vector is always equal to the negative of the first
coordinate. This means that a two-dimensional variation
vector is completely determined by its first coordinate. We
express this by saying that a two-dimensional variation vector
has one degree of freedom.

Suppose now we have a three-dimensional data vector, for
example (3, 4, 5) or (5, 2, 5). The vector of averages for both
of these is (4, 4, 4). Once again using the vertical data-matrix
orientation, the first variation vector is:

    ( 3 )   ( 4 )   ( -1 )
    ( 4 ) - ( 4 ) = (  0 )
    ( 5 )   ( 4 )   ( +1 )

and the second is:

    ( 5 )   ( 4 )   ( +1 )
    ( 2 ) - ( 4 ) = ( -2 )
    ( 5 )   ( 4 )   ( +1 )

In a three-dimensional variation vector, the third coordinate


is always equal to minus the sum of the first two coordinates.
This means that a three-dimensional variation vector is completely
determined by its first two coordinates. We express this by
saying that a three-dimensional variation vector has two
degrees of freedom.

Now let n stand for the number of objects in your data set.
This is the same as the number of rows in your data matrix. It
is your sample size. All the vectors are now n-dimensional.

Yes. We are back into genuine hyperspace again. The more


often you go there, the less scary it becomes. Visits become
more profitable. They become fun.

The last coordinate of an n-dimensional variation vector is
always equal to minus the sum of the first n – 1 coordinates.
This means that an n-dimensional variation vector is
completely determined by its first n - 1 coordinates. We express
this by saying that an n-dimensional variation vector has n - 1
degrees of freedom.

The upshot of all this is this: the standard deviation is exactly


equal to the length of the variation vector only when n =
2. When n is greater than 2, as it usually is, we have to
divide the length of the variation vector by the square root
of its degrees of freedom. Don’t blame us—it’s a Law of the
Universe. We will come back to this later in the chapter, and
also in Chapters 5 and 6.
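Written as a formula, the sample standard deviation is the length of the variation vector divided by the square root of n - 1. A small Python sketch, using the three-dimensional example above, shows that this matches the usual library calculation (numpy's ddof=1 option uses the same n - 1 degrees of freedom):

    import numpy as np

    data = np.array([3.0, 4.0, 5.0])      # the three-dimensional example from the text
    variation = data - data.mean()        # variation vector; its coordinates sum to zero
    s = np.linalg.norm(variation) / np.sqrt(len(data) - 1)

    print(round(s, 4))                    # 1.0
    print(round(data.std(ddof=1), 4))     # 1.0, the same sample standard deviation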

We now return to our regularly scheduled program of writing


with an improved degree of simplicity.

Bar Chart Bamboozles

Bar charts and pie charts symbolize old-school management


thinking as no other icon can. They are the “Gee Whiz”
graphs in Huff’s How to Lie with Statistics. They present data in
superficial ways. They are easy to use. Consequently, they
frequently are used to misrepresent data.

The typical bar chart presents totals or averages with no


consideration of variability. There are no deviations from the
average. There is no Chance variation. In other words, there is
no Noise.

This violates a Law of the Universe. There is always Noise,


which is statistical variation. Variation is a physical property
of objects and measurements. A vector analysis forces us to
consider both average and standard deviation.

At best, we might say bar charts have a 50/50 chance of giving


correct information because they consider only one of two
aspects. At worst, they encourage managers to use Frederick
Taylor’s thinking, “This bar is bigger than that bar is and I
know the reason why because I am a scientific manager and I
say so.”

As an example, consider the monthly revenue data in Table


4. This is a snapshot of data entered into a spreadsheet.

Table 4 Monthly revenue for four


years.

The annual totals are plotted as a bar chart in Figure 7.


The upward trend looks very encouraging. The Marketing
Manager would certainly want to take credit for this.

Figure 7 Excel’s popular bar chart/


trend line combination is like Romantic
poetry. This poetic license gives
everyone the freedom to take credit
for good results, whether or not they
are true.

All Figure 7 really does is graphically frame the differences


between the annual totals. Because Laws of the Universe are
ignored, there is no way to tell whether the “trend” is a profit
signal or noise. This is a bit like trying to ignore gravity.

Corporate cultures that use cost-accounting variance analysis
as the standard decision-making tool often use bar charts and
trend lines to present “results” like Figure 7 based on data
like that in Table 4. The credibility of the results portrayed
by the chart, and the explanation for them, comes from the
status of the person telling the story rather than the evidence
in the data.

There is no cross-examination of the reported results because


it is considered poor form, not to mention career limiting, to
question the President, Managing Director, Chief Financial
Officer, or a company founder who created spreadsheet
software.

In corporate cultures that base decisions on objective


standards of evidence, the analysis method itself is held to
high standards. Quite simply, it must follow the Laws of the
Universe. It must follow the rules of vector analysis.

Evidence is admissible if and only if the analysis method


takes all aspects of the data into account. The analysis must
have transparency. All elements must be available for review,
including the raw data. Anyone can ask any question because
all the data are in view. The vector analyses illustrated below
represent the international standard.

There are several things wrong with the “analysis” in Figure


7. For one thing, it uses only the annual totals instead of the
original monthly data. For purposes of illustration, we will
present two vector analyses that use only the four annual
totals. The first of these is given in Table 5.

Table 5 lays out the basic vector calculations for the sample
standard deviation, s. In this case s = 0.14. This is a vector
analysis in four-dimensional hyperspace, because there are
four data points.

The data average vector has one degree of freedom because


one number, the average of the four data points, determines
it. This leaves three degrees of freedom for the variation
vector. The lengths of the vectors are related by the New
Management Equation. [c² does equal a² plus b²: 140.36 =
140.30 + 0.06.]

We used Microsoft Excel to create the visual presentation in
Table 5. The squared lengths of the vectors were calculated
by using the cell function SUMSQ. This function name is
short for “sum of squares”. This is appropriate because the
squared length of a vector is the sum of the squares of the
coordinates. The syntax for the Excel calculation is:

= SUMSQ(cell range)
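The same bookkeeping can be sketched in Python. The four annual totals below are hypothetical placeholders, not the figures behind Table 5; only the method is the same.

    import numpy as np

    totals = np.array([5.80, 5.95, 5.90, 6.03])    # hypothetical annual totals, millions of dollars

    data_ss = np.sum(totals ** 2)                  # squared length of the data vector (Excel SUMSQ)
    average_vector = np.full_like(totals, totals.mean())
    average_ss = np.sum(average_vector ** 2)       # squared length of the data average vector
    variation_ss = data_ss - average_ss            # squared length of the variation vector
    s = np.sqrt(variation_ss / (len(totals) - 1))  # sample standard deviation, 3 degrees of freedom

    print(round(data_ss, 2), round(average_ss, 2), round(variation_ss, 4), round(s, 2))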

Table 5 Basic vector analysis of the four annual totals (millions
of dollars). The data vector is split into the data average vector
plus the variation vector. The squared lengths (sums of squares)
satisfy the New Management Equation, 140.36 = 140.30 + 0.06,
and the sample standard deviation is s = 0.14.

Figure 8 shows the Normal distribution curve corresponding


to a mean of $5.92 million and a sample standard deviation
of $0.14 million. The dots just above the horizontal axis
represent the four annual totals. Each vertical dotted line
represents one standard deviation.

Figure 8 The four annual totals from Table 4 and the
corresponding Normal distribution curve. The horizontal axis is
marked in standard deviations (s = 0.14), from -3s at 5.50
through the mean at 5.92 to +3s at 6.34.

All four data points lie within two standard deviations of the
mean. We must conclude that the deviations from the mean
value are a result of natural, or Chance, variation. There is
certainly no evidence of significant differences among these
totals.

Our second vector analysis addresses directly the validity of


the bar graph trend line in Figure 7. The null hypothesis for
this analysis is the following statement:

There is no significant trend in the annual totals.

This is not a foregone conclusion. It is a special kind of


hypothesis. It is used in applied research all over the world.
The idea is to see whether or not the evidence in the data is
strong enough to discredit the null hypothesis. Then, and
only then, can we say there is a significant trend in the annual
totals. The visual presentation of the analysis is shown in
Table 6.

The variation vector is broken up into the sum of profit


signal and noise vectors. These three vectors are related by the
New Management Equation (a.k.a. Pythagorean Theorem).
The squared lengths of the vectors are also called “sums of
squares.”

The profit signal vector is equal to the best-fit line in Figure


7 minus the data average, 5.923 in this case. The coordinates
of the profit signal always add up to zero, so it is completely
determined by the slope of the best-fit line. As a result, the
profit signal vector has one degree of freedom. That leaves two
degrees of freedom for the noise vector.

To get the profit signal and noise variances, we divide the


sums of squares by the degrees of freedom. This is a Law of
the Universe. Without this adjustment, the variances would be
biased.

When we divide the profit signal variance by the noise


variance we get a signal-to-noise ratio that measures the
strength of evidence against the null hypothesis. It is called
the F ratio, or F statistic, because Ronald Fisher invented it.
Larger values of F imply stronger evidence against the null
hypothesis. In this case F = 2.843.


Table 6 Illustration of the vector analysis for a linear trend in
the four annual totals. The squared lengths of the vectors are
also called “sums of squares.” This is a reference to the New
Management Equation, which involves a sum of squared numbers.

This number doesn’t seem very large. But there is no standard
scale of comparison for the F ratio. Instead, we interpret
it relative to a statistical distribution representing chance
variation. This distribution depends on the degrees of
freedom for the profit signal and noise vectors. The p-value
of 0.234 in Table 6 is the probability of getting an F ratio as
large as 2.843 by chance alone.

If the p-value is small enough, we reject the null hypothesis. By established international standards, the evidence against the null hypothesis is ‘clear and convincing’ if the p-value is less than 0.05. If the p-value is greater than 0.05 but less than 0.15, there is a ‘preponderance of evidence’ against the null hypothesis. The p-value in Table 6 does not meet even this lowest standard of evidence. There is no significant trend.

Table 7 shows the monthly revenue numbers in data matrix format. This data set is too large to use as a tutorial. We present some smaller examples in Chapter 5.

Table 7 The monthly revenue numbers in data matrix format (thousands of dollars).

Meanwhile, a great deal can be learned simply by plotting
the data in time sequence. This is done in Figure 9.
It doesn’t take a Statistician to see that there is no trend here, just random variation. The only features of note are the three low points at the beginning of the series. It turns out these were the last three months before a change in the accounting procedures. They should have been omitted from the analysis.

Figure 9 The monthly revenue numbers plotted in time sequence.
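For readers who want to reproduce this kind of run chart outside a statistical package, a minimal matplotlib sketch follows; the monthly figures in it are hypothetical placeholders, not the Table 7 values.

import matplotlib.pyplot as plt

monthly_revenue = [465, 470, 468, 495, 502, 498, 505, 492, 507, 500, 511, 496]  # hypothetical, $ thousands

plt.plot(range(1, len(monthly_revenue) + 1), monthly_revenue, marker="o")
plt.xlabel("Month")
plt.ylabel("Revenue ($ thousands)")
plt.title("Monthly revenue plotted in time sequence")
plt.show()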

The Game is Afoot

Another example of a full vector analysis is the shoe-sole wear rate workshop in the classic 1978 text Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building by George Box, William Hunter and J. Stuart Hunter. This example uses the small data set presented in their book, with an invented story line based on our consulting experiences.20 It achieves the following objectives:

1. You can quickly see the differences between a typical spreadsheet analysis and vector analysis applied to a data matrix.

2. The manufacturing design, cost and margin analogies are appropriate.

A design team is arguing over the wear rates of shoe-sole materials A and B. Material A, the current specification, is more costly than Material B. The manager wants to go with Material B because it is cheaper, and his spreadsheet analysis shows there will be no significant loss of durability. Engineers are concerned that Material B is not sufficiently durable. Data has been collected and arrayed in a spreadsheet. (See Figure 10.)

Figure 10 Wear rate data as arrayed in Excel.

Ten boys were enlisted for the test. Each boy wore one shoe
made from Material A and one from Material B. Coin tosses
were used to randomly assign Material A to the left or right
foot for each boy.

The average wear rate for Material B comes out 0.41 units
higher than for Material A, an increase of 3.86%. Given
the price difference between the two materials, the manager
concludes that the difference in durability is irrelevant.

Furthermore, as shown by the bar chart in Figure 11, there were a number of cases where Material A actually wore out faster than Material B! The manager is elated. By using material B instead of A, the shoe manufacturer can increase profit margins and maintain product durability. This change will be worth millions to the bottom line.

Figure 11 Wear rate data as analyzed by a spreadsheet bar graph.

After a long and difficult team meeting, consensus is reached. The company will replace material A with the less costly, equally durable material B.

As the meeting is wrapping up, a Six Sigma Black Belt in training asks if she can analyze the data herself using a vector analysis applied to a data matrix. It is getting late. People have places to go, things to do. Nevertheless, to maintain good relationships, they give her five minutes.

She imports their Excel spreadsheet into her statistical package. For present purposes, we recreate her vector analysis data in Excel. This is shown in Table 8. (We timed both methods. The Excel reconstruction literally took 10 times longer than doing a correct vector analysis in the statistical package.)

Table 8 Vector analysis of the wear-rate data.

The Black Belt trainee starts her extemporaneous presentation by stating the null hypothesis for the analysis: “There is no difference between the average wear rates of the two materials.” The trainee explains that this is a hypothesis, a straw man to be pulled apart by evidence, rather than a foregone conclusion. The idea is to see whether or not the evidence in the data is strong enough to discredit the null hypothesis.

She goes on to explain that we should be looking at the differences between A and B for each boy—that was the whole point of having each boy wear one shoe of each kind. If the null hypothesis were true, the differences should be symmetrically distributed around zero. Also, the average difference should be close to zero.

With three clicks of her mouse, the trainee produces a frequency histogram of the differences (see Figure 12). Pointing at the graph, she says, “As you can see, all but two of the differences are positive. This casts doubt on the null hypothesis—the wear rates for Material B are consistently higher than those for Material A.

Figure 12 Frequency histogram of differences in wear rate (B minus A).

“But let’s not jump to conclusions. We need to complete the vector analysis to establish the strength of this evidence. As you can see, the vector analysis (Table 8) breaks the vector of differences into the sum of the data average vector and the noise vector. For analyzing matched pairs like we have here, the data average and the profit signal vector are one and the same.

“As you can see, the lengths of the vector of differences, the profit signal vector and noise vector are related by The New Management Equation. The vectors are 10-dimensional because there are 10 differences. We are way into hyperspace. The profit signal vector is determined by one number, the average difference of 0.41, so it has one degree of freedom. That leaves nine degrees of freedom for the noise vector.”

Her presentation was interrupted by one of her friends. “Let’s take a pause for just a moment here to do a little yoga stretching while our minds are bending.” After some uncomfortable laughter, the Black Belt’s Six Sigma analysis continued.

“OK. We are back on task. We have to adjust the New Management Equation (a.k.a. sums of squares a.k.a. squared lengths of vectors) by dividing by the degrees of freedom.

“This gives us Variances that measure the strength of the profit signal and noise vectors. When we divide the profit signal variance by the noise variance we get a signal-to-noise ratio that measures the strength of evidence against the null hypothesis. It is called the F ratio because a guy named Fisher a long time ago invented it. As you can see, the F ratio in this case is 11.215.

“The F ratio can’t be interpreted on its own. We have to compare it to a distribution to see how likely it is that a value as large as 11.215 could have occurred by chance alone. This probability is called the p-value. If the p-value is small enough, we have to reject the null hypothesis.

“By established international standards, the evidence against the null hypothesis is ‘clear and convincing’ if the p-value is less than 0.05, and it is ‘beyond a reasonable doubt’ if the p-value is less than 0.01 (Table 9). As you can see,” she said, pointing at her computer screen, “the p-value in this case is 0.0085. This means there is a significant difference between A and B, beyond a reasonable doubt.”
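Her matched-pairs calculation can also be sketched in Python. This is a minimal illustration, assuming a hypothetical list of B-minus-A differences with the same 0.41 average; the actual differences in Table 8 yield F = 11.215 and p = 0.0085, as she reports.

from scipy import stats

# Hypothetical B-minus-A wear-rate differences, one per boy; substitute
# the ten differences from Table 8 for the real analysis.
diffs = [0.8, 0.6, 0.3, -0.1, 1.1, -0.2, 0.3, 0.5, 0.5, 0.3]

n = len(diffs)
avg = sum(diffs) / n

# Profit signal vector: the data average repeated n times (1 degree of freedom).
signal = [avg] * n
# Noise vector: each difference minus the average (n - 1 degrees of freedom).
noise = [d - avg for d in diffs]

f_ratio = (sum(s * s for s in signal) / 1) / (sum(e * e for e in noise) / (n - 1))
p_value = stats.f.sf(f_ratio, 1, n - 1)

print(avg, f_ratio, p_value)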

Table 9 The Black Belt showed the table of evidence to the team.

One engineer says, “That makes a lot of sense. Even though the difference was less than 4%, we felt that a difference of 0.41 units could cause problems. We were afraid yield losses would exceed the savings on material costs.”

A potential disaster is narrowly averted by using an evidence-based decision in the nick of time. Critical-to-quality characteristics and financial margins are protected. The company’s reputation for quality is preserved. Just another day in the life of a Six Sigma company.

The next Black Belt, Green Belt, Yellow Belt and Champion
courses are filled to capacity. The waiting lists for the
following sessions are long. The company takes the next step
forward by implementing Six Sigma across all projects and
functional responsibilities in the corporate matrix. Their first-
wave Black Belts are now in Master Black Belt training using
their own case studies.

Spreadsheet versus Data Matrix

Spreadsheet arithmetic is today’s cost-accounting variance analysis computing engine. While teaching the real Analysis of Variance we often hear the comment, “So what’s the big deal with a data matrix? You can do all that in a spreadsheet.” This is true. We know because we have done it. Some of that work has been presented in this chapter. There is more to come in Chapters 5 and 6.

It is also true that you could eventually compute the orbital trajectories of all the planets in our solar system with an abacus.21 Unless you and your loved ones have nothing better to do with the rest of your lives, our question is this, “Why would anyone want to?”

The spreadsheet is a marvelous invention. It automates
arithmetic. You can write formulas. Like Keats, you can put
your data wherever you want it and analyze it however you
want. If you add in enough add-ins, you can actually do some
statistics, even Analysis of Variance.

Adding in the add-ins is a clumsy way of trying to reinvent the machinery of a vector analysis that already exists in modern statistical software. These programs give you access to this machinery with a mouse click.

The greater liability in trying to do everything with a spreadsheet stems from the very freedom that makes spreadsheets so popular. Spreadsheet applications are unruly and Lawless. The Laws of the Universe do not apply to them. Statistical packages, on the other hand, follow the Law. They require the correct data matrix structure—each row an object of interest, each column a vector of data on the objects.

Data vectors are the principal components of vector analysis. (Hence the name.) Vector analysis provides the transparency required to satisfy international accounting standards and scientific standards of evidence. For example, statistical packages automatically create the variation, profit signal and noise vectors shown in Tables 5, 6 and 8.

One can create this table in a spreadsheet, although it is tedious. We did in fact make Tables 5, 6 and 8 in a spreadsheet, but nothing forces other users to do so. The undemanding nature of spreadsheets lures unsuspecting users into sins of omission. There is no requirement for vector analysis, no requirement for transparency.

Other spreadsheet characteristics are simply inconvenient or annoying. For example, many spreadsheet functions treat blank cells as zeroes. This works fine for adding and subtracting. In reality, a blank cell indicates a missing value in a data vector. A missing value changes the degrees of freedom and dimension of the vector. The vector analysis can handle this, although it does affect the results. By contrast, the cavalier insertion of zeroes for missing values wreaks havoc on vector analysis, giving incorrect results.
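A small Python sketch, with hypothetical numbers, shows how much damage a silently inserted zero can do compared with simply dropping the missing value.

import statistics

# Hypothetical data vector with one missing value.
values = [10.2, 9.8, None, 10.5, 9.9]

# Treating the blank as zero, the way many spreadsheet functions do:
as_zero = [v if v is not None else 0.0 for v in values]
# Dropping the missing value, which also reduces the degrees of freedom:
dropped = [v for v in values if v is not None]

print(statistics.mean(as_zero), statistics.stdev(as_zero))   # badly distorted
print(statistics.mean(dropped), statistics.stdev(dropped))   # correct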

These and related comments are summarized in Table 10.

Table 10 Comparing and contrasting a spreadsheet and a data matrix. Inductive and deductive reasoning are built into data matrix software. No such discipline exists in a spreadsheet.

P-values, Profit Signals, Confidence Levels and Standards of Evidence

A null hypothesis always consists of a negative assertion. The phrasing of a null hypothesis is not a law of the universe, but it is an odd standard. Here are some examples:

• There is no difference between these two ways of doing things.
• There are no differences among these three or more ways of doing things.
• There is no relationship between these two variables.
• There are no relationships among these three or more variables.
• There are no relationships between these two groups of variables.

The null hypothesis often plays the role of a “straw man” in inductive reasoning. According to the on-line folklore database Wikipedia, the straw man concept began as a rodeo safety tactic.22 A straw man would distract bulls. It could be torn apart with no harm done. We can tear apart the straw man, the null hypothesis, if it is something we would like to disprove based on the data.

The F ratio, or F statistic, is a signal-to-noise ratio that
measures the strength of evidence in the data against the null
hypothesis. As the F ratio increases, the strength of evidence
against the null hypothesis increases. We evaluate an F ratio
by comparing it to a statistical distribution to see how likely
it is that a value that large could have occurred by chance
alone.

The distribution to which the F ratio is compared depends on the degrees of freedom for the profit-signal and noise vectors. As a result, there is no standard scale of comparison for the F ratio.

We get around this by working with a probability computed from the F value. This probability, called the p-value, is the probability of getting an F ratio as large as the value we got by chance alone. If the p-value is small enough, we reject the null hypothesis.

In Microsoft Excel, the cell formula syntax for calculating the p-value is this:

= FDIST(value of F ratio, degrees of freedom for the profit signal vector, degrees of freedom for the noise vector)

For example, the formula to produce the p-value 0.0085 in Table 8 is as follows:

= FDIST(11.215, 1, 9)

11.215 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 9 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.0085.

The formula to produce the p-value 0.234 in Table 6 is as follows:

= FDIST(2.843, 1, 2)

2.843 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 2 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.234.

We do not like writing spreadsheet formulas. We do like the
fact that statistical software does it for us automatically.
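For readers who prefer a scripting language to spreadsheet formulas, a minimal Python sketch using scipy returns the same two p-values.

from scipy import stats

# Equivalent of Excel's =FDIST(F ratio, signal df, noise df)
print(stats.f.sf(11.215, 1, 9))   # approximately 0.0085, as in Table 8
print(stats.f.sf(2.843, 1, 2))    # approximately 0.234, as in Table 6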

As the F ratio increases, the p-value decreases. As the p-value decreases, the strength of evidence against the null hypothesis increases. This tends to confuse people. It is easier to think in terms of confidence levels (Table 11). The confidence level is one minus the p-value, usually expressed as a percentage. As the confidence level increases, the strength of evidence against the null hypothesis increases.

Table 11 Standards of evidence in a nutshell. A p-value less than 0.05 yields a confidence level greater than 95%. A p-value less than 0.01 yields a confidence level greater than 99%.

Closing Arguments

Themis is the Blind Lady of Justice in Greek mythology.

Themis: “As an oracle, I used to advise Zeus when he made decisions. I did my job so well I became the goddess of divine justice. You can see from some of my portraits that I used to carry a sword in one hand and a set of scales in the other. The blindfold I wore was more than a fashion statement. It meant I would be fair and equitable in my judgments. My whole existence hinges on objective standards of evidence.”23

Endnotes

1. American Heritage Dictionary of the English Language, Third Edition. Boston: Houghton Mifflin Company, 1992.

2. Dawkins, Richard. Unweaving the Rainbow: Science, Delusion and the Appetite for Wonder. Boston: Houghton Mifflin Company, 1998.

3. Huff, Darrell and Geis, Irving. How to Lie with Statistics. New York: W.W. Norton and Company, 1954.

4. Taylor, Frederick Winslow. Scientific Management. Mineola: Dover Press, 1998. Pages 55-59. The original 1911 version was published by Harper and Brothers, New York and London.

5. Oxford English Dictionary, 1989.

6. Garrison, Ray H. and Noreen, Eric W. Managerial Accounting, 10th Edition. Boston: McGraw-Hill Irwin, 2003. Page 431.

7. Harrison, G. Charter. Cost Accounting to Aid Production – I. Application of Scientific Management Principles. Industrial Management, The Engineering Magazine, Volume LVI, No. 4, October 1918.

8. Harrison, G. Charter. Cost Accounting to Aid Production – I. Standards and Standard Costs. Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918.

9. Harrison, G. Charter. Cost Accounting to Aid Production – I. The Universal Law System. Industrial Management, The Engineering Magazine, Volume LVI, No. 6, December 1918.

10. Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Boston: Harvard Business School Press, 1991. Pages 10-12.

11. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 15.

12. MacKay, Charles. Memoirs of Extraordinarily Popular Delusions. Copyright 2002 eBookMall version available for $1.75. http://www.ebookmall.com/alpha-authors/m-authors/Charles-MacKay.htm

13. MacKay, Charles. Memoirs of Extraordinarily Popular Delusions. Copyright 2002 eBookMall version available for $1.75. http://www.ebookmall.com/alpha-authors/m-authors/Charles-MacKay.htm Page 8.

14. Gardner, Martin. Fads and Fallacies in the Name of Science. New York: Dover Press, 1957. Page 106.

15. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 241.

16. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 941.

17. Oxford English Dictionary, 1989.

18. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Page 97.

19. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Pages 100-102.

20. Box, George E.P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley & Sons, 1978.

21. Dilson, Jesse. The Abacus, The World's First Computing System: Where it Comes From, How it Works, and How to Use it to Perform Mathematical Feats, Large and Small. New York: St. Martin's Press, 1968.

22. http://www.wikipedia.org/wiki/Straw_man

23. http://www.commonlaw.com/Justice.html



Chapter 3

Evidence-based Six Sigma

Six Sigma (6σ) is a proven, pursuit-of-perfection business initiative that creates breakthroughs in profitability, productivity, and quality. It is a highly structured, project-by-project way to generate bottom line results. It produces significant dollar value through a never-ending series of breakthrough projects. Evidence-based decisions characterize the 18-year, 6σ record of accomplishment.

The essential elements of Six Sigma breakthrough projects are vector analyses applied to data matrices.

Hundreds of millions of dollars have been placed directly onto the bottom line of companies around the world using this improvement model and its tool set. Though large multi-national corporate results have attracted the most media attention, we have personally seen a 26-employee plastic pressure and vacuum forming company achieve proportionally identical results.

Six Sigma knowledge and know-how have evolved since the notion of perfect 6σ quality was first conceived by Motorola engineer Bill Smith. Motorola’s Chief Executive Officer at the time, Robert Galvin, was the first Six Sigma Champion. He enthusiastically led the entire program. He personally removed bureaucratic obstacles to breakthrough improvements.

Six Sigma became an education and training commodity during the late 1990s. It gains momentum as it matures.

The catchy three syllable “Six Sigma” moniker is value-added
packaging for vector analysis and objective evidence. Six
Sigma also conveys substance. Wall Street likes 6σ because it
ties customer satisfaction directly to corporate profitability.
Customer satisfaction, quality information, speed, and lean
organizational structures are Six Sigma cultural values. What
is valued gets measured, analyzed, and rewarded.

Six Sigma measurements are recorded in data matrices. Since data matrix applications are essential to vector analysis, every true 6σ company has its own corporate software standards. Every Six Sigma champion executive and, if she or he expects to be promoted, every manager in a Six Sigma company has data matrix software loaded on their personal computers. Though many products are available, two currently dominate the market: Minitab and JMP.

A Six Sigma analysis is a vector analysis applied to a data matrix. As we graphically detailed in Chapter 2, Six Sigma gets its name from the vector analysis results. This analytic process is sometimes called an Analysis of Variance, or ANOVA. Since the acronym and its equations are traditionally presented in ways that are guaranteed to bore even motivated academics, calling them Six Sigma Tools has worked wonders. Corporate executives embrace them even though only a few know what the phrase and acronym mean. That is a remarkable accomplishment in anyone’s marketing book of records.

An ANOVA breaks raw data into six vectors (Figure 1). Two are priceless business intelligence commodities: 1) Profit Signals and 2) Noise. (Historically, Profit Signals have been called “treatment deviations.” That appealed to engineers and statisticians. The mass market of Six Sigma calls for better branding. We answered that call.1)

Computing power transforms what was once an almost impossibly difficult series of matrix algebra calculations into a single computer command “Run Model.”

Anyone who wants to correctly analyze measurement data can now do so in seconds. When a company combines computing power, the principles of accelerated adult learning and hands-on improvement projects, breakthroughs routinely lead to quantum leaps in profitability.

Figure 1 A complete analysis is composed of six vectors. Profit Signals quantify what matters most.

Six Sigma (6σ) Basics

Here is the bullet list of Six Sigma basics. The jargon side of this business initiative is as real as it is regrettable. Acronyms and algebraic symbols are Six Sigma grammar. We identify these hieroglyphics as a courtesy orientation to newcomers.

1. Top-level executives personally lead the Six Sigma initiative in highly visible ways. Authentic 6σ executives eschew the use of spreadsheet bar graphs and pie charts. Correct, rule driven analyses of financial and productivity data are evident in Six Sigma executive presentations. Executive compensation and promotion are tied to the use of data-driven, evidence-based decisions. The litmus test of leadership is the replication of high dollar value breakthrough projects.

If an executive champion does not meet the challenge of these responsibilities, the Six Sigma initiative will fail to produce promised results.

2. Education and skill training in the recognized body of knowledge (BOK) permeate Six Sigma organizations.2 Computing literacy, which means decision makers know how to use a vector analysis applied to a data matrix, is an expected competency for every leader.

3. Exponential rates of improvement are an expected outcome. New ways of getting work done, with fewer resources, and in a fraction of the time required by previous methods, take precedence over incremental process improvements.

4. Measurements and Six Sigma metrics are tied to short-term and long-term financial performance.

Executive Six Sigma leaders allocate significant personal time and resources for 6σ projects. In addition to their own investments, they assign the company’s most capable people full-time to lead Six Sigma breakthrough projects. The Executive’s job is to remove bureaucratic roadblocks to improvement so that managers who have an aptitude for implementing productive changes can succeed.

The corporate Six Sigma job description hierarchy resembles titles earned in a martial arts dojo. Full-time Six Sigma professionals, called Black Belts, are expected to be able to “kick the heck out of” any variation that leads to waste or rework.3 In addition to a Karate/Tai Kwan Do/Kung Fu/Judo level of intellectual aggressiveness, Black Belts must demonstrate leadership and good interpersonal skills. They must be masters of evidence-based decision principles.

Ideally, sensei executive champions coach and mentor 9th degree Master Black Belts, who in turn coach, mentor and lead Black Belts. Black Belts then coach and supervise Green Belts and Yellow Belts. Education and training permeate the organization. Eventually every employee actively contributes to the production of breakthrough project results: cold cash to the bottom line.

The Six Sigma Profit Strategy

Six Sigma improves profits by aiming at perfect products, services, and processes. In a 6σ culture, everyone is expected to enthusiastically argue in favor of perfection. A passionate work ethic attitude carries weight in a Six Sigma culture. Protests over the possibility of a “diminishing rate of return” indicate an individual does not understand 6σ fundamentals.

The lower case Greek letter, σ, is pronounced ‘sigma.’ In the professional world, σ is the symbol for the population standard deviation. The sample standard deviation, along with the five other elements in a complete vector analysis, comes from raw data. It quantifies the amount of random or chance variation that occurs around the average in any, and every, given set of data. To understand and embrace the universal Generalization of Chance Variation is to enter the world of Six Sigma. Try the following experiment to demonstrate this physical law for yourself.

First, find a friend you admire. Choose someone with whom you can discuss controversial information. Now, each of you needs to print the letter “a” 10 times on a piece of paper in the exact same way with no variation.4 Go on. Try it.

This exercise is a trick. The task is completely impossible. Differences in writing tools, variations in ink, paper texture, handedness, fatigue, font, attention span, concentration, your interpretation of our instructions, and an infinite number of other variables all contribute to natural variation. Natural variation is present everywhere and always. It is ubiquitous. It is a law of our universe, as powerful as gravity. Every good product and every service suffers from the inconsistencies caused by variation.

I. Bernard Cohen, the eminent historian, considers knowledge of Chance and/or statistical variation to be the distinguishing characteristic of our generation’s Scientific Revolution. “If I had to choose a single intellectual characteristic that would apply to the contribution of Maxwell [though not directly to his revolutionary field theory], Einstein [but not the revolution of relativity], quantum mechanics and also genetics, that feature would be probability.”5 We agree.

This Six Sigma Revolution in business and science is defined by evidence that is based on Probability rather than determinism.6 Like it or not, probability overthrows old doctrine. There is no polite way to summarize the impact variation has on an individual’s world view. Probability, dressed up in the Six Sigma costume, is replacing old ways of knowing—revelation, intuition, and reason—with the disciplined analysis of experimental observations.

Six Sigma unifies the scientific method and business. Evidence-based decisions and the power in a vector analysis are the router connections between the two disciplines. In answer to the meta-questions, “Does this Six Sigma stuff really work?” and, “Can you prove it by replicating your results?” the answer is unequivocally, “You bet.”

With any and every set of raw data we can construct a tetrahedron, the cornerstone of statistical evidence. When a standard deviation is combined with an average, we can make valuable predictions based on a family of probability curves and surfaces (Figure 2). When one knows the average and standard deviation (σ) of a process, one can improve that process to near perfect, 6σ, performance. Perfect quality first time every time is valuable. This value can be measured with money.

Figure 2 Data matrix software automatically transforms the cornerstone of evidence into probability distributions.

Figure 3 illustrates old school 1980s corporate Quality Improvement (QI) aims. Way back then, ‘three-sigma’ quality was the target.7, 8 This means that the 6σ total process spread just fits between the lower and upper specification limits (LSL and USL). At best, this means that 99.7% of process outcomes satisfy customer requirements. This near 100% quality sounds better than it is. Recall the unacceptably wide variation in the prior chapter’s bar chart bamboozling comparison. At its best, a three-sigma 99.7% distribution promises ‘only’ 2,700 defective outcomes per million produced.

A three sigma process may actually produce as many as 67,000 mistakes or defects per million (DPM). This is because processes typically drift by about 1.5 standard deviations around their long term average.
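The defect rates quoted in this section can be checked with the Normal distribution. The sketch below is a minimal Python illustration that assumes the conventional 1.5 standard deviation long-term drift; it is not taken from the book's own calculations.

from scipy import stats

def defects_per_million(sigma_level, shift=1.5):
    # Fraction of outcomes falling outside specification limits placed
    # sigma_level standard deviations from the target, after the process
    # mean drifts by `shift` standard deviations.
    upper_tail = stats.norm.sf(sigma_level - shift)
    lower_tail = stats.norm.cdf(-sigma_level - shift)
    return (upper_tail + lower_tail) * 1_000_000

print(defects_per_million(3, shift=0.0))   # about 2,700 DPM for a centered three-sigma process
print(defects_per_million(3))              # about 66,800 DPM once the process drifts
print(defects_per_million(6))              # about 3.4 DPM at Six Sigma with the same drift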

Figure 3 Three-sigma quality means that the 6σ total process spread just fits between the lower and upper specification limits (LSL and USL). At best this means that 99.7% of process outcomes satisfy customer requirements.

To put these numbers into perspective, ‘three-sigma’ aviation safety would mean several airline crashes each week. In health care, it would mean 15,000 dropped newborn babies per year. Banks would lose thousands of checks daily. As it is, three sigma (3σ) quality costs businesses between 25-40% of their annual operating income in waste and rework.

Six Sigma breakthrough projects aim to reduce the standard deviation. High-leverage processes that affect business, manufacturing, or health care delivery are the prime targets. The Six Sigma one-part-per-billion (PPB) bell curve in Figure 4 covers only one-half of the specification range. This illustrates the effect of a smaller standard deviation, σ, and the dramatic financial benefit of reducing it.

Figure 4 A Six Sigma capable distribution covers only one half of the specification range.

Even when the process drifts, only 3-4 defective outcomes per million (DPM) can occur. In a σ = $1.00 example, a Six Sigma breakthrough would result in a standard deviation that equaled $0.50 or less. When this goal of perfection is achieved, costs related to waste, rework, inelegant designs, and needless complexity disappear.

The proven rewards for achieving 6σ are, 1) excited customers and 2) improved profits. Historically, each Six Sigma project generates a 100-250K benefit. Full-time corporate 6σ Experts, Black Belts who currently earn about 120K in salary and benefits, lead three to four projects per year that generate $1 million in hard dollar, bottom line business benefit. This 10:1 rate of return is so dependable it has become a tradition.

Prior to the development of Six Sigma in the late 1980s, the only people earning their livings full time using these tools for breakthrough projects were consultants. We were the only ones willing to study out-of-date textbooks, use handheld calculators, rulers, graph paper, and DOS programs.

Thank heavens those days are behind all of us now. Anyone and everyone can enjoy the benefits of vector analysis applied to a data matrix. Six Sigma style profits are now a matter of personal choice.

The Lucrative Project Results Map

Flow diagrams and process maps simplify work. They make hidden process dynamics visible. Seeing waste and complexity helps people eliminate both. Flow diagrams like Figure 5 can also be used to create processes that produce perfect results. To read the diagram, begin with the hard copy documentation symbol at the upper left hand corner. Follow the arrows through each of the four levels to the right hand page bottom.

The acronym used to describe the classic 6σ process is DMAIC. DMAIC stands for the iterative 6σ project cycle of Define, Measure, Analyze, Improve, and Control. Once a project is completed, the process described by this map begins again. This cycle never ends.

Figure 5 This flow chart has guided projects toward bottom line business results for years.


Define, Measure, Analyze, Improve, Control

The voice of the customer (VOC), customer satisfaction and profit goals come first and last in the Six Sigma DMAIC cycle of improvement. The map marks the boundary of each phase. As 6σ breakthroughs help companies surpass quarterly and annual financial targets, long-term objectives are continuously upgraded to sustain momentum.

The series of five steps in the top row and the two final steps
in the bottom row are top-level management and leadership
responsibilities. The middle three levels are Black Belt project
tasks. Each of these steps takes time, so every 6σ project result
needs to be substantial and financial.

Results interpretation, improvement and control require close collaboration between top-level leaders and Black Belts. The process of interpreting statistical results, making an evidence-based decision, optimizing a system, and implementing improvements can and does flatten bureaucracy. Occasionally organizations that value bureaucracy manage to “do Six Sigma” while they find ways to sustain paperwork, committees, and supervisory redundancy. Six Sigma window dressing is immediately apparent to any knowledgeable observer.

We advise potential clients who are fond of their bureaucracies to stick with Old School Management methods. Evidence-based decisions and Six Sigma will bring them nothing but trouble. Employees will openly question executives. Cost-accounting reports and risky capital investment Proformas will be challenged with physical models.

Six Sigma programs are seen as disruptive when a business values group think. Don’t laugh. Many do. The ones we have worked with and for are populated with delightful, friendly people. These folks just happen to draw an interesting set of Six Sigma project boundaries. Senior management processes and decisions are off limits. “Don’t go there.”

In companies with a full commitment to evidence-based decisions, there is broad-based organizational involvement. This commitment is the key to perpetual breakthrough project success at the highest levels of the company.

Six Sigma employs just about every effective management
tool that has ever been developed. Any project management
tool you can think of that has proven to be useful is now
called a Six Sigma Tool. For example, the project management
chart developed by Henry L. Gantt in 1917, called a Gantt
chart, is still useful and very much in vogue.9

A PERT (Program Evaluation and Review Technique) chart, which provides an alternate Gantt chart view of a project, is also popular.

In actual practice Six Sigma focuses relentlessly on completing projects within 90-120 days. Experience shows that if a Six Sigma project improvement team fails to deliver bottom line business dollar value within this time frame, organizational commitment to 6σ instantly wanes.

We saw a most eloquent occurrence of this phenomenon in a CEO’s behavior. After a few months of Six Sigma hoopla, people noticed that he wasn’t using evidence unless it supported the foregone corporate agenda. Projects were not being completed on time. Resistance to evidence-based decisions grew. One day, he casually observed to the vice president in charge of implementing Six Sigma, “Six Sigma is ephemeral.” The VP looked the word up and discovered, to his dismay, ephemeral means “dead in a day.”

As a side note, it was interesting to see this Six Sigma initiative generate about $6 million in bottom line benefits by the year’s end. The dollar per dollar return on investment, ROI, was only 5:1. Nevertheless it was informative to watch a Black Belt compete, particularly an experienced Master Black Belt. They do what it takes to bring home the bacon.

Though it is management’s responsibility to keep the improvement fires burning, project delays and passive criticism are favored benign neglect techniques.

Old school managers can and do successfully use neglect to sabotage Six Sigma. Make no mistake. The Six Sigma field is littered with the corpses of failed Black Belt Projects. The successful disruption of projects generally returns the culture to less demanding performance standards. The Institution of Old School Management thinking does not surrender until it is surrounded and expelled.

Therefore, based on experience, we strongly recommend that once a company commits to evidence-based decisions, it stay focused on the money and deadlines.

Table 1 You can program a spreadsheet to help you choose best projects.10

Lucrative Project Selection

Selecting and prioritizing the most rewarding projects is a most important first step. Since time is money and money is time, the selection process must be efficient and fast. Table 1 is a simple, virtually universal project evaluation spreadsheet that has emerged as a favorite around the United States. Six Sigma team leaders put breakthrough improvement project ideas into this hopper. The hopper is always open, but depending on the culture, new projects are usually given serious review at quarterly and annual intervals.

During project review meetings, each idea is ranked from low to high, 1-10, in five or more categories. These values are multiplied to create a priority rating. This project prioritization process promotes a consensus style agreement that has some quantitative structure to it. We have seen it improve interpersonal working relationships as it generates lists of breakthrough project targets. The suggested project with the highest total project priority number is first and so on.
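The multiply-the-rankings arithmetic is easy to sketch in Python; the project names, category count and scores below are invented purely for illustration.

# Each project idea gets a 1-10 ranking in each category; the product of
# the rankings is the priority rating used to order the project hopper.
projects = {
    "Reduce scrap on line 3":    [8, 7, 9, 6, 8],
    "Shorten order-entry cycle": [6, 9, 7, 8, 5],
    "Cut billing errors":        [9, 5, 6, 7, 7],
}

def priority(scores):
    rating = 1
    for s in scores:
        rating *= s
    return rating

for name, scores in sorted(projects.items(), key=lambda kv: priority(kv[1]), reverse=True):
    print(name, priority(scores))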

Clear operational definitions are a fundamental part of project selection. The project and its related issues must be defined operationally. Operational definitions must be practical. If senior management, the project champion, or the Black Belts who are assigned to the projects do not share a common understanding of these definitions, problems arise.

For example, write down your definition of the word ‘pan.’ Good work. Your definition is correct. So are at least 20 others. In Spanish, pan means bread. Pan is a cooking container, a depression in the earth, a cavity in the lock of a flintlock, and the Greek god of the woods. You can pan a camera or pan for gold. Believe it or not, the misinterpretation of a three-letter word has been known to derail projects. For this very good reason, experienced Six Sigma Master Black Belts and Black Belts are very specific when they define what it is that they intend to count or measure. Do more of what works.

Here is another classic example that illustrates why clear operational definitions are crucially important to even a simple process like counting. Count the number of f’s in the following paragraph.

FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS OF INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.

How many did you count? Pause to write your answer here
before moving on. _______

Depending on how you decided to define the letter “f” there are seven possible correct answers. There are no lower case, italicized f’s. So, zero is one correct answer. If you decided to count any F, there are 6. If you proof read phonetically, in other words you defined “f” by the sound of the letter, the F in each OF sounds like a “v.” So, if you defined an F by the way it sounds you could have counted 1, 2, 3, 4, 5, or 6. Any one, or all of these answers, taken in the context of its definition, could be considered to be correct.

During the project selection phase, perfect Six Sigma performance expectations called Critical to Quality (CTQ) or Key Quality Characteristics (KQC) are defined in statistical terms. Without a statistical definition, there can be no objective evidence. The definition must include an average and a standard deviation. Both come from a vector analysis applied to a data matrix. Since the Profit Signal is also automatically produced by this analysis process, it is considered to be part of a comprehensive operational definition. These high analytic standards are used at all levels in the organization.

Once operational definitions are agreed to, an average and a desirable standard deviation for project outcomes are targeted. These figures are accompanied by the expected dollar value of benefits the company can look forward to harvesting. When projects have been identified and Key Quality Characteristics are defined, financial models are used to create credible bottom line profit signal estimates.

Financial Modeling and Simulation

Six Sigma budget models are dramatically different from, and superior to, spreadsheet arithmetic Proformas. Every manager who has actually participated in the old-school ritual called “spreadsheet scenarios” must candidly admit to making the numbers up.

The old school cost-accounting variance analysis encourages confabulation by eliminating 5 vectors, or 83 percent, of all the information contained in raw data. This is a covert impropriety if there ever was one. By using only one vector, and masking the other five vectors, almost any story holds water. A reliable forecast is as transparent as an authentic analysis. All data and all elements are revealed.

When correctly employed, Six Sigma tools raise the standards of what does and does not constitute a credible Proforma, scenario, or forecast. Increasingly accurate predictions put the world of continuous spreadsheet revisions to shame. An abacus cannot beat a super-computer no matter how fast one’s fingers are.

Legitimate financial forecast models are created using vector analysis rules. Financial Engineering News is one of many trade publications that helps professionals get up to speed on the use of these tools.11 Dr. John M. Charnes is a frequent contributor. As the Area Director for Finance, Economics, and Decision Science at the University of Kansas School of Business, Dr. Charnes exemplifies leadership in the field. He used the Decisioneering product called Crystal Ball to create a 16-module self-guided study course that we think is excellent.

Figure 6 One popular Six Sigma software program uses flow diagrams to graphically detail the iterative cycle used to create and improve financial forecasts.

His open system flow diagram, with clouds representing thought processes, is shown in Figure 6.

In a data matrix driven budget forecast, the historical data underlying each budget assumption are graphed in 2, 3 and more dimensions prior to including that assumption in the forecast model. Once assumptions are validated, multivariate models incorporating factor interactions, correlations, and entrepreneurial assumptions are created. The model can then be simulated thousands of times in seconds. The output is presented graphically.
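A minimal Monte Carlo sketch in Python illustrates the idea. The assumption names, distributions and figures are invented for illustration only and are not the book's forecast model.

import random

# Hypothetical budget assumptions, each expressed as a distribution
# rather than a single spreadsheet cell value.
def one_trial():
    units = random.gauss(50_000, 8_000)            # units sold
    price = random.gauss(40.0, 3.0)                # selling price
    unit_cost = random.triangular(22.0, 30.0, 25.0)
    fixed_cost = 600_000.0
    return units * (price - unit_cost) - fixed_cost

trials = [one_trial() for _ in range(10_000)]      # thousands of scenarios in seconds
trials.sort()

mean_profit = sum(trials) / len(trials)
p_break_even = sum(1 for t in trials if t > 0) / len(trials)
print(mean_profit, p_break_even)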

Simulation is proving to be as beneficial to financial managers as it is to engineers, jet pilots, doctors, and student drivers. Working under old school constraints, engineers had to build expensive physical models to test their ideas. Pilots had to practice first solo flights at 600 miles per hour. Surgeons had to test new techniques on live patients. Parents had to take their teenager into traffic and hope for the best.

With multi-dimensional computer simulations, new designs, flying skills, surgeries, and even freeway entrances can be tested “off-line” first to minimize risk. The benefits of simulation are objective and overwhelming. This is why computerized simulation is a “Six Sigma Tool.” If you look closely under the hood of reputable simulation applications, each has a data matrix and vector analysis for sparkplugs.12

Beginning in the late 1980s, inventive software manufacturers
began to develop programs that forced spreadsheets to behave
like a data matrix. The geometry that guided their design
creates graphic results that look terrific. These macros are
now mature modeling programs. They are a joy to use. Every
day we thank the General Electric Senior Vice President who
took time out of her day in 1997 for a cold call telephone
interview. She explained how these programs push Six Sigma
forward. Her counsel was, and remains, rock solid.

With the finance simulation tool add-in, budget forecasts inherit the power of a vector analysis. Vector models do an impressive job of helping decision makers visualize probable outcomes. They are affordable tests. They let leaders meet and beat breakthrough project goals. Analysis rules, quantification, continuous feedback and discipline improve model forecasts over time.

Beyond the covert elimination of 5 analysis vectors, spreadsheet arithmetic budget models and “what-if” scenarios fall short of evidence-based decision standards in significant ways.

1. With a spreadsheet, an individual number in a cell is accepted on face value. This number frequently misleads because it is not framed in a meaningful context. Without analysis context—an average, a standard deviation, probability information, and an analytic graph—people must guess at the number’s meaning in relation to the other variables in a system.

2. Spreadsheets encourage analysts to believe in the great bamboozle. Many now are convinced simple addition, subtraction, multiplication and division are appropriate tools for analyzing complex, multivariate systems. Because the columns and rows in a spreadsheet look just like the columns and rows in a data matrix, many conclude equivalent answers are automatically produced with each one. This error is serious.

a. Spreadsheets are wildly popular because they let anybody do absolutely anything with any number. It all looks legitimate! Though spreadsheet arithmetic sets the standards of evidence at a comfortably low level, it produces illusion rather than insight.

3. Spreadsheet scenarios are usually created using One-Factor-at-a-Time (OFAT) methods. OFAT analysis and experimentation methods are no more reliable now than they were when Frederick Taylor used them in the 26 years leading up to 1911. Not only do they not yield an accurate answer, they waste time. Conclusions reached using this method are at odds with physical Laws of the Universe. By over-simplifying problems, answers are notoriously unreliable.

4. Spreadsheet scenarios create a false impression of precision. Spreadsheet numbers are dressed up in impressive looking data arrays. These images shout out, “Hokum!” Nevertheless they are routinely presented, accepted, and framed as “certain forward thinking statements” in a social gesture of courtesy that smacks of hubris. In corporate hierarchies these courtesies force otherwise intelligent, well-meaning people to forget what they know about mathematics.

People are persistent when it comes to juggling numbers. Human nature is tireless in its allegiance to irrational beliefs. Martin Gardner’s Fads and Fallacies observation rings as true today as it did when he wrote it in 1952.

“How easy it is to work over an undigested mass of data and emerge with a pattern, which at first glance is so intricately put together that it is difficult to believe it is nothing more than the product of a man’s brain… Consciously or unconsciously, their perceived dogmas twist and mold the objective facts into forms which support the dogmas, but have no basis in the exterior world.” 13

Simulation programs give spreadsheets like Excel a new lease on life. Macros that follow the rules of evidence are bringing high standards to the world of accounting and finance.14 This is a very good thing. Once a 3D cube model is embedded in a spreadsheet, analysts have a much better grasp on the range of possible budget outcomes.

Likelihoods and probabilities are presented automatically in attractive visual graphs. With simulation, managers can perform tens of thousands of multivariate scenarios in minutes. This is less time than it takes a skilled controller to complete a single, One-Factor-At-a-Time (OFAT) budget forecast scenario.

In addition, and for no extra charge, the simulation automatically produces a sensitivity chart that resembles a spreadsheet bar graph. Vector analysis sensitivity charts rank Profit Signals according to the strength of the evidence for each factor. Sensitivity charts expose counter-intuitive patterns that are masked by spreadsheet arithmetic.

A sensitivity analysis ensures that management focuses on the key variables that have the most impact, rather than being distracted by variables they think may be most important. Figure 8 is a sensitivity chart illustration.
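One simple way to build such a ranking, sketched below in Python with the same invented assumptions as the earlier simulation, is to correlate each simulated input with the simulated outcome and sort by the strength of the association.

import random

def correlation(xs, ys):
    # Pearson correlation between two equal-length lists.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

draws = {"units": [], "price": [], "unit cost": []}
profits = []
for _ in range(10_000):
    u = random.gauss(50_000, 8_000)
    p = random.gauss(40.0, 3.0)
    c = random.triangular(22.0, 30.0, 25.0)
    draws["units"].append(u)
    draws["price"].append(p)
    draws["unit cost"].append(c)
    profits.append(u * (p - c) - 600_000.0)

# Rank the factors by the absolute strength of their association with profit.
for name, xs in sorted(draws.items(), key=lambda kv: abs(correlation(kv[1], profits)), reverse=True):
    print(name, round(correlation(xs, profits), 3))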

Simulation can increase one’s level of confidence as business decisions are made in the face of uncertainty.

Compare and Contrast Analysis

The classic budget forecast (Table 2) is usually created by estimating three outcomes: 1) best case, 2) worst case, and 3) most likely case.15 Since there are no rules, personal opinion and a consensus are the only evidence required for making the decision on whether to pursue the NanoTech Widget.

NanoTech Widgets solve problems. They are multi-purpose. With a projected profit of $9.2 million, they are a sure fire new product. Note how the forecast in the bottom right hand cell catches the eye.16

When this spreadsheet was analyzed 1,000 times in under six seconds using the data matrix introduced in the Five-Minute PhD, a much different picture emerged. Figure 7 does not tell the manager what to do. However, it does point out that the most probable outcome is a $14.4 million loss rather than a $9.2 million gain.

Once the simulation is complete, the analyst or manager can evaluate which of the variables has the greatest impact on the bottom line. Predictably, factors are ranked in importance according to the relative strength of statistical evidence (Figure 8).

Table 2 The classic old school budget forecast for new product development presents assumption numbers without the benefit of either context or evidence. The average, standard deviation, p-value, and analytic graphs are ignored. Forecasting spreadsheet analysts are simply expected to correctly guess all values, evidence strength, and factor interactions.

Figure 7 Based on all the actual data at hand, there is a 77.9% chance of breaking even with the NanoTech Widget. There is only about a 50/50 chance of making the projected $9.2 MM. The most probable outcome is the $14.4 million dollar loss highlighted at the left side of the forecast.

Simulations and legitimate financial forecasts are standards in Six Sigma breakthrough projects. We have seen simulations effectively tackle budgets with up to 77 variables. The level of thoughtfulness this tool creates is well worth the time investment required.

Figure 8 Success in launching the NanoTech Widget product depends on the company’s ability to penetrate the market.

Process Maps

Maps, from Babylonian clay tablet cartography to downloadable Internet driving directions, are universal communication tools. Since maps have proven their value, they too are a “Six Sigma Tool.” It is no accident that a process map is the first-choice tool taken down from the shelf after a lucrative new project has been selected.

A good process map is as multidimensional as a set of nested Chinese Boxes or Russian dolls. Though nested boxes and Russian dolls are not official Six Sigma tools, these analogies encourage people to look beyond surface appearances.

The outermost Russian doll is called a Matreshka or grandmother. Succeeding generations are contained within her.17 Miniaturized generations are refined replicas. Each three-dimensional replica must be produced using fewer resources and a higher degree of precision. So it is with Six Sigma process maps.

First blush drawings can span pages. Over time, these are
simplified and distilled into diagrams that illustrate and
endorse only essential elements that pull the system forward
efficiently.

In the same way that the Space Shuttle Radar Topography Mission used vectors, geometry, and computing power to map 80 percent of the earth’s landmass in only 10 days’ time, Black Belts are expected to map the nested dimensions of a work process in about a week. Practice makes perfect. This is one of the skills that is worth the practice time investment. These maps are impressive.

The ‘Six Sigma Matreshka’ in Figure 9 is a Suppliers, Inputs, Processes, Outputs, and Customers map, or SIPOC for short. It is assumed that this system has feedback loops throughout. These loops are not illustrated here in order to present a clean, simple picture.

Figure 9 Black Belts use personal interviews, first hand observations, and measurements to complete
this map. Drawing these maps from the end to the beginning is the best way to produce a meaningful
SIPOC map showing all the relationships.

Invariably, every Six Sigma map uncovers the pattern shown in Figure 10. This Six Sigma drawing describes the “hidden factory.” Its appearance has a Charlie Brown and Lucy Van Pelt quality to it. Just as the start of every football season is marked by Lucy tricking Charlie into trusting her for the inevitable betrayal, the hidden factory always makes an appearance.

Figure 10 The hidden factory of rework in this map includes Processes 4-6 and the related delay.

Waste and rework plague every production and service delivery process. They always happen where: 1) a loop reverses the forward motion of the product, 2) a delay, bottleneck or constraint slows process flow, or 3) a barrier stops production altogether.

Hidden factory maps are often posted in conspicuous places, from the boardroom to the individual work space on the factory floor. Mapping is a documentation discipline that is rewarding and informative. The most sophisticated ones, called lean process maps, are drawn in an old fashioned, low-tech way using paper and pencils.

Lean maps have a lexicon and icon system that is worth studying.18 Lean is a separate business tool that deserves, and has, its own literature. Suffice it to say lean maps document the entire value stream. They track information and material flows from start to finish through an organization. Management responsibilities are visible and a host of snapshot measurements are recorded.

Lean metrics make sense. They include uptime, working time minus breaks, cycle time (C/T), changeover time (C/O), value-added time (VA), takt time, and production lead time (PLT). In addition to its own acronym, each time has an exceptionally specific operational definition. Times are recorded in days, hours, minutes, and seconds. The ‘time-is-money, money-is-time’ theme dominates lean thinking. Rightfully so.
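
As an illustration of how concrete these definitions are, takt time is nothing more than available working time divided by customer demand. The sketch below uses hypothetical shift and demand figures, not numbers from any particular value stream.

    # Hypothetical takt-time calculation of the kind a lean map records.
    shift_minutes      = 8 * 60        # one 8-hour shift
    break_minutes      = 30 + 2 * 10   # lunch plus two short breaks
    available_seconds  = (shift_minutes - break_minutes) * 60
    daily_demand_units = 460           # customer orders per day

    takt_seconds = available_seconds / daily_demand_units
    print(f"Takt time: {takt_seconds:.1f} seconds per unit")

    # A cycle time above takt time means the process cannot keep pace with demand.
    observed_cycle_time = 62.0         # seconds per unit, from a stopwatch study
    print("Keeping pace with demand:", observed_cycle_time <= takt_seconds)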

With the lean Six Sigma strategy in place, a second can be, and often is, literally worth thousands of dollars. For example, in one San Jose Internet router factory a 2-foot by 2-foot by 2-foot pile of scrapped motherboards was time-valued at more than $6 million.

In addition, lean maps record production batch sizes for every product interval (EPE), First-In-First-Out (FIFO), the numbers of operators, inventory turns, a plan for every part (PFEP), the number of product or service variations, and the scrap rate.

Lean measurements and flow mapping earned their way into Six Sigma the old fashioned way. They work. This is why lean maps are a “Six Sigma Tool.” Their record of extremely profitable achievements began with the Toyota production system in the 1950s and continues to this day.19

As with a vector analysis, those who are familiar with lean tools do not argue against them. To do so would be as foolish as arguing against the speed of light, the existence of gravity, or the impact of variation on measurement.

These maps help people identify process factors, known as the X’s, that may be driving the system toward profits or losses. Profits and losses, the dollar value of a process outcome, are called Y’s.

A thought map, Figure 11, shows how these factors become a series of hypotheses in a data matrix. Once the matrix is filled with measurements, a vector analysis will point out strong and weak Profit Signals with objective standards of evidence.

“Hidden Factory” costs, or the costs of waste and rework, are called the Costs of Poor Quality (COPQ). Since the 1950s, breakthrough projects have focused on eliminating these expenses.

Figure 11 Maps help improvement teams identify variables that will be subjected to a 3D vector analysis.

The Costs of Poor Quality

The origins of the “Costs of Poor Quality” idea can be traced to Walter Shewhart’s invention of the quality control chart on May 16, 1924.20 The quality control chart is yet another way of graphically viewing a vector analysis.21

Shewhart was a physicist. He was also an accomplished statistician and a friend and colleague of Ronald Fisher, and he volunteered to care for Fisher’s six children during World War II. A German U-boat’s torpedo sank this plan in 1940.22

The dollar figures Dr. Shewhart symbolically placed on the corners of Fisher’s work 80 years ago are today’s Six Sigma costs of quality. In concrete terms, Armand V. Feigenbaum is credited with developing the first dollar-based quality reporting system while working at General Electric in the early 1950s.23 It is worth noting that this development ultimately can be linked to Jack Welch’s 1990s Six Sigma initiative. Why are the costs of poor quality so important to Six Sigma breakthrough projects? Simple.

Revenues are taxed. One dollar in newly earned revenue can
produce as little as one penny in new earnings. One dollar
saved through the elimination of waste and rework drops
to the bottom line as one dollar. To a certain extent, once a
Black Belt gets the hang of the breakthrough project system,
saving major dollars is like shooting fish in a barrel.

Each of the four poor-quality cost categories can be leveraged. Prevention and appraisal investments, often referred to as costs, are relatively static. Internal failure costs are hidden factory expenses that remain invisible to customers. External failures are mistakes and errors that are highly visible to the customer (Figure 12).

Figure 12 Textbook example of a Cost of Poor Quality (COPQ) flow chart used by a Black Belt engineer, Scott Erickson, to persuade senior management to embrace evidence-based decisions.

Since Feigenbaum first created this classification system, prevention investments have been expected to produce, and have produced, a 10:1 return. For Six Sigma these costs include training, education, analytic software, planning, vendor certification using Six Sigma quality metric standards, and quality assurance system costs.

Appraisal investments include quality audits, inspections, testing, maintenance, and information systems. Information systems (IS) designed using principles of evidence-based decisions are far less expensive than those that are not. If your company is looking for a place to begin Six Sigma, we encourage you to create an IS strategy that is based on sound geometric principles. The easiest way to learn if your system meets these standards is to ask your IS department to show you their data matrices and cube experiment arrays.

This question invariably raises eyebrows. The vast majority of IS systems are modeled after spreadsheets. Transforming this system, or transferring the information in it to a data matrix, entails rework costs. Once the investment is negotiated, the payback is spectacular.

Internal failure costs include all scrap and rework. Retests, Failure Mode Effects Analysis (FMEA), excess inventory costs, Corrective And Preventive Actions (CAPA) and productivity losses are recorded here.

Finally, external failure costs are problems that land in the customer’s lap. Liability suits, warranty costs, returns, engineering changes, marketing and sales errors, complaint handling, and related equipment required to rework products must all be tallied up. In our increasingly litigious society it is almost impossible to overstate the costs of failure.

Shewhart wrote humorously about the reality of 1939 quality costs, “I am reminded of the old saying: when a doctor makes a mistake he buries it; when a judge makes a mistake, it becomes law. I would add in the same vein: when a scientist makes a mistake in the use of statistical theory, it becomes part of ‘scientific law’; but when an industrial statistician makes a mistake, woe unto him for he is sure to be found out and get into trouble.”24

The best way to protect any business from external failure costs is to produce a perfect quality product every time a product is produced. Delivering perfect quality services 100 percent of the time is a powerful business strategy. Perfect processes can and do produce virtually perfect outcomes.

Only processes that are capable of producing perfection do produce this level of quality. A process capability index, known as Cpk, is the analytic measure used in graphic presentations of evidence documenting perfect quality.

Process Capability

We will use dice rolls for our example. (Chapter 7 will extend this experiment to include four dice.) Feel free to roll your set of dice until you get 5,000 measurements. Or, you can accurately simulate the outcome of 5,000 rolls in a minute using software. Comparing both methods will give you a good feel for the value of Six Sigma vector analysis software.

Each of us has personally rolled dice more than 5,000 times. We chose the simulation method for this example.
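
For readers who prefer the software route, a few lines of Python (a generic stand-in for whichever statistical package you use) will generate the 5,000 throws of a pair of dice used in the rest of this example.

    # Simulate 5,000 throws of a pair of dice.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    rolls = rng.integers(1, 7, size=(5000, 2)).sum(axis=1)   # totals run from 2 to 12

    print("Average total:", rolls.mean())                    # close to 7
    print("Standard deviation:", rolls.std(ddof=1))          # close to 2.4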

Statistical software does not know that the outcome of throwing a pair of dice is constrained to the range 2 to 12. Therefore, it calculates and estimates statistical limits for a distribution as if it were not constrained. In this way our teaching analogy is flawed. Skeptics complain, “See. Statistics lie! They can’t even handle dice rolling.”

These complaints are ludicrous. Ignore them. The point in this exercise is a principle. And, by now you get the point.

With the click of a mouse button, software graphs our data and tells us how capable it is. For this example, you can see we set our Lower Specification Limit (LSL) for perfection at 2 and our Upper Specification Limit (USL) of perfection at 12.

Figure 13’s process capability curve tells us our process is not capable of producing perfection. The Cpk value is calculated by taking the distance from the process average to the nearest specification limit and dividing it by three times that old favorite, σ. A Six Sigma process yields a Cpk of 2 or more. If and when the tails of our curve fall above and/or below our perfection specifications, these portions would be scrap and rework.

Let’s game this system and improve our Cpk by setting our
LSL and USL perfection expectations at –10 and 30. Figure
14 shows that our process is a smoking Six Sigma process
fully capable of producing perfect quality outcomes 99.99999
percent of the time! We’re in the money. Note how tightly the
distribution curve is centered on the target of 7.
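
Reusing the simulated rolls from the sketch above, the short calculation below shows the same two Cpk values in miniature: the honest specification limits of 2 and 12 give a value well below 1, while the gamed limits of –10 and 30 push it past 2. Exact values will drift a little from run to run.

    # Cpk: distance from the average to the nearest specification limit, divided by 3 sigma.
    def cpk(data, lsl, usl):
        mean, sigma = data.mean(), data.std(ddof=1)
        return min(usl - mean, mean - lsl) / (3 * sigma)

    print("Cpk with LSL = 2 and USL = 12:", round(cpk(rolls, 2, 12), 2))      # well below 1
    print("Cpk with LSL = -10 and USL = 30:", round(cpk(rolls, -10, 30), 2))  # above 2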

In real life, Six Sigma companies earn “high” Cpk values not
by lowering their standards, but by raising them relentlessly,
geometrically, and exponentially.
Figure 13 This process is not capable of perfection. Its Cpk value is only 0.640.

To achieve these levels of perfection, they use a vector analysis applied to a data matrix. The only way perfection can be pursued and achieved is by using quantitative measurements and analysis.

The only set of tools that makes this rate of improvement possible is the scientific method and the geometry of a vector analysis. This is why Six Sigma is not a fad. This is why there is such a bandwagon rolling with Six Sigma Breakthrough Projects and 6σ tools.

Figure 14 A Six Sigma process will produce perfection every time.

Endnotes

1. Box, George E.P., Hunter, William G., Hunter, J. Stuart. Statistics for Experimenters, An Introduction to Design, Data Analysis, and Model Building. New York. John Wiley & Sons, Inc. 1978. Pages 170-201.

2. The body of knowledge that is widely regarded as the most comprehensive is posted by the American Society for Quality: http://www.asq.org/cert/types/sixsigma/bok.html

3. Mikel Harry, a popular leader in the Six Sigma field, reported this history on a video tape recorded in 1995.

4. Shewhart, Walter. Economic Control of Quality of Manufactured Product. Brooklyn, D. Van Nostrand Company, Inc. 1931, page 5.

5. Cohen, J. Bernard. Revolution in Science, Cambridge, 1985, Belknap Press of Harvard University Press. Page 96.

6. Cohen, J. Bernard. Revolution in Science, Cambridge, 1985, Belknap Press of Harvard University Press. Page 96.

7. Our bell curve illustrations were inspired by a drawing originally produced by Control Engineering Online.

8. Deming, W. Edwards. Out of the Crisis. Cambridge. Massachusetts Institute of Technology, Center for Advanced Engineering Study. 1982.

9. As a sidebar note, it is interesting to know that Gantt patented a number of devices in collaboration with Frederick Taylor when they worked together at the Bethlehem Steel Mill on Taylor's Scientific Management theory.

10. Inspiration for this particular grid came from Moresteam.com (http://moresteam.com/). Their on-line Six Sigma Black Belt course is interesting and informative.

11. http://www.fenews.com/

12. http://www.processmodel.com/

13. Gardner, Martin. Fads and Fallacies in the Name of Science. New York, Dover, 1952. Page 184.

14. http://www.decisioneering.com

15. http://www.decisioneering.com This spreadsheet is used with permission along with the flow diagram for financial models.

16. http://www.decisioneering.com The numbers and layout of this budget come from Decisioneering's tutorial example, ClearVision.

17. An interesting history of this symbolism can be found at http://www.nestingdolls4u.com/history/history.htm

18. http://lean.org

19. Womack, James P., Jones, Daniel T., and Roos, Daniel. The Machine that Changed the World. New York, Rawson Associates Scribner Simon and Shuster, 1990.

20. Harrington, H. James. Poor Quality Cost. New York, Marcel Dekker, Inc. 1987, page v.

21. Shewhart, Walter A. Economic Control of Quality of Manufactured Product. New York, D. Van Nostrand and Company, 1931. Page 40.

22. Box, Joan Fisher. R. A. Fisher, Life of a Scientist, New York. John Wiley and Sons, 1978. Page 377.

23. Harrington, H. James. Poor Quality Cost. New York, Marcel Dekker, Inc. 1987, page xiv.

24. Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. New York, Dover Publications, Inc. 1986.

Chapter 4

Case Studies

Case studies needed to meet four criteria. Though names, places, and data were altered to protect privacy, each story had to be true. It had to be entertaining. Each example also needed to graphically explain how evidence-based decisions produced crowd-pleasing financial returns. Finally, the story needed to be a fair, representative sampling of what we each have repeatedly seen over the past 20 years of our professional lives.

Occasionally, the stories in this chapter trouble some managers. Though we tried, we were unable to completely resolve this perplexing, vexing journalism quandary.

One senior executive reviewer echoed the 1986 sentiments of Daniel Sloan’s own Vice President of Marketing. “The stories in this chapter upset me. As a senior executive, maybe I just took them personally. They hit too close to home. It is difficult for me to keep reminding myself that what is past is past. I have to keep telling myself that evidence-based decisions can and will prevent me from repeating history.”

Case studies are essential to understanding. We bit this bullet and chose to include them. We decided to tell them the way clients tell them.

Recounting a Six Sigma project victory is like explaining a magic trick. Once a wizard’s secret is revealed, someone in the audience thinks, “Shoot! I could have done that.” But no one, including the magician, can do it without knowing how.

Magic tricks are illusion. Evidence-based decisions put real
dollars in real banks.

Evidence is a funny thing. Many of us are interested in evidence only when it confirms an existing belief or policy. This human tendency creates a resistance to transparent reporting systems in business and government. Confidentiality is necessary in business and government. Too often, confidentiality is used to justify secrecy.

Another human tendency is to equate evidence with authority. Then both are tarred with a brush of cynicism. This position was summarized in the March 1998 issue of Discovery Magazine:

“Anybody who claims to have objective knowledge about anything is trying to control and dominate the rest of us…There are no objective facts. All supposed ‘facts’ are contaminated with theories, and all theories are infested with moral and political doctrines… Therefore, when some guy in a lab coat tells you that such and such is an objective fact,…he must have a political agenda up his starched white sleeve.” 1

This “know-nothing” doctrine stems in part from inadequate science and mathematics education. It is contradicted by the documented successes of the evidence-based decisions that power Six Sigma breakthroughs.

Still, it is also true that data can be suppressed, ‘massaged’ or just plain falsified. Disraeli’s comment “Lies, damn lies and Statistics” was a reference to this problem. No analysis method can deliver us from the unethical corruption of reported data.

However, given good data in a data matrix, vector analysis makes it virtually impossible to misrepresent the information in that data. Vector analysis is based on immutable Laws of the Universe. It is transparent. All aspects are revealed. It uncompromisingly tells the truth. The cornerstone of evidence, a tetrahedron, symbolizes ‘solid evidence’.

Transparency, full disclosure and international standards for data analysis are the reasons Six Sigma works. They are also the characteristics that some find most disturbing about Six Sigma. It is no surprise that spreadsheets have sensational appeal. Spreadsheets snap tightly to the New Age mantra, “Tell your own truth.” New Age know-nothings can structure data any way they want, and they can analyze it any way they want.2 Transparency and secrecy, honesty and misrepresentation, are equally weighted options.

Spreadsheets are the engines for the cost-accounting variance analysis. These methods are inherently one-dimensional. Each uses only one of the six vectors in the cornerstone of evidence. None of them recognize the essential Profit Signal and Noise vectors.

In this sense, cost-accounting variance analysis suppresses five-sixths (83 percent) of the information needed for an evidence-based decision. Because break-even thinking and cost accounting variance analysis allow management to ignore five of the six reality vectors, it is easy to construct any story that is consistent with any one vector. Naturally, people tend to construct stories that favor their point of view.

As Master Black Belt teachers, we use forthright honesty, computers, software, graphics, the cornerstone of evidence, and the New Management Equation to dispel the mystery surrounding evidence-based decisions. Once people harvest Six Sigma profits by making better decisions, objections diminish.

Everyone wants to make more money in less time, with less work, and using fewer resources. Doing more of what works is a doctrine to embrace. Knowledge and reliable information start the Six Sigma DMAIC ball rolling. It leaves the trial-and-error methods of old-school management in the dust.

Customer Service – Governmental Agency

Political pressure was forcing a Washington State government department to improve the quality of its services or face the loss of $500,000 in funding as a penalty.

The department’s Executive Director gave employees the opportunity to choose a consultant to help them in their efforts to maintain current funding levels. A Five-Minute PhD demonstration and evidence-based decision tools attracted their attention.

Define: Jobs, including management jobs, were on the line. One-half million dollars in legislative funding was at stake. Negative regional news coverage over departmental problems made state citizens angry. Poor customer satisfaction had put this department on the legislative target hit list.

To begin the project, flow diagrams and a Pareto analysis exposed breakthrough improvement opportunities. A specific criticism concerning this department’s performance had to do with the way it answered its telephones. Several full-time clerical staff answered phones that literally rang off the hook.

The agency’s executive director knew calls went unanswered. Armed with her own good judgment, she had instituted a department policy by edict. “All phones will be answered by the third ring. Answering machines are prohibited because they symbolize poor quality service.” The ringing phones were right outside her office, and she monitored her policy.

Measure: The team of secretaries who answered the phones claimed they had a good solution to the problem. “We can’t get anyone to listen to us. We just do what we are told.” We suggested that they use a check sheet to collect data; we promised to help them present their evidence.

There were a number of suspected causes, or hypotheses, for the unanswered phone flash point. These included:

• Hypothesis 1 (H1): The day of the week makes a difference. Some days are busier than others.

• Hypothesis 2 (H2): The time of day makes the difference. Some times are busier than others.

• Hypothesis 3 (H3): The telephone line makes the difference. One line is busier than the other.

Using a paper and pencil, the team of secretaries constructed a check sheet to record matches with the cube experiment data matrix (Table 1).

Table 1 The cube experiment data matrix guided the collection of data recorded by hand on a check sheet.

Analyze: The data matrix revealed a distinct profit signal. The main effect was so obvious everyone could see it immediately just by looking at the matrix. Figure 1 presents the data in a cube plot. Hypothesis 3 was the “big hitter”. The numbers of calls on line 2, the back face of the cube, were an order of magnitude larger than the number of calls on line 1, the front face of the cube. No other variable had an effect.
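
For readers curious about the arithmetic behind a cube plot like Figure 1, the sketch below lays out the eight-run, three-factor data matrix for the hypotheses above and estimates each main effect as the difference between the average call count at a factor's high and low settings. The call counts are hypothetical stand-ins, chosen only so that the telephone-line effect dwarfs the other two, as it did in the actual study.

    # Eight-run 2x2x2 data matrix for the three telephone hypotheses.
    # Call counts are illustrative only; line 2 is roughly ten times busier than line 1.
    import itertools
    import numpy as np

    factors = ["Day of week", "Time of day", "Telephone line"]
    design  = np.array(list(itertools.product([-1, 1], repeat=3)))   # 8 runs
    calls   = np.array([12, 115, 14, 108, 11, 120, 15, 112])

    for name, column in zip(factors, design.T):
        effect = calls[column == 1].mean() - calls[column == -1].mean()
        print(f"Main effect of {name}: {effect:+.1f} calls")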

Figure 1 All of the high numbers fell on the back plane.

Line two was sending a clear profit signal. It turned out that line two had been listed incorrectly in telephone directories across the entire state. This proofreading error was embarrassing. It would have been expensive to fix. No one had the courage to bring it to the attention of the agency’s executive director.
The executive director’s edict compounded the fear factor.
Rather than addressing the core issue, the workforce decided
it was much easier to keep their heads down. They became
telephone operators and gave dialing assistance to callers. We
took their evidence forward with a firm conviction that, in at
least this case, the messenger would not be shot so early in a
consulting engagement.

Improve: One hour after the presentation of our evidence, a telephone answering machine was purchased and installed. The executive director gave this improvement her blessing with a belly laugh. The answering message announced the Yellow Pages error and then gave callers the correct number. Six secretaries and other workers could now focus their attention on real work.

Control: Telephone listing corrections were made the following year. A breakthrough in the proofreading process ensured 100 percent, Six Sigma accuracy.

This breakthrough played a role in persuading legislators to sustain funding at existing levels.

The total time to collect data for the data matrix was five
days. The analysis and presentation took one hour. Eventually
one full time position was eliminated through attrition for a
bottom line savings of more than $25,000. This, combined
with avoiding the loss of funding, brought the total value of
the project to $525K.

Days in Accounts Receivable

A service company needed to reduce its number of days in accounts receivable, or AR days. The number of days in AR ranged from 35 to 110 days. A breakthrough improvement project could yield as much as $35K per month, or $420,000 per year, in cash flow.

Define: For more than a year debate had raged over what could be done to reduce AR days. Suspected causes for this problem varied. These suspicions, or straw-man hypotheses, included the following:

• Hypothesis 1 (H1): Management is the solution. Good managers have short AR days. Bad managers have lots of days in AR.

• Hypothesis 2 (H2): Sales calls are the answer. The number of visits made by a salesman to the customer is key. The more visits, the larger the number of AR days. The fewer the visits, the smaller the AR days.

• Hypothesis 3 (H3): The customer is the main reason for long or short AR days. Good customers pay fast. Poor customers pay slowly.

• Hypothesis 4 (H4): The longevity of our customer relationship makes the biggest difference. Long-term customers pay more slowly because they know our business depends on them.

• Hypothesis 5 (H5): The number of services provided determines the number of AR days. More services create complexity. Billing complexity slows payment.

Measure: Significant AR data had been collected. These records were stored in file cabinets. Each customer, and there were hundreds of them, had its own manila folder. It was with a great deal of pride that the accounting team showed bills were filed in near perfect chronological order.

The Chief Financial Officer of this company was committed to keeping productive hours in line and on budget. Workers in his department were required to do their jobs, as well as to work on breakthrough projects. No overtime would be paid for improvement tasks. A regular work schedule would be continued. Moreover, in order to keep operating costs low, no statistical software would be purchased. “Spreadsheets work fine.”

We interviewed every employee and constructed process flow diagrams. We identified five important variables that might affect AR days.

Figure 2 Statistical software automatically determines the best data matrix geometry for a vector analysis involving five independent variables.

The CFO had vetoed a recent budget request for a PC workstation and relational database software, so automatic queries and data mining were out of the question. Going to Plan B, we used our own statistical software to create an optimal data matrix for five independent variables at two levels each (Figure 2). This took all of five minutes. It is virtually impossible with a spreadsheet.
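
The layout itself can be sketched in a few lines of general-purpose code: a full factorial for five two-level factors is simply every combination of high and low settings, 32 runs in all. The factor labels below come from the five hypotheses listed earlier; this toy stand-in obviously lacks the optimal-design features of real statistical software.

    # Full two-level factorial for five factors: 2**5 = 32 runs.
    import itertools

    factors = ["Manager", "Sales calls", "Customer", "Relationship", "Services"]
    runs = list(itertools.product(["low", "high"], repeat=len(factors)))

    print(len(runs), "runs")                       # 32
    for i, settings in enumerate(runs[:4], start=1):
        print(f"Run {i}:", dict(zip(factors, settings)))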

Creating a data matrix is one thing. Collecting data that fits the profile of each run is another. A billing clerk and a billing manager volunteered to come in over the weekend and pull records. They believed they were familiar enough with customer profiles that they would be able to find bills that would match each of the 32 different “runs” in the matrix.

These two front line leaders wanted to find out what combination of variables actually made a difference. They knew that if they found an answer, it would be valuable. Their daily workload was so challenging, they simply didn’t have time to array any more spreadsheet data than they already were doing for the CFO during the regular workday.
Figure 2 shows the first 28 rows of the data matrix with
the number of AR days visible in the far right hand response
measure column. The every-other-row pattern of a short
number of days in AR followed by a long number of AR
days was evident immediately to the accounting department
workers on a Sunday morning.

Analyze: Three strong profit signals emerged from the vector analysis we applied to their data matrix. The p-values in Figure 3 appear under the heading “Prob > F”. We could say with better than a 99.999 percent level of confidence that the customer was a main effect. The computerized vector analysis also showed, with a 99% level of confidence, that the length of the customer relationship was another active factor that influenced the number of days in AR. The main effects were controversial. Anxiety filled the air.

Figure 3 P-values less than 0.05 imply a 95 percent level of confidence or more in the results. The two factors, Customer and Relationship, and their interactive effect, were statistically significant at this confidence level.
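
Readers who want to reproduce a table like Figure 3 without specialized software can do so with the open-source statsmodels library (not the package used in this project). The sketch below fits AR days against two factors and their interaction and prints the p-values; the file name and column names are hypothetical placeholders.

    # Fit AR days against two factors and their interaction; the "PR(>F)" column
    # of the ANOVA table holds the p-values. File and column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    data  = pd.read_csv("ar_days_matrix.csv")   # columns: ar_days, customer, relationship
    model = smf.ols("ar_days ~ C(customer) * C(relationship)", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))      # small p-values mark strong profit signals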

The key difference between the two customers was widely known. Customer A was billed electronically. Customer B was billed manually. New customers were able to bill electronically. Old customers were not.

The company’s Chief Financial Officer and Chief Executive Officer disliked computers. They still do. The CFO openly opposed the use of statistics. The CEO had excused the finance department from participation in breakthrough projects “until the data matrix and vector analysis tools proved to be useful.” A year earlier, the Chief Financial Officer had refused to approve the purchase of a $15,000 PC workstation in his department to keep costs down. The electronic billing and relational database topics were verboten.

Improve: The team spent a week gathering its courage and preparing evidence for a presentation to senior management. Following the presentation, the company purchased and installed a top-of-the-line workstation. A $1,000 request to purchase data matrix software for the finance and accounting department was denied. “Spreadsheets work fine.”

Figure 4 explains part of the reason that executive resistance to evidence-based decisions continued. 3D vector analysis pictures do not look like bar graphs or pie charts. This particular finance department found 3D cube graphs to be upsetting. Figure 4 presents accurate AR day predictions for differing combinations of all five factors. Note that both of the top cubes have shorter AR days. When the AR days come from customer A and a new relationship, AR days are lower than with any other combination of factors.

Figure 4 The two-level, five-factor, 2⁵ vector analysis compares all the factor interactions using the traditional 3D cube.

Control: Results produced by lowering days in AR by 30 days exceeded the projected $420,000 cash flow gain in the first year. Total time required to complete the project was 90 days. As the financial crisis passed, so did the use of evidence-based decisions. The heads-up improvement team put their heads down and went back to work.

To this day, the company has refused to invest in either the education of its finance and accounting workforce, or the purchase of data matrix software. “Spreadsheets work fine.”

This experience taught us to present profit signals using a special kind of bar graph known as a Pareto Chart rather than the more powerful cube. People just want the answer. The Pareto chart in Figure 5, which rank-orders profit signals from strong to weak, gives customers what they want in a way that poses no visual threat.

Figure 5 Profit signals are easy to spot using a graph that ranks vectors from strong to weak.
Breaking the Time Barrier3

Long waits in hospital emergency departments are legendary. The list of likely causes includes overcrowding, a shortage of nurses, an aging population, a shortage of inpatient and/or long-term care beds, and a saturated primary care system. The Joint Commission on Accreditation of Health Care Organizations (JCAHO) has recognized the critical nature of Emergency Department (ED) overcrowding. JCAHO has instituted new Emergency Department Overcrowding Standards requiring a hospital’s serious attention.

The Emergency Department is the front door of a hospital. It often accounts for a significant percentage of all admissions. Service excellence that meets or exceeds the public’s expectations is essential.

The newly hired administrator of a community hospital, a certified Six Sigma Black Belt, selected the ED as the hospital’s initial Six Sigma Project on her first day of work. The ED Charge Nurse had told her, “Our ED is closed to ambulances. We are on divert status.”

In response, the Black Belt RN, CEO administrator asked, “What are the standards of evidence you use when you decide to close the ED?”

The Charge Nurse responded, “Well, we are simply overwhelmed. We cannot provide safe care if one more sick patient comes through those double doors.”

The RN, CEO administrator, a 30-year veteran with a Master of Public Health administration degree, glanced around. She saw vacant treatment rooms. Three staff members were cautiously watching her from behind the nurses’ station. They were all pretending to be charting. The nearby ED physician gave her an apologetic smile and said, “This happens all the time. You might as well get used to it.”

The following Monday, after reviewing the hospital’s admission data and top revenue-producing departments, the administrator called a meeting to discuss the “ED Divert” issue. The ED Medical Director, ED and Critical Care nurse managers, directors of the laboratory, imaging, and environmental services, and the Emergency Medical Treatment (EMT) director from the local fire department responsible for the paramedics all attended.

Everyone was resigned to the status quo. No amount of effort could reduce ED divert time. It was an inevitable result of growing volumes. “Every hospital in the city is having the same problem. Why, at Mid-Valley, their ED is closed twice as much as ours.” “If you want us to put our nursing licenses at risk, well . . .” “There are never any beds in the ICU.” “If the Cath Lab crew was in-house 24/7, why that would solve the problem.” “The CT tech takes call from home after midnight, . . . We’re always waiting for her to come in.”

As they reviewed the actual numbers from data that had been collected and arrayed in a data matrix, they were surprised. During the past six months, the Emergency Department had been closed or on diversion (divert) more than 5,300 minutes per month. This totaled eleven 8-hour shifts, or three and two-thirds 24-hour days, or 12% of available time. Those closures penalized patients. They cost the hospital hundreds of thousands of dollars in potential revenue.

The list of suspected reasons for going on diversion status was as varied as the professional team that sat around the table. Every member had her or his own favorite reason, or two or three, that they firmly believed was the primary cause of ED divert. The general theme was that Emergency Department diversions were caused, in large part, by “other” departments in the hospital, from Admitting to X-ray.

The Black Belt CEO led a brainstorming process to identify Critical To Quality (CTQ) factors. This is a crucial DMAIC first step no matter what the project is. No one in the ED was familiar with Six Sigma techniques or tools. Nevertheless, they began to wrestle with a complex process that involved most of the hospital’s departments.

The team began planning steps in the Six Sigma DMAIC breakthrough process. They designed an experiment. They ran it and analyzed their data.

Breakthrough Results

The profit signal vector analysis showed that once a decision to admit a patient was made, performance pressure was off. The admission process simply slowed to a near stop. An “acceptable” wait time for a patient being admitted was open-ended. Any amount of wait time could be rationalized. Once this practice was halted, everything changed. The department was astonished. In the first two months their initial project results list included:

• The average hours on ED Divert dropped from 88 to 50 per month, a 48% reduction from the same period the prior year.

• The number of Emergency Department visits increased by 12.6%.

• The average ED Length Of Stay (LOS) shrank from 3.6 hours to 1.9 hours.

• A 38.26% increase in Emergency Department gross margin was generated.

• Patient satisfaction increased from 59% to 65%.

• Catheterization Lab time dropped from 93 minutes to 10 minutes.

• Intensive Care Unit bed availability increased by 10.6%.

DMAIC

DMAIC is the standard Six Sigma breakthrough project methodology. The application of DMAIC to this hospital’s ED overcrowding and diversion problem produced dramatic, measurable, sustainable results.

Define issues systematically, statistically and practically. Issues identified included time on ambulance divert, unacceptable ED patient length of stay (LOS), low patient and staff satisfaction levels, patients leaving without treatment (LWOT), and considerable lost revenue. The team established performance measures and benchmark targets for each goal.

Measure using maps, models, and process flow diagrams. Identify and array CTQ factors. Collect data, observe the process and begin data mining. Door-to-door Length of Stay (LOS), Left Without Being Treated (LWOT) as a proportion of all patients, and patient satisfaction levels were identified as the CTQ factors the team wanted to study. They prioritized reducing (minimizing) Emergency Department Length of Stay (LOS) as the key response. Everyone felt that all other issues would improve if LOS could be reduced.

Analyze data using a vector analysis applied to a data matrix. Prepare appropriate quality control charts and Design of Experiments (DOE) to determine CTQ factors.

Improve the process using evidence-based decisions to power Six Sigma breakthroughs.

Control the process to ensure that breakthrough improvements were sustained.

Wisdom Gained Along The Way

The knowledge experts on the ED Six Sigma team used inductive reasoning. They identified potential Critical to Quality (CTQ) variables they believed influenced the department’s Length of Stay (LOS). CTQ variables identified for evaluation were the patient’s gender, a “slow” or “fast” physician or nurse (identified by employee number), and a decision to admit or not. Ready availability of an ICU bed, laboratory testing, and imaging testing were other variables.

The Black Belt Administrator created an 8-Factor Designed Experiment on her laptop using data matrix software, JMP 5.0. Evaluating these 8 factors required only 16 runs to complete. The data were gathered in less than 24 hours (Figure 6).
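
The experiment itself was built in JMP, but the general idea of studying eight two-level factors in only sixteen runs can be illustrated with a standard fractional-factorial construction: build a full factorial in four base factors and set the remaining four equal to interaction columns. The sketch below is a generic illustration; the factor ordering and generators are not taken from the actual design.

    # A standard 2**(8-4) fractional factorial: 8 two-level factors in 16 runs.
    import itertools
    import numpy as np

    base = np.array(list(itertools.product([-1, 1], repeat=4)))   # full factorial in A-D
    A, B, C, D = base.T
    E, F, G, H = A * B * C, A * B * D, A * C * D, B * C * D       # aliased generators

    design = np.column_stack([A, B, C, D, E, F, G, H])
    print(design.shape)    # (16, 8): sixteen runs covering eight factors
    print(design[:4])      # first four runs of the data matrix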

Figure 6 Custom design for an 8-factor, two-level Emergency Department Length of Stay (LOS) experiment.

Before the project, ED staff and physicians ranked laboratory turnaround time as the most significant CTQ factor that influenced the length of stay in the Emergency Department. CT technician availability ranked a close second. The results of the Designed Experiment (DOE) were surprising to members of the Six Sigma Project team (Figure 7). This was one of many “ah ha’s.”

Don’t trust your assumptions or your “gut,” even if you are an expert.

Six Sigma techniques, including a carefully designed experiment and rigorous data analysis (computerized software makes it easy), provided evidence at the 95% level of confidence. This confidence level helped managers make critical decisions quickly.

Figure 7 Results of an 8-Factor Designed Experiment.

The Project Team discovered the CTQ factor with the most impact on ED LOS was admission status. Those patients being admitted had a significantly longer Length of Stay than those who were treated and released. Running a close second was the availability of an Intensive Care Unit bed. Both were significant at the 99.99% confidence level.

While these CTQ factors, admission status and availability of an ICU Bed, may appear obvious now, they were not at the outset of the project. Finding these two highly significant factors focused the team’s efforts.

A drill-down of data revealed that less than 9 percent of the patients who entered the hospital’s ED by ambulance were ultimately admitted to the ICU. Yet, the most frequent reason given for instituting the ED diversion status and closing it to customers was “No ICU Bed.”

Staff and physicians operated under the false assumption that “most” ambulance admits were very sick and “nearly all” would require an admission to ICU. There was a related assumption that the EMTs expected an ICU bed to be immediately available or they would take the patient to another hospital. The small percentage of admissions to ICU was a surprise to the EMT medical director. He voluntarily educated his staff so they would rely on the judgment of the ED staff.

When the administrator began discussing ICU bed availability with the nursing staffs in the ED and ICU, she quickly uncovered an insidious attitude. It was, “Us against them.” Nurses (and, to a lesser extent, physicians) believed that “their” department worked harder and the “other” department was attempting to shift their workload to them.

The lack of trust between the ED and the ICU required immediate attention. Nurse managers evaluated and resolved issues between their departments. They arranged schedules and provided time for nurses to “walk in the shoes” of nurses in the other department. Attitudes quickly changed. Nurses gained an appreciation of the unique and essential role each service provided to quality patient care.

ED and ICU service medical directors developed patient admission and transfer criteria that were approved by the Medical Staff. The criteria, based on a patient’s need for intensive nursing care, authorized nurses to transfer patients out of ICU to open a bed for a new admission.

Communication between the two departments was difficult. The ED nurse was required to provide a detailed report to the ICU nurse before she could initiate a patient transfer. Telephones might go unanswered in the ICU due to the immediacy of patient care needs. Working together, the nursing staffs developed a 1-page report that the ED nurse would fax to the ICU in the event they were unable to complete a telephone report. A transfer of the patient to the ICU occurred automatically 30 minutes after the report was faxed, unless ICU notified the ED to hold the patient. This is now an uncommon occurrence.

An unintended but exciting result of the Six Sigma ED Project was a stunning reduction in “Door to Cath Lab” time. (This is a measure of time from the patient’s arrival in the ED to the initiation of treatment in the cardiac catheterization lab.) Before the ED project, average door-to-cath-lab time was a respectable 93 minutes. While this time met national standards, it was longer than that of the hospital’s nearby competitor. EMTs transported their most critical patients to the competitor hospital because of its superior door-to-cath-lab time.

A flow process diagram revealed the problem. Patients in the field with a potential diagnosis of myocardial infarction (MI) were evaluated by the EMTs, in consultation with the ED physician. When they arrived in the ED, they were re-evaluated by the ED physician, including completion of lab work and a repeat EKG, before the cath lab team was notified.

Drill-down analysis of outcome data revealed that the EMTs diagnosed MI with nearly 100% accuracy. The delay in calling in the cath lab team cost precious heart muscle-saving time. With the administrator’s support for the potential cost of additional cath lab salaries in case the EMTs’ diagnosis was incorrect, ED staff were encouraged to rely on the EMTs’ field diagnosis and initiate the call to the cath lab team as soon as the EMTs called in from the field. Door-to-cath-lab time plummeted to 10 minutes!

Effecting and sustaining significant change is hard work.
The need for change creates strong emotions in people,
particularly when you are the one who is expected to change.
People experience roller-coaster emotions of fear, loss, and
denial, before reaching acceptance. This is all normal. A
critical function of the Black Belt is to manage people’s
feelings and emotions so improvements occur and are
sustained.

The success of this project had a positive impact across the hospital. All departments and staff learned to value ‘their’ ED as the ‘front door to their hospital.’ At the end of the first year, with an ED diversion time of near zero, the Emergency Department treated over 37,000 patients and realized a gross margin of $18 million.

This was a 38.26% improvement over the previous year.

“Beating Heart” Bypass Grafts

Though altruism and evidence influence medical treatments, economic pressure drives improvement. Multi-million dollar savings created by “beating heart” or “off-pump” coronary artery bypass outcomes are a case in point.

Historically speaking, medical “Six Sigma” style breakthroughs have astonished the world. Near zero death rates related to surgical anesthesia and the polio vaccine’s safety record are but two near perfect success examples. Sir Austin Bradford Hill’s 1951 sentiments sound as fresh as a 21st Century General Electric Six Sigma news release:

“In treating patients with unproved remedies we are, whether we like it or not, experimenting on human beings, and a good experiment well reported may be more ethical and entail less shirking of duty than a poor one.” (Br. Med 2:1088-90, 1951, Hill, 1952)

The ability to consistently replicate experimental outcomes with a high degree of confidence is of paramount importance to everyone in the health care system. Again, off-pump surgical technique provides an ideal compass setting that points the way to breakthroughs. Since health care Six Sigma breakthroughs simultaneously improve the quality of patient outcomes and profitability, “off-pump” coronary artery bypass graft (CABG) projects are substantive.

Limited financial resources fostered the early 1980s development of “beating heart” CABG surgeries in Argentina. Compelling statistical evidence is leading to the reluctant acceptance of this surgical technique in competitive 2003 USA health care markets. Patient demand for this lower-cost, higher-quality procedure has forced, and is forcing, surgeons to master a challenging, higher standard of care.

Again, the classic evidence-based decision cycle, Define, Measure, Analyze, Improve, and Control, provides a convenient way to summarize this story.

Define: For over 40 years, the use of cardiopulmonary bypass (CPB) pumps defined coronary artery bypass grafting (CABG) procedures. Good outcomes and the relative ease of working on an arrested heart led most cardiac surgeons to favor the use of CPB.4 Statistically significant blood utilization and neurological side effects associated with on-pump surgeries were considered to be acceptable, even necessary, collateral damage related to the bypass operation.

Though statistical evidence suggested off-pump operations were safe and advantageous for select patients, the prevailing beliefs of cardiac surgery sustained physician commitment to the on-pump surgical technique. It has taken a decade for surgical practice patterns to emerge that reflect sentiments expressed by researchers in 1992: “Further research should be directed to which subgroups can be operated on to advantage off-pump and which, if any, groups of patients should be confined to on-bypass operations.”5

Patterns and pattern recognition are key elements in the identification of breakthrough improvements. Database and computing systems accelerate both when they are included in an open-system feedback loop. Figure 8 illustrates the classic, standard Six Sigma closed feedback system. The closed feedback loop idea is a serious theoretical error that can be traced to the 1990s pseudoscience of “systems thinking.”6

Closed feedback loops create entropy. Closed feedback systems are driven by opaque spreadsheet analyses and storytelling where 83 percent of the information contained in raw data is suppressed.

Figure 8 The recommended Six Sigma closed-loop feedback system is contrary to evidence-based decisions. Closed loops create entropy.

Evidence-based decisions must have open feedback systems. Open feedback systems depend upon the continuous entry and flow of objective evidence into judgments.

Obviously doctors, nurses, allied health professionals, and administrative leaders are the Six Sigma “executive champions and Master Black Belt” experts who initiate breakthrough improvement actions.

In addition to quantitative, open loop feedback measures, qualitative impressions frequently expose opportunities. In the off-pump/on-pump dialogue, one qualitative signal is the long running practice of opinionated debates between surgeons. Without a commitment to evidence-based decisions, these discussions are generally sustained without referencing or generating statistical evidence for analysis.

Measure and Analyze: Though surgical practice data are often collected by hand, increasingly these data are automatically entered into databases. Integrated statistical software packages now make it possible to analyze measurement data almost as quickly as they are recorded. Figure 9 shows columns and rows of data for a single cardiac surgeon who, after a number of his patients canceled their scheduled on-pump surgeries in order to have them performed off-pump by a different surgeon at a competing hospital, decided to master the off-pump surgical technique.

Figure 9 A data matrix arrays historical data so a vector analysis can be used to identify profit signals. This array documents charges, lengths of stay (LOS) for patients, and type of CABG surgery, either off-pump or on.

The computerized analysis of length of stay data in Figure 10 reflects findings that are similar to the 443 peer-reviewed articles published on the on-pump/off-pump subject since 1992. The peer-reviewed literature on this topic is consistent to a remarkable degree. Patients who undergo off-pump CABG surgeries experience dramatically lower lengths of stay.
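The same two-group comparison is easy to reproduce outside the authors’ software. The sketch below is only an illustration, not the analysis this surgeon’s data actually received: it assumes case-level records in a hypothetical file cabg_cases.csv with hypothetical columns los_days and pump, and uses a standard one-way analysis of variance to compare the groups.

# Minimal sketch of the on-pump versus off-pump length-of-stay comparison.
# File name and column names (cabg_cases.csv, los_days, pump) are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("cabg_cases.csv")
on_pump = df.loc[df["pump"] == "on", "los_days"]
off_pump = df.loc[df["pump"] == "off", "los_days"]

# One-way analysis of variance on the two groups; the p-value plays the same
# role as the 95 percent confidence comparison described in the text.
f_stat, p_value = stats.f_oneway(on_pump, off_pump)
print(f"mean LOS on-pump:  {on_pump.mean():.2f} days")
print(f"mean LOS off-pump: {off_pump.mean():.2f} days")
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")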

Figure 10 The strong profit signal between the lengths of stay for on-pump and off-pump surgeries is eye-catching with a statistically accurate “flying saucer” graph. On the hyperspace vector analysis applied to a data matrix thrill ride, the difference between data sets is significant at the 95% confidence level if the saucers can fly past each other without crashing.


Figure 11 The Profit Signal in patient Lengths of Stay (LOS) was related to off-pump CABG surgeries. These improvements were dramatic.

Literature searches used to cross-check statistical inferences are a value-added service physicians appreciate.

A quality control chart, Figure 11, provides another view of the impact the off-pump surgical technique brings to the quality of patient care. As the average length of stay shrinks, so does variation around the mean.

Since 1931, this pattern has symbolized the classic breakthrough signature of an evidence-based decision. These breakthroughs now lead to near-perfect performances known as Six Sigma.

The surgeon’s database was stratified to facilitate a three-dimensional statistical analysis to consider the effect a number of other factors might have had on length of stay outcomes. Factors we considered were diagnostic (ICD) code variations, co-morbidities, age, gender, and race. An example is shown in Figure 12’s cube plot. The Cartesian coordinate system’s cube is an ideal graphic for presenting multidimensional statistical evidence. The numbers contained in the rectangular boxes at the cube’s corners are average values. Even a novice can interpret the results at a glance.

In Figure 12, all four of the shortest lengths of stay related to CABG are located on the cube’s left plane. The shortest average length of stay, 1.875 days, was the result of off-pump surgery on a male patient with ICD code 36.11. All of the longer lengths of stay are located on the cube’s right plane. The longest average length of stay, 6.875 days, was the effect of on-pump surgeries for men with ICD code 36.12. Though three factors are presented simultaneously, the only statistically significant factor related to a lower length of stay was a surgery performed off-pump.
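A cube plot like Figure 12 boils down to eight averages, one per corner. As a rough sketch under the same assumptions as above (hypothetical file and column names; the original work used the authors’ data matrix software, not this code), the corner averages could be tabulated like this:

# Sketch of the Figure 12 cube plot: average LOS at each corner of a
# 2 x 2 x 2 layout (pump type by gender by ICD code).
# File and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("cabg_cases.csv")
corner_averages = (
    df.groupby(["pump", "gender", "icd_code"])["los_days"]
      .mean()
      .round(3)
)
print(corner_averages)  # eight averages, one for each corner of the cube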

This case study did not include a Pareto chart analysis summary for two reasons. First, the data matrix software used to produce the evidence in this case did not have that feature. In addition, the organization had progressed beyond the need to present data in a simplistic way. Decision makers wanted to look at advanced, Six Sigma-style evidence charts.

Figure 12 Profit signals compare the surgeon against herself. We can say with a 95 percent level of confidence that when off-pump surgeries are used on appropriate patients, they produce medically superior outcomes and lower lengths of stay.

Improve: Sixteen years of experience in promoting breakthrough improvements in health care quality and productivity teach an important lesson. Before changes occur in physician or hospital practice, benefits must be translated into a compelling financial story. Though this reality can be disheartening for caregivers who put patient safety first, leaders must prioritize cost accounting if they expect to see system-wide improvements take place.

Simulation modeling using spreadsheets is a relatively easy data matrix tool to master. The psychological impact of seeing 1,000 or more iterations of multivariate spreadsheet practice scenarios is significant. More often than not, spreadsheet simulations are persuasive.

Figure 13 shows the profit signal’s probable financial impact
for one surgeon. The low end of the forecast’s distribution
suggests that by mastering the off-pump procedure for the
majority of her patients, an additional $448,000 in revenue would
be generated. On the high end of the distribution, this change
could produce as much as $1.45 million.
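The spreadsheet add-in behind Figure 13 works by repeatedly sampling the uncertain inputs and accumulating the resulting revenue figures. A minimal sketch of that idea follows; every distribution and parameter in it is an illustrative placeholder rather than a value from the case study, so it will not reproduce the $448,000 to $1.45 million forecast.

# Rough Monte Carlo sketch of a Figure 13 style revenue forecast.
# All distributions and parameters below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_iterations = 1_000  # iterations, as described in the text

cases_per_year = rng.poisson(lam=120, size=n_iterations)     # hypothetical volume
share_off_pump = rng.uniform(0.5, 0.9, size=n_iterations)    # hypothetical adoption rate
gain_per_case = rng.normal(9_000, 2_500, size=n_iterations)  # hypothetical dollars per case

revenue_gain = cases_per_year * share_off_pump * gain_per_case
low, middle, high = np.percentile(revenue_gain, [5, 50, 95])
print(f"5th percentile:  ${low:,.0f}")
print(f"median:          ${middle:,.0f}")
print(f"95th percentile: ${high:,.0f}")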

Actual results fell near the center of the prediction parameters.


Savings were achieved through lower nursing care costs and
overhead. Off-pump patients avoided adverse side effects
while the hospital enjoyed improved profitability. These
results are classic hallmarks of a Six Sigma style breakthrough.

Figure 13 Spreadsheet add-ins for modeling and simulation are a compelling, persuasive use of the data matrix and profit signal analysis. Revenue gains for off-pump surgeries are predicted to range from a net gain of $448,000 to $1.4 million.

Control: The final step in the Six Sigma DMAIC (Define, Measure, Analyze, Improve and Control) process is to standardize breakthroughs and hold the gains. Discipline is as important to success here as it is with each of the other steps.

Leadership and culture determine the rate of adoption for breakthroughs in productivity and quality. When the medical staff and other senior leaders are disciplined, and when they model the use of science, statistical analysis, and systematic experimentation, breakthrough improvements occur.

Six Sigma culture evolves along with the breakthroughs. The degree of success in every Six Sigma breakthrough is directly related to the level of commitment that is demonstrated by senior leadership.


The Daily Grind

Don worked in the belt grinding department. Day after day, he and his co-workers removed “gate stubs” from metal castings to prepare them for final processing and shipping.

The grinders were paid a handsome hourly rate. The other major expense for the area was the cost of belts. They went through a lot of belts on a typical shift.

Define: If you try to use a belt beyond a certain point, your efficiency in removing metal goes way down. The supplier representative had given the area manager a rule to use for deciding when the grinders should throw a belt away and put on a new one. The rule was called “50% used up”. There were examples of belts that had been “50% used up” hanging on the walls in the grinding area.

The purpose of the rule was to minimize the total expense of the operation. Don thought the rule was wrong. He thought it caused them to discard the belts too soon. He had a hypothesis that using the belts a little longer would reduce the belt expense with no loss of grinding efficiency.

He also suspected that the supplier wanted to sell more belts.


We had no way to evaluate this, so we let it go.

Don had come up with a new rule called “75% used up”. He
proposed doing a designed experiment to determine whether
or not the new rule was more cost effective than the old rule.
We met with Don, the area manager and the supplier rep to
discuss the project.

To our surprise, the supplier rep was vehemently opposed to the project. He said the “50%” rule was based on extensive experimentation and testing at his company’s R&D laboratory. He said we were wasting time trying to “reinvent the wheel”.

Don argued that laboratory tests may not be good predictors
of shop-floor performance. We thought he had a point. We
were also starting to see why he was suspicious of the supplier.

The area manager also thought Don had a good point. He gave the go-ahead for the project. He allowed Don one full day to complete the experiment.

Measure: Don figured he could get 16 castings done in one day. When the other grinders heard about the experiment, they suggested other things that could be tested at the same time. The contact wheels currently used on the grinding tools had a low land-to-groove ratio (LGR). One of the grinders wanted to try a wheel with a higher LGR. Another wanted to try a contact wheel made out of hard rubber instead of metal. A third reminded Don that belts of at least two different grit sizes were routinely used. He felt that both grits should be represented in the experiment to get realistic results.

Table 2 The data matrix for Don’s grinding experiment. There were four factors at two levels each. The response variable was the total cost for each casting divided by the amount of metal removed. The total cost was calculated as labor cost plus belt cost.

Table 2 contains the data matrix for the grinding experiment as it was eventually run.
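For readers who want to see the structure of such a matrix, here is a minimal sketch of a 16-run, four-factor layout using the factor names from this case. The grit labels are assumptions, and the run order in the actual project may well have differed.

# Sketch of a 16-run data matrix for Don's grinding experiment:
# four factors at two levels each, a full 2^4 factorial.
from itertools import product

factors = {
    "USAGE": ["50% used up", "75% used up"],
    "LGR": ["low", "high"],
    "MATL": ["metal", "rubber"],
    "GRIT": ["coarse", "fine"],  # two grit sizes; the actual labels are not given
}

runs = [dict(zip(factors, combination)) for combination in product(*factors.values())]
for run_number, run in enumerate(runs, start=1):
    # The response recorded for each casting would be
    # (labor cost + belt cost) / amount of metal removed.
    print(run_number, run)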

Analyze: An eyeball analysis applied to Table 2 suggested that Don was on to something with his “75% used up”. It also suggested that high land-to-groove (LGR) is better than low, and rubber wheels are worse than metal ones.


Figure 14 Pareto Plot ranking the factors and interactions in the belt grinding experiment by the strength of their profit signals.

But let us not be hasty. Figure 14 shows the Pareto Plot ranking the factors and their interactions by the strength of their profit signals.

The strongest signal was the comparison of steel to rubber contact wheels (MATL). This signal told us that rubber was not a good idea. The next-largest signal was the comparison of the 50% rule to the 75% rule (USAGE). It predicted significant savings in line with Don’s idea. The third-largest signal was the comparison of a low to high land-to-groove ratio for the contact wheel (LGR).

The next two signals involved interactive effects. The message here was that the actual cost reductions from implementing the USAGE and LGR results would be different for the two grit sizes.

Improve: Don’s experiment produced two recommendations:

1. Use his 75% rule instead of the supplier’s 50% rule.

2. Use contact wheels with the higher land-to-groove ratio.

The combined impact of these two changes was a predicted cost reduction of $2.75 per unit of metal removed. This multiplied out to about $900,000 in annual savings.

Don’s recommendations were quickly implemented throughout the grinding department. The actual savings came in a little under the prediction, but everyone was happy. Not bad for a one-day project.

Control: Some degree of cost reduction was achieved by all the grinders, but it did not apply uniformly. There was still a lot of variability in grinder performance. Attacking this variation was the obvious next step. We don’t know if our recommendation was ever implemented.

“Die Tuning” for Vinyl Extrusion

A vinyl extrusion operation receives a “die package” (blueprint) from a customer for a new “profile” (part). The extruder then designs and machines the “die” (tooling) for extruding the profile. The extruder bears the development cost in exchange for a life-of-contract “sole supplier” status.

The process of machining, testing, and revising dies is called die tuning. Each “revision” involves re-machining the die. The average cost per revision is about $2000. The number of revisions required to get a new die ready for production varies unpredictably from 0 to as high as 30. As a result, the total cost varies unpredictably from $2000 (no revisions needed) to something like $50,000 (lots of revisions needed).

An extruder can easily spend $1.5 to $5.8 million or more each year on die tuning. Reducing the dramatic variation in the number of revisions was identified as a project with potentially huge financial benefits.

Define: We started with a “Kaizen-blitz”, a very fast and focused review of the die tuning process. Once the initial machining of a die is completed, a tester runs that die on one of several extrusion lines reserved for testing new die.

Once the production line stabilizes, the tester does visual inspections and measures the control dimensions with a caliper. The inspection results and the dimensions are taken to a revision programmer who determines whether a revision is needed. If it is, the revision programmer sends the die back to the machine shop with a revision sheet describing the needed changes.

The tester is also supposed to determine the best run conditions for the new die. Potentially these factors could include:

• Line speed
• Die-to-calibrator distance
• Calibrator vacuum
• Screw Revolutions Per Minute (RPM)
• Screw oil temperature
• Barrel zone temperatures
• Die zone temperatures
• Melt temperature
• Melt pressure
• Weight

Testers are under time constraints. They adjust some of these variables by trial and error to get the dimensions closer to nominal and improve the cosmetic quality. The variables most commonly adjusted are line speed, die-to-calibrator distance and weight. The other variables tend to remain at “baseline run conditions” assigned before the die is machined.

Our findings were as follows:

1. Die revisions were based on single measurements taken by a hand-held caliper on plastic parts. In all industries the repeatability of such measurements is notoriously bad.

2. The trial-and-error method has virtually no chance of finding good run conditions.

3. Letting testers choose which variables to adjust may have long-term economic consequences. Examples are lowering the line speed or increasing the weight.

Item 1 looked like a possible “smoking gun” for the problem of too many revisions.

We proposed that a small series of designed experiments be made a routine part of die tuning. This would require more time for each revision cycle. But this process held the promise of dramatically reducing the number of revisions. The basic idea was this: before we cut metal again, let’s see if we can “process our way out” of some of the dimensional or cosmetic problems.

We felt the Design of Experiments (DOE) approach could
address all three. Some of the team members wondered how
it could help with Item 1. The answer was that the results of
a DOE are always based on weighted averages rather than
individual measurements. This automatically improves the
reliability of the data used to determine revisions.
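A small illustration of that point: in a two-level design, each estimated effect is a difference between two averages, each taken over half the runs, so the error in any single caliper reading is averaged down. The numbers below are made up solely to show the arithmetic; they are not measurements from this project.

# Why a DOE result is steadier than a single caliper reading: an effect is a
# difference of averages over half the runs each. Data here are illustrative only.
import numpy as np

# Coded settings (-1/+1) for one factor across a hypothetical 8-run design,
# with deviations from nominal in thousandths of an inch as the response.
line_speed = np.array([-1, -1, -1, -1, +1, +1, +1, +1])
deviation = np.array([4.0, 6.0, 5.0, 7.0, 1.0, 3.0, 2.0, 2.0])

effect = deviation[line_speed == +1].mean() - deviation[line_speed == -1].mean()
print(f"line-speed effect = {effect:+.2f} thousandths (each mean averages 4 runs)")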

Measure: For the initial experiment, the team decided to observe four continuous factors: line speed, die-to-calibrator distance, calibrator vacuum and weight.

We used statistical software to generate a data matrix similar to the one shown in the first six columns of Table 6. The matrix in Table 6 is the as-run version, with the weights and calibrator vacuums actually obtained in place of the nominal values in the original matrix. The levels of the four factors are coded to protect proprietary information.

The die in this case had a dual orifice. This means that two
profiles are extruded at the same time. Results for the two
profiles are distinguished in the matrix as Sides 1 and 2.

The responses included 13 control dimensions and a 1-5 distortion rating, where higher is better. The control dimension data are expressed as deviations from nominal in thousandths of an inch.

Table 6 The data matrix that fills the next page is from the die tuning experiment. There were four continuous factors at three levels each. The response variables included 13 control dimensions and a 1-5 distortion rating, where higher is better. Remember, because this table is a data matrix, each column is a single entity or vector. A correct analysis breaks up the variation vector in the cornerstone of evidence into Noise and Profit Signals.

Analyze: A matrix of distribution curves was the result of jointly optimizing all 14 response variables. The statistical software performed this optimization in just a few seconds. Please accept our apologies for the fact that the complexity of this statistical graph exceeds the boundaries of this introductory book. The quick story follows.

Improve: The implications were staggering. By doubling the line speed and reducing material costs by 50 percent, the production line produced perfect-quality product after just one revision and some very minor additional die tuning.

Additional key findings were as follows:

• We were able to run a four-factor die tuning experiment in one day.


• We generated a wealth of information on how each factor affects each response variable. Some results confirmed prior beliefs; others contradicted them.

• We showed that using weight and line speed as adjustment factors in die testing leads to unnecessarily high weights and low line speeds. This locks in unnecessary costs for the life of a contract. It may also contribute to problems with quality, which in turn lead to a larger number of revisions.

A conservative estimate of the annual cost reduction from extending this method to all new die was $1.2 million, half of the current annual budget for die tuning.

Control: The process of changing the way die tuning is done is underway. Similar experiments have been run on other new die with similar results. In one case a die was saved in the nick of time from going back for an incorrect revision that would have spawned further revisions to repair the damage. Much has been accomplished. More is expected.

Endnotes

1. Cartmill, Matt. “Oppressed by Evolution.” Discovery Magazine, March 1998, pages 78-83, as reported by Richard Dawkins on page 20 of his book Unweaving the Rainbow.

2. http://gi.grolier.com/presidents/aae/side/knownot.html

3. Cheryl Payseno, an RN, former hospital administrator, and certified Six Sigma black belt, wrote this case study for us. Cheryl led the charge for the use of Designed Experiments in health care in 1995 with Daniel Sloan. Results from those early innovations were published by the American Society for Quality’s Quality Press.

4. Pfister, Albert J., Zaki, M. Salah, et al. “Coronary Artery Bypass without Cardiopulmonary Bypass.” Ann Thorac Surg 1992; 54:1085-92.

5. Pfister, Albert J., Zaki, M. Salah, et al. “Coronary Artery Bypass without Cardiopulmonary Bypass.” Ann Thorac Surg 1992; 54:1085-92.

6. Senge, Peter M. The Fifth Discipline: The Art and Practice of The Learning Organization. New York: Doubleday Currency, 1990.



Chapter 5

Using Profit Signals

Profit signals show you the money. Profit signal vectors literally and figuratively show you what works best in any business, financial, health care, manufacturing or service process. This chapter explains how vector analysis applied to a data matrix showcases the information contained in raw data. Once the tools have done their job, the graphic presentation of evidence paves the way to breakthroughs in quality, productivity and profitability.

Profit signals are like televisions, radios, cars, telephones and the Internet. They attract attention. People want to play with them. They want to use them. This natural occurrence unsettles old-school managers. Some react like the mythical John Henry: “Before that steam drill shall beat me down, I’ll die with a hammer in my hand.”

Vector analysis applied to a data matrix is the steam engine that humbles them.

The spreadsheet is the first and only computing program many business people learn to use. When all you have is a sledgehammer, everything looks like a spike. Hammering out spreadsheet revisions keeps employees occupied. These people are occupied reworking pro formas and business plans, and trying to explain why actual monthly financial results do not fall exactly on the predicted straight line of a one-dimensional “variance” analysis. Though they are busy, they may not necessarily be productive.

Fortunately, the cornerstone of evidence is appealing. By
constructing a cornerstone-of-evidence tetrahedron using
bamboo skewers as vectors and spheres of Sculpey Clay as
points-in-hyperspace connectors, people can weigh evidence
in their own hands.

The look and feel of an Analysis of Variance tetrahedron in one hand and a single stick in the other converts would-be 21st Century Luddites into evidence-based decision champions. Physical models win hearts. The following are a few of the many reasons why so many former skeptics embrace the use of profit signals to make more money.

1. With profit signals, you have only one formula to remember.

2. With profit signals, you don’t have to solve equations.

3. With profit signals, you can produce 10 times the work in a fraction of the time now spent doing arithmetic with a spreadsheet.

4. Profit signal pictures are aesthetically pleasing.

5. Profit signals help you make more money with less work.

Money, number 5, is THE big reason Six Sigma projects are so popular around the world.

A Better Way to Look At Numbers

Think back to your Five-Minute PhD. In a data matrix, each number is an integral part of an entity called a vector. Each column of numbers is its own vector. Each column is a field or variable with a precise operational definition. Rigorous inductive and deductive reasoning, which dates back to Aristotle, is built into statistical software designed specifically for the data matrix structure. In other words, a data matrix channels the intelligence and logic of the best minds our human species has produced. Each number is framed in the geometric context of a profit signal vector.

Vectors show you the money.

Measurements presented in the rows and columns of a


spreadsheet convey no sense of unity. There is no sense of
purpose. Each number is an orphan locked in its own cell.
Logic takes a back seat to manipulation. Commonsense
relationships between numbers are ignored. A ghost named
Zero inhabits empty cells. Arithmetic is the two-stroke engine
running Abacus Prison. There are no vectors, no arrows,
pointing to the money.

Vectors have physical properties. These properties can be


measured and displayed in three dimensions. We strongly
urge you to actually build a Sculpey-Clay/bamboo skewer
model whenever the dimensions of a vector analysis are
revealed to you in one of our examples. It costs about one
buck for the whole kit.

Corrugated Copters

C. B. Rogers created the helicopter analogy while working


at Digital Equipment in Marlboro, Massachusetts. Professor
George E. P. Box introduced us to it at the University of
Wisconsin, Madison in May 1995. Dr. Box, a Fellow of
the Royal Society and the American Academy of Arts and
Sciences, was the Fisher Professor of Statistics.

Box was also a riveting teacher who taught us that an analysis of variance was so simple, “You could tell the answer just by looking at the numbers on a cube.” He and his colleagues used the helicopter in Figure 1 to illustrate.

Please take a moment to build one now so you can follow along with our data explanation. First, tear a piece of 8.5 inch by 11 inch paper in half, lengthwise. If you have a pair of scissors and quality paper, use them. These tools will make the construction process more satisfying. The results will be more rewarding. If you are in a hurry, tearing paper works fine.

Figure 1 This inexpensive product is an analogy that works well for teaching data matrix and vector analysis principles to people in all industries.

Next, cut or tear the top section to form the “blades.” Finally, follow the folds at the bottom to form the helicopter’s fuselage, or body. You may tape the body to give it some rigidity if you like. Hold the finished product with the blades perpendicular and away from the body at shoulder height.
Let it drop. Like seeds from a maple tree, the blades will catch
air while the aircraft spins to the ground. This is fun to do
and fun to watch. Now, time the flight using the black, blue,
purple, or pink plastic digital chronometer you wear on your
wrist.

For this game, each helicopter costs $9 million to build. For each second of additional flight time, customers are willing to pay an additional $1 million in price. Longer flight times are worth quite a bit more money than shorter flight times.

Corrugated Copters learned a big lesson when their company was founded in 1996. Eliminate all costs associated with takeoffs and you can really make money.1 Their original corporate slogan was, “Drive down costs!” Their current, more enlightened view is wordier: “The best way is the most profitable way.” This saying has become a ritual chant that opens all management meetings.

Take a moment now to draw a Six Sigma Supply, Input,
Process, Output, and Customer (SIPOC) flow diagram. You
will learn Corrugated Copters is a behemoth that demands
global logistical support. The paper began as a seed that was
planted on a tree farm in the Pacific Northwestern United
States in 1948. The quality and cost of that tree affects the
quality and cost of your building materials.

One company cuts down the tree, the Supply. Another ships
it as Input to the pulp mill. The pulp mill Process creates
the paper. The packaged Output is sold to its wholesale
Customer. Corrugated Copters is the retail customer who
buys it from the wholesale customer. You and your products
are parts of a system.

The company’s measuring device is a five-mode wristwatch


with alarms. It breaks hours into hundredths of a second. It
used to be silicon, some oil, and ore. Since time is money,
and money is time, the calibration of this instrument is
exceptionally important. Accuracy matters.

One of your employees has created a Lean flow diagram


to show the entire value stream for your watch. The most
efficient routes for delivering these devices to your engineers
are annotated with dollars, times, and inventory turns. The
store that supplies this watch keeps a supply of them on hand
just in case you need a new one in a hurry. Just-In-Time has
eliminated almost all of Corrugated Copter’s inventory costs.

The pen or pencil you used to record your measurements also has an informative SIPOC diagram archived for reference in the event another new Six Sigma breakthrough is needed.

Last and certainly not least, the brains behind Corrugated


Copter’s success have been, to varying degrees, educated. Not
everyone is cut out to be a helicopter pilot. Not everyone
could hope to be a timer. It almost goes without saying that
collecting data is a big job. The analysis of that data is yet
another specialized task that has its own job classification.

Complexity surrounds Corrugated Copters. The market is filled with uncertainty and risk.


Testing the Current Way of Doing Things

Avona Sextant, a Corrugated Copter senior executive, has a


Five-Minute PhD. Avona is often called upon to facilitate
meetings. When she is not in the room, Copter teams seem to
argue amongst themselves. When Avona joins their dialogue,
teams just naturally converge on answers that lead to a
consensus and a “path forward.”

More than anyone else in the company, Avona is committed


to evidence-based decisions. Some suspect her peculiar
predisposition is a genetic disorder. In any case, Avona will
listen only to stories that have evidence in their punch lines.
Bets are routinely placed, money is won and lost, over when
and how many times she will say the word “evidence” in a
meeting.

Some think Avona is goofy. Others think she is crazy like a


fox. Though there is resistance to her methods, no one argues
with her fundamental point of view, “The best way is the
most profitable way.” On this they are in full agreement. The
problem is how to determine which way is best. On this there
is a considerable amount of debate.

Some employees have heard quite enough of her New


Management Equation speech. They suspect that Avona’s
little formula for calculating Chance variation only works
with simple numbers like 3, 4, and 5. They also know this Six
Sigma stuff is a passing fad. They are going to wait it out and
hope for the best.

During a recent productivity breakthrough, the mid-management team of Tom, Dick and Mary produced a double-digit flight time! Just yesterday they booked a record-breaking 10 seconds. They are proud of themselves and bragging when Avona walks in.

“Ten seconds. What an awesome and terrific flight time!” Avona cheers. “That’s worth ten million dollars in gross revenue. That would be a profit of one million dollars.” She adds, “I can’t wait to see the rest of your evidence.”

“Evidence?” asks the team.


“This is so exciting. You must have flown this machine more
than once. I just want to see your other measurements. If we
average more than 9 seconds when we launch the product line
I will be euphoric. Otherwise we won’t make any money in
the long run.”

The team showed her all their data: 9 seconds, 8.9 seconds
and 10 seconds.

“Oh, I see you’re using a spreadsheet,” said Avona. “I used to use one of those. Plus I also had an abacus for my backup system.”

Avona’s aunt in Hong Kong taught her how to use an abacus


when she was a little girl. The abacus was the world’s first
computing system.2 Because her abacus was a Chinese rather
than a Japanese machine, she learned long ago to translate
binary numbers into regular old numbers and back again
with the flick of her right index finger. When Avona first saw
vector analysis applied to a data matrix, she knew the time
had finally come to retire her abacus and her spreadsheet too.

Avona was still waiting for her statistical software purchase


order to be approved. In the meantime, she had programmed
a worksheet with vector-analysis formulas built into the
cells. With her templates, people didn’t have to type in
any formulas. She input the three data points. Her Excel
spreadsheet immediately produced the vector analysis
displayed in Table 1.

“Our objective of 9 seconds is a fixed number rather than a measurement. So the first step is to subtract it from the raw data,” she explained. “This gives us the Differences vector.” She drew the picture in Figure 2 to illustrate the vector analysis of the difference data in Table 1.

Mary observed, “Is that what accounting calls a variance?”

“Good call Mary. Yes it is.”

“I don’t think that department ever thought of a column of numbers as a vector,” said Dick.

“They do now. The new Six Sigma Black Belts in Accounting are changing history!” said Avona. “I have no idea how they are going to solve the 1,000-year-old waste and rework problems related to the 14th Century’s double-entry bookkeeping system. Our Black Belt CPA Peruzzi told me the tip-off for her was the word “double”. Get it? ‘Double entry? Rework entry?’ Well, Peruzzi is convinced the entire double-entry ‘bookkeeping system’ is nothing more than a massive hidden factory loop. With a properly designed data matrix, the second entry is needless rework.

“Our Black Belt CPAs are arraying entries into a data matrix. Some have already doubled their personal productivity. It is a marvel what Six Sigma education and training can do, even for a Merchant of Venice.”

Table 1 Vector analysis for testing the current helicopter design against the performance objective, 9 seconds. The raw data are flight times in seconds. The profit signal coincides with the average difference from the 9 second objective. It has one degree of freedom because it is determined by a single number—its average—0.3. Noise is calculated by subtracting the Profit Signal value from the respective value in the difference vector. This arithmetic is a Law of the Universe. See Figure 2.

“See how the squared length of the difference vector, 1.01, is equal to the sum of the squared lengths 0.27 and 0.74?” she beamed (Table 1). “That’s how the New Management Equation works. Everything always adds up. It’s wonderful.”
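Readers following along at a keyboard can check Avona’s arithmetic directly. A minimal sketch using the team’s three flight times:

# Numerical check of the Table 1 vector analysis: three flight times
# (9, 8.9 and 10 seconds) measured against the 9-second objective.
import numpy as np

times = np.array([9.0, 8.9, 10.0])
objective = 9.0

differences = times - objective                    # [ 0.0, -0.1,  1.0]
profit_signal = np.full(3, differences.mean())     # [ 0.3,  0.3,  0.3]
noise = differences - profit_signal                # [-0.3, -0.4,  0.7]

print("squared length of differences:  ", round(np.sum(differences ** 2), 2))    # 1.01
print("squared length of profit signal:", round(np.sum(profit_signal ** 2), 2))  # 0.27
print("squared length of noise:        ", round(np.sum(noise ** 2), 2))          # 0.74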

“Oh come on Avona,” chided Mary. “Having those numbers add up is no big deal.”

Figure 2 This is the picture of the key vectors in Table 1.

“Oh it is. It is!” said Avona. “Did you notice that when you
multiply a minus times a minus, the sign becomes a plus? And
look how confusing that –0.4 in the Noise vector column is. I
always have a hard time remembering that a negative number
like –0.1, minus a positive number like 0.3 turns out to be a
bigger negative number!”

“The last time I saw this stuff was when I had to learn to use a slide rule in Mrs. Beamer’s algebra period,” complained Tom.

Though the team was tired of Avona’s boundless enthusiasm, with a push of the square root button on their calculators they could see that the sample standard deviation—the square root of the 0.37 Variance—was about 0.6 seconds. Their average flight time was about 9.3. Even if these numbers perfectly described what would happen in long-run production, future times would vary. Laws of the Universe strike again.

The best the team could expect was that future flight times would be unlikely to fall below a lower “three-sigma limit” of about 7.5 seconds [7.5 = 9.3 – (3 × 0.6)].

To make matters worse, Avona started talking about evidence. “The null hypothesis here is that our future average flight time will be 9 seconds. We want to disprove this hypothesis. We want the average flight time to be higher. We want to make money.

“The data will show this is true if and only if the p-value is small enough. By international standards, a p-value less than 0.05 gives ‘clear and convincing’ evidence against the null hypothesis (Table 2). A p-value less than 0.15 gives a ‘preponderance of evidence’ against the null hypothesis. Our p-value is 0.428, not even close to the lowest standard. This means there is no evidence at all that the average flight time is significantly different from 9 seconds. These differences are probably due to Chance. It is a Law of the Universe.”

Table 2 Standards of evidence table.

To illustrate the implications of her conclusion, Avona drew the picture in Figure 3.

“There is no signal here, just a lot of noise in our system.


Depending on variations in the weather, wind, pilot, paper,
timing device, and construction, the time could vary up to
10.8 and all the way down to 7.2 seconds.

“Also, if the mean is exactly 9 seconds, our average gross revenue will be exactly equal to our cost, $9 million. We will be making money on half, and losing money on the other half. This is not good. This means our long-run profit will be zero, 0. The best way is the most profitable way.”

The room was quiet.


Figure 3 The Normal distribution of flight times if the mean is 9 seconds and the standard deviation is 0.6 seconds.

Everyone had taken a liking to Avona’s signal/noise analogy


months ago. They all agreed with her interpretation of their
data. As usual, the team ended their argument with an
agreement.

Ten was an exciting, encouraging number. But they needed to know more before they could launch the new product. There were two possibilities:

(1) The problem might just be the small sample size of 3. They could do more tests of the current design to strengthen the signal and reduce the noise. This would let them determine the average flight time with greater accuracy.

(2) They might need to go back to the drawing board and find a way to further increase the flight time.

For 2500 years the right triangle has shown us the route to
profitability. Ancient Greek mariners used the sextant to
navigate the Mediterranean Sea’s lucrative markets.

In his little book, Posterior Analytics, Aristotle equated the


right triangle with truth.3 Applying vector analysis to a data
matrix on a regular basis is a good way for today’s seekers of
truth to learn about Aristotle’s principles. Six Sigma experts
know that the New Management Equation discovered by an
old Greek named Pythagoras 2500 years ago is worth billions
of dollars today.

In analysis, as in telecommunications, customers want a
strong signal. Just like Avona, communications engineers
from Marconi in 1901 to Nokia in 2003 have appreciated the
value of a high signal-to-noise ratio. The data matrices and
vector analyses employed by engineers differ only superficially
from the matrix and vectors you used to earn your Five-
Minute PhD.

Overcoming Obstacles

“Science phobia is contagious,” wrote Carl Sagan, an astronomer and television celebrity, just prior to his death in 1996.4 “Some people consider science arrogant—especially when it purports to contradict beliefs of long standing or when it introduces bizarre concepts that seem contradictory to common sense. Like an earthquake that rattles our faith in the very ground we’re standing on, challenging our accustomed beliefs, shaking the doctrines we have grown to rely upon can be profoundly disturbing.”5

The transparent analysis principles in the cornerstone


of evidence shake the foundations of business decisions.
Executives may find the data matrix and vector analysis
distressing. Until they get the hang of using these tools, both
concepts tend to terrify cost-accounting analysts. Once on
board, these executives and analysts become vital assets for
breakthrough project teams.

Math phobia is another, even more daunting obstacle on


the high road to evidence-based decisions. In our consulting
practices over the past 20 years, we have found that some of
the people who fear math most are Accountants, Controllers,
Financial Analysts, Chief Financial Officers, Chief Operating
Officers, and Chief Executive Officers. Many corporate
officers “did not do well in high school algebra.” Take Daniel
Sloan for instance:

“Numerical dyslexia, reversing numbers instead of letters, has plagued me since I memorized my times tables in Mrs. Peiffer’s fourth grade classroom. I can no more do math in my head than I can read the letters at the bottom of an eye chart without my glasses. I must wear glasses to see. I must use a computer to do math. Confronting math phobia was the most painful, anxiety-provoking, downright embarrassing, and humiliating career step I ever took.

“Overcoming my math phobia was a more strenuous challenge


than all of my five years as a Vice President of Marketing,
publishing five peer-reviewed statistical textbooks, my stint as
a Senior Vice President in a publicly traded, $500 million
corporation, and founding and running my own business for 14
years. It has been as rewarding as it has been difficult. One of the
best things success has given me is the opportunity to help other
business leaders like me take that frightening first step forward.”

Larger than science and math phobias combined is the fear of losing one’s job. Experience shows that money motivates. It can and does persuade executives and line workers alike to face and overcome both these phobias. Six Sigma is a cultural business force that compels people to step up to a difficult task.

Privacy is exceptionally important to adult learning.


Computerized personal learning programs deliver privacy.
They are available for adults who suffer from science
and math phobias. Math Blaster, Alge-Blaster, Pro-One’s
CD-ROM multi-media course Mathematics, and many
other programs are great ways to re-learn the principles of
addition, subtraction, multiplication, division and the order
of operations. They are fun. Private tutors and educational
consultants are other options that work well.

The best news for executives and workers alike is that cheap,
reliable, and very user-friendly software makes vector analysis
as easy to learn as sending an E-mail.

Comparing Two Ways of Doing Things

“Hey Avona!” shouted Tom. “We think we have some evidence you are going to like. Just look at this stack of numbers.”

Avona’s eyes opened wide. “Way to go. It looks like there might be a genuine difference between the two different helicopter designs.” Still not having her statistical program, she entered the data into one of her spreadsheet templates and showed them the vector analysis in Table 3.

“I can’t believe it took me an hour to program this worksheet template so it will act like a data matrix,” Avona complained. “What a waste of time. I sure hope the purchase order for my statistical software gets approved soon.” Avona loved evidence, but patience was not her long suit.

Table 3 Vector analysis for comparing two helicopter designs. The raw data are flight times minus
the objective of 9 seconds. The profit signal consists of the average variation for each design. The
average variation for white helicopters is –0.2 seconds of flight time. The average variation for
pink helicopters is 0.2 seconds of flight time. The Profit Signal Vector has one degree of freedom
because a single number, 0.2, determines it. When the numbers in this column are squared, the
minus sign disappears. The squared lengths of all the vectors are connected by their part in the New
Management Equation (NME).
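The decomposition summarized in Table 3 can be reproduced for any two-group data set. The sketch below is generic because the individual flight times behind Table 3 are not listed in the text; it simply takes the two groups as inputs and prints the squared lengths so the New Management Equation check can be seen.

# Generic sketch of a two-design vector analysis: raw data are flight times
# minus the 9-second objective, one array per helicopter design.
import numpy as np

def vector_analysis(design_a, design_b):
    design_a = np.asarray(design_a, dtype=float)
    design_b = np.asarray(design_b, dtype=float)
    y = np.concatenate([design_a, design_b])            # raw data vector
    data_average = np.full_like(y, y.mean())            # data average vector
    variation = y - data_average                        # variation vector
    profit_signal = np.concatenate([                    # each design's average variation
        np.full_like(design_a, design_a.mean() - y.mean()),
        np.full_like(design_b, design_b.mean() - y.mean()),
    ])
    noise = variation - profit_signal                   # what the signal cannot explain
    for name, vector in [("raw data", y), ("data average", data_average),
                         ("variation", variation), ("profit signal", profit_signal),
                         ("noise", noise)]:
        print(f"{name:13s} squared length = {np.sum(vector ** 2):.2f}")

# Calling vector_analysis() with the white and pink flight-time differences
# would reproduce the squared lengths Avona quotes later in this chapter.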

“So Table 3 is where your ‘cornerstone of evidence’ comes from?” asked Mary.

“Right. Just look at my models (Figure 4). The labeled edges correspond to the vectors in Table 2. We can make a model of your data and our new Analysis of Variance using some bamboo skewers and Sculpey Clay. I just happen to have a supply in my desk drawer.”


Figure 4 The cornerstone of evidence represents any vector analysis. A Polydron regular tetrahedron model is next to a cornerstone of evidence. Differences in raw data change the dimensions. It is a Law of the Universe.

Avona played with all sorts of modeling toys. Her office was
filled with them. She told people they were symbolic. She
would go on and on to anyone who would listen about some
artist named Alexander Calder.

“Let’s use my $1 handheld calculator to help us cut the bamboo skewers to length. We will use inches as the units. The length of the raw data vector is the square root of 8.38, which equals 2.89 inches. The length of the data average vector is the square root of 8.00, which equals 2.83. The length of the variation vector is the square root of 0.38, which equals 0.62 inches. The length of the profit signal vector is the square root of 0.32, which equals 0.57 inches. The length of the noise vector is the square root of 0.06, which equals 0.24 inches. The noise is so short it will be buried completely in the Sculpey Clay. The profit signal and noise vectors are the fine print in a vector analysis.

“I sure wish we had our data matrix software. It is silly for us to use a hand calculator.

“We haven’t talked about that last vector in the back of the
tetrahedron. This is the vector of hypothetical predicted
values. I didn’t include it in my spreadsheet templates because
it isn’t important in the type of experiments we’ve been doing.
It’s tremendously important in response surface experiments.
That’s where we are optimizing over several continuous
variables.


“Anyway, we get the prediction vector by adding together the profit signal and data average vectors (Table 4). It is always just a tad shorter than the raw data vector. In this case, the length of the prediction vector is the square root of 8.32, which equals 2.88 inches.”

Table 4 The vector of hypothetical predicted values is the sum of the profit signal and data average vectors. It gives the predicted average flight times for the two designs. It has two degrees of freedom because it is determined by two numbers.

“It sure is colorful,” noted Dick, looking at their model.


“Could we have hot pink Sculpey Clay points in space instead
of green ones?”

“We sure can. I think Sculpey Clay is a Six Sigma product,”


Mary hypothesized. “Say. I just realized if you set one of those
up on its end, it even looks like a radio tower sending out
profit signals. I am going to need to bake mine in the break
room toaster oven for a few minutes so the clay firms up and
holds onto the vector skewers.”

“Wow, look at that p-value in the table,” said Tom, tearing his gaze away from Mary’s profit signals radio tower. “There really is a difference between the two designs.”

“Absolutely right,” said Avona. “The null hypothesis is that there is no difference between the designs. A p-value less than 0.01 gives evidence beyond a reasonable doubt against the null hypothesis. We are shredding that straw man like a mogul field at Mount Baker.

“The spreadsheet actually has a formula called FDIST that calculates the p-value. It was named after Ronald Fisher. See, you just plug in the F ratio value, one degree of freedom for the profit signal and three degrees of freedom for the noise vector and voilà. There is hardly any noise in this data at all. It is almost all profit signal!
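For readers working outside a spreadsheet, the FDIST calculation Avona describes is the upper-tail probability of the F distribution; a minimal sketch, assuming SciPy is available:

# The spreadsheet's FDIST(F, df_signal, df_noise) is the upper-tail probability
# of the F distribution; SciPy's equivalent is the survival function f.sf.
from scipy.stats import f

def p_value(f_ratio, df_signal, df_noise):
    # Probability of an F ratio at least this large arising from Chance alone
    return f.sf(f_ratio, df_signal, df_noise)

# Example: p_value(f_ratio, 1, 3) mirrors Avona's FDIST call for this analysis.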

“By subtracting the p-value 0.001 from the number 1, we get the number 0.999. This means we can be 99.9% confident that there is a difference between the pink and white designs. Even though the average difference is only 0.4 seconds, the vector analysis is sensitive enough to detect it. Plus, that’s another $400,000 in profit per helicopter. Phenomenal work, team!

“So, which design works best?”

“What is most profitable is best!” Tom, Dick and Mary sang out. “Pink helicopters are best.”

“It certainly looks that way,” said Avona. “But before we release the pink design to production, let’s do a confirmation experiment. And while we’re at it, let’s include the green design in the comparison. We don’t have much data on that. See you guys later.”

“Gee whiz Mary,” said Tom after Avona had left. “Everyone
can see pink helicopters are best. Why is Avona such a stick-
in-the-mud? And why does she keep saying ‘we’ when she
really means us?”

“Just be grateful she didn’t talk about evidence again.”

Comparing Three Ways of Doing Things

“Wow! I think we are onto something with these pink helicopters,” said Dick. “We even checked them against green helicopters. They still came out best.”

“And what is best is most profitable,” said Avona. “Let’s plug
your numbers into my spreadsheet template. I want to show
it to Rotcev Sisylana, our new CEO from Uzbekistan. He’s
gonna love this. Maybe he will get me two copies of my data
matrix software. Shoot, they cost less than a thousand dollars.
I wasted more than that last week dinking around with my
spreadsheet templates.”

Avona’s analysis is presented in Table 5. The null hypothesis is that all three designs will have the same average flight time. The p-value of 0.004 says there is evidence beyond a reasonable doubt that this is false. In other words, at least one of the designs is significantly different from another. Which one is best? From the profit signal, we can see that the pink design flies 0.325 seconds longer than the average flight time of 0.9 seconds. The white and green design flight times are 0.125 and 0.200 seconds shorter than average. Once again, pink is best.

Table 5 Vector analysis for comparing three helicopter designs. The raw data are flight times minus the objective of 9 seconds. The profit signal consists of the average variation for each design. It has two degrees of freedom because it is determined by two numbers, -0.125 and -0.200 in this case. The third number, 0.325, is minus the sum of these two.

After reviewing the results in Table 5, Dick observed, “This table looks just like all the others except it’s taller.”

“Thank you, Dick,” Avona responded. “Can I see those helicopters firsthand? I would love to watch them fly.”

After carefully observing a few flights, she noticed something the others had missed. “Have you noticed that the pink helicopters have longer blades than the white and green ones?”

“What?” blurted Tom and Mary. “We never noticed that before! Maybe it’s actually the longer blades that cause the longer flight times.”

“Of course,” added Dick. “It’s obvious that flight time should
depend on blade length, not on color.” Tom and Mary
said nothing, but they each wondered why Dick had not
mentioned this “obvious” thing earlier.

“Do we have to start over, Avona?” asked Tom, Dick, and Mary.

“Not completely. But we are wasting time and money by


analyzing only one factor at a time. We’ve spent $216 million
and we still don’t know anything about our other product
features. When I got my PhD, I learned that the way to
maximize the evidence in an experiment is to study several
factors at the same time. Let’s do a cube experiment!”

“Oh no,” whispered Mary to Tom. “It’s bad enough when she
talks about evidence. Now it’s cubes.”

Comparing Eight Ways of Doing Things

But Avona was right. The cube experiment they decided to run had three factors: color, paper-clip ballast, and blade length. As shown in Table 6, each factor had two levels (settings or choices).


Table 6 The data matrix for the cube experiment run by Avona, Tom, Dick and Mary.

Avona had lost patience with her senior management peers.


She had finally purchased her own copy of the statistical
software and installed it on her laptop. As a point of
comparison, she first showed everyone the vector analysis in
her spreadsheet template (Table 7).

“I notice this table is just the same as the others, except it’s
wider.”

“Thank you, Richard. Anyway, it looks like we have two statistically significant profit signals. The p-values for paper clip ballast and blade length are 0.047 and 0.028, respectively. By looking at the profit signal vector for paper clip (Y), we can see that not adding the weight to the helicopter adds 0.12 seconds to the overall average flight time. Also, we can see that adding the weight subtracts 0.12 seconds from the overall average flight time. Overall, this means that not adding the weight to the helicopter increases the average flight time by 0.24 seconds compared to adding the weight. That’s $240,000 additional profit per helicopter sold.

“By looking at the profit signal vector for blade length (Z), we can see that using the short blade subtracts 0.20 seconds from the overall average flight time. Also, we can see that using the long blade adds 0.20 seconds to the overall average flight time. Overall, this means that using the long blade instead of the short blade increases the average flight time by 0.40 seconds. That’s $400,000 additional profit per helicopter sold.

“The combined effect of these two changes is an increase of 0.64 seconds. This means a total of $640,000 additional profit per helicopter sold. We’ll make millions.”

Table 7 Vector analysis for the cube experiment run by Avona, Tom, Dick and Mary. The raw data are flight times minus the objective of 9 seconds.

Tom asked, “I know that X, Y and Z are code names for the
three factors. But what do XY, XZ and YZ mean?”

Avona said, “They are code names for the interactive effects
among the factors. An interactive effect exists when the
effect of one factor depends on the level (choice or setting)
of another factor. In this case there were no significant
interactions. Usually there are.”

Mary asked, “Is that why it was OK to just add together the
effects of paper clip and blade length?”

“Exactly!”
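For readers who want to verify this arithmetic, the main effects and interactive effects of a 2^3 data matrix can be computed directly from coded columns. The sketch below leaves the flight-time column as an input because the team’s eight measurements appear only in Table 6.

# Sketch of the cube-experiment arithmetic: effects from a 2^3 data matrix
# with coded levels (-1/+1). X = color, Y = paper clip, Z = blade length.
import numpy as np
from itertools import product

design = np.array(list(product([-1, 1], repeat=3)))  # 8 runs by 3 columns
X, Y, Z = design.T

def effect(column, response):
    # Average response at the +1 level minus average at the -1 level
    return response[column == 1].mean() - response[column == -1].mean()

def analyze(flight_time_minus_9):
    columns = {"X": X, "Y": Y, "Z": Z, "XY": X * Y, "XZ": X * Z, "YZ": Y * Z}
    for name, column in columns.items():
        print(f"{name:2s} effect = {effect(column, flight_time_minus_9):+.3f} seconds")

# Supplying the eight recorded times (minus 9) in the matching run order
# yields effects comparable to the 0.24 and 0.40 seconds discussed above.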

Next, Avona opened her statistical software and clicked her mouse a few times. Up came the Pareto chart in Figure 5.


Figure 5 Modern statistical software presents analysis results as pictures. Everyone can see just by looking which factors make the biggest difference.

Everyone was taken aback to see Avona use a bar chart. “For
heaven’s sakes Avona,” cried Mary. “Have you become a bar
chart bamboozler?”

“Not really. It’s just that modern software manufacturers are smarter than they used to be. They found out customers wanted a quick visual analysis of which factors have the largest effects.”

“Rock on!” shouted Tom, Dick, and Mary.

Comparing 256 Ways of Doing Things

“Rotcev wants us to test eight different variables,” complained Mary. “That is 2^8, or 256, combinations. That would cost us $9 million times 256, or $2.3 billion!”

“Good thinking,” said Avona. “But with our data matrix software we can screen all eight factors with only 16 helicopters. That would cut our R&D costs by 94 percent.”

The team built 16 helicopters with different configurations using two different levels of Rotcev’s 8 factors: paper type, body width, body length, blade length, paper clip, aerodynamic folding, wing tape, and body tape. Figure 6 shows the data matrix for the experiment, including the flight times that were obtained.
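The hyperspace geometry mentioned in Figure 6 corresponds to a 16-run fraction of the full 2^8 design. A minimal sketch of one standard way to construct such a fraction follows; the generator choice and the mapping of factors to columns are assumptions, not necessarily what the team’s software produced.

# Sketch of a 16-run screening design for 8 two-level factors, a 2^(8-4)
# fraction built from one standard set of resolution IV generators.
import numpy as np
from itertools import product

base = np.array(list(product([-1, 1], repeat=4)))  # full 2^4 in factors A, B, C, D
A, B, C, D = base.T
E, F, G, H = B * C * D, A * C * D, A * B * D, A * B * C  # generated columns

design = np.column_stack([A, B, C, D, E, F, G, H])
factor_names = ["paper type", "body width", "body length", "blade length",
                "paper clip", "aerodynamic folding", "wing tape", "body tape"]
print(factor_names)   # the assignment of names to columns here is arbitrary
print(design)         # 16 rows, one helicopter configuration per row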


Figure 6 Statistical software automatically determines the hyperspace geometry for testing eight different variables simultaneously using only 16 experiments.

The software calculated the vector analysis in less time than it took to click the mouse. The Pareto chart ranking the eight factors by strength of signal is shown in Figure 7.

Figure 7 Statistical software automatically rank-orders each factor according to the size of its Profit Signal strength.

“If I read this right,” observed Dick, “it looks like we could be over-engineering our product. Very few of the other factors, including the expensive paper, make a difference.”

“Very astute thinking Dick,” complimented Mary. “I think


you just figured out a few good ways for us to make more
money.”

“Rotcev needs to meet this team and hear about these results
soon,” said Avona.


Chapter Homework

Think of these two elements—profit signals and noise—by


using your cell phone as an analogy. Strong signals are easy to
understand. Noise or static is impossible to decipher.
The strong signals in our exercise data matrix came from the
two factors that influenced the outcome.

Noise has its own vector. Noise, Chance or random


variation is a phenomenon of nature. Variation surrounds
every measurement and measurement system. Variation is
everywhere, always.

For example, weigh yourself on a bathroom scale and record


this measurement. Wait a few moments and weigh yourself
again. Weigh yourself every hour and keep a running record
throughout the day. You will discover that your weight
may vary by as much as six to 10 pounds per day. Why?
Everything varies including your weight and the system used
to measure it.

A data matrix and the rules of a vector analysis sort profit


signals from noise. Statistical evidence is a ratio. Evidence is
the length of the profit vector divided by the length of the
average noise vector. Evidence, when used to make business
decisions, leads to consistently reliable predictions and Six
Sigma style profits.
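
As a rough sketch of that ratio in Python with numpy (the two groups of flight times below are made up purely for illustration), the raw data can be split into an average vector, a profit signal vector and a noise vector, and the evidence compares the squared signal length to the average squared noise length:

import numpy as np

# Hypothetical flight times for two ways of doing things
y = np.array([8.8, 9.0, 8.9,      # first way
              9.3, 9.5, 9.4])     # second way
group = np.array([0, 0, 0, 1, 1, 1])

average = np.full_like(y, y.mean())                                   # average vector
signal = np.array([y[group == g].mean() for g in group]) - y.mean()   # profit signal vector
noise = y - average - signal                                          # noise vector
assert np.allclose(y, average + signal + noise)                       # the decomposition adds up

# The F ratio divides each squared length by its degrees of freedom
F = (np.sum(signal**2) / 1) / (np.sum(noise**2) / 4)
print(round(F, 1))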

The right triangle vector illustrations in this book show how


all measurements, all data sets, can be decomposed into these
two parts. In this way, facts can be seen by anyone at a glance.
There is no need to work equations. So long as you stick
with the inherent discipline of a data matrix, you and your
colleagues will simply be able to see the answers to problems,
be they complex or simple as pie.

The deceptive simplicity of 2³ cube arrays makes visual,


intuitive, correct, and statistically significant inferences
possible. It makes analysis fast. Deliberate analytic speed saves
enormous amounts of time.


Closing Arguments

Orville Wright, one-half of the team that used The New


Management Equation to create the airplane, comments on
the use of data:

“I have myself sometimes found it difficult to let the lines run


where they will, instead of running them where I think they
ought to go. My conclusion is that it is safest to follow the
observations exactly.”6

Endnotes

1. Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. Milwaukee, ASQ Quality Press, 1997.

2. Dilson, Jesse. The Abacus, The World’s First Computing System: Where It Comes From, How It Works, and How to Perform Mathematical Feats Great and Small. New York, St. Martin’s Press, 1968.

3. A New Aristotle Reader, Edited by J.L. Ackrill. Princeton, Princeton University Press, 1987. Page 39.

4. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. New York, Ballantine Books, 1996. Page 328.

5. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. New York, Ballantine Books, 1996. Page 32.

6. Jakab, Peter L. Visions of a Flying Machine, The Wright Brothers and the Process of Invention. Washington, Smithsonian Institution Press, 1990. Page 140.



Chapter 6

Predicting Profits

Making accurate predictions is an important, difficult
task. By now, you may not be surprised to learn
vector analysis is the international standard for
making predictions as well as for making comparisons. This
is good news for Corrugated Copters and your company too.
The vector analysis methods for solving prediction problems
are known as regression modeling and analysis.

Before returning to the exploits of our Six Sigma


breakthrough project heroes Mary, Dick, Tom, Avona, and
Rotcev an orientation to basic correlation and regression
concepts is in order.

“Correlation assesses the tendency of one measure to vary


in concert with another,” wrote Stephen Jay Gould in The
Mismeasure of Man. “The invalid assumption that correlation
implies causation is probably among the two or three most
serious and common errors of human reasoning.”1

Experienced managers candidly acknowledge that cost-


accounting variance analysis is based on this faulty premise.
No wonder old school spreadsheet forecasts bear so little
relationship to actual business sales, revenue, inventory,
earning, and other performance metrics.

What a wonder. Month after month, trillions of dollars in


corporate and governmental resources are squandered trying
to explain prediction errors that are inevitable. Never is it
recognized that the one-dimensional “prediction” methods
mechanized in spreadsheets and institutionalized by business
school curriculums were made obsolete in 1890 by Charles
Darwin’s half-cousin Francis Galton. In his 1890 essay
Kinship and Correlation, Galton wrote,

“Few intellectual pleasures are more keen than those enjoyed


by a person who, while he is occupied in some special inquiry,
suddenly perceives that it admits of a wide Generalization,
and that his results hold good in previously-unsuspected
directions. The Generalization of which I am about to speak
arose in this way.”2

Though Galton did not capitalize the word Generalization


in this instance, we did so readers could see he was speaking
about a true Law of the Universe.

Fingerprint Evidence

It is an entertaining and obscure footnote in the history of


evidence-based decisions that by 1893, the same year the
grandfather mentioned in our Premise—the man who used
paper bags and arithmetic to cipher out his farm’s business
transactions because he didn’t trust the new fangled way
of doing things called multiplication—the genius Galton
was pioneering the use of fingerprints as forensic evidence.3
This breakthrough soon found its way into courtrooms of
judgment and justice around the world.

The graphic statistical results of vector analysis, applied to a data


matrix, are fingerprints. They are the fingerprints every process
leaves behind. Each fingerprint data set exhibits unique
swirls, bifurcations, endings and statistically significant
patterns. This technology, this Generalization, is the
grandchild of Francis Galton’s imagination.4

Quoting from Internet sales literature, “Biometrics is a


technology that analyzes human characteristics for security
purposes. The voice, iris, hand, and face can be used in
addition to fingerprints, but among these fingerprints are the
most cost-effective.” 5

Think movies. Think Disney. Think Spielberg and Lucas.


Think prime time, reality TV. Most all forensic evidence
presented by the entertainment industry in whodunits,
murder mysteries, science fiction, and gumshoe adventures is
based on the New Management Equation. Serious biometric
predictions—be they concerned with acute lymphoblastic
leukemia, therapeutic vaccine studies, or criminal
investigations—all use the Pythagorean Theorem.6 This
evidence adds up.

Profits are too important to be left to the Chance coincidence


that a paranormal guess will sometimes be right. Forecasts
conjured without the cornerstone of evidence and the New
Management Equation, belong in a dust bin with auras,
divining rods, Tarot cards, palmistry, past lives, soothsaying,
Rune readings, and good old-fashioned wishful thinking.

This chapter is weighty. Given the stakes of international


commerce, the weight of evidence we present in this chapter
is appropriate. If the reading gets a bit heavy for you, peek at
the illustrations. Look for the right triangles. Those pictures
are our wink at you. You know the secret handshake and
inside joke.

A vector analysis applied to a data matrix is a correct analysis.


Remember, c² = a² + b². You will never have to do any of these
calculations. Ever. Data matrix software takes your data and
lays it all out for you.

Three Wishes

Cost-accounting variance analysis has been around almost as


long as Aladdin’s Lamp. Surely, it must have some merit. If we
rub it, and rub it, and rub it again with our erasers, the Genie
will appear. Will we not be granted our three wishes?

Unfortunately, no. We will not. G. Charter Harrison’s


standardized cost model was a step forward in 1918. In
2003 it is too simplistic to satisfy international standards for
quantitative analysis. Even worse, it is the mother of white-
collar waste and rework.

For example, consider the traditional break-even analysis


pictured in Figure 1. Here are the three wishes:


Figure 1 The traditional break-even analysis is a good example of wishful thinking in the white-collar work place. [Dollars are plotted against product or service volume, with straight lines for averaged expenses and averaged income crossing at the break-even point.]

1. I wish my revenue were exactly a straight-line


function of volume.

2. I wish my expenses were exactly a straight-line


function of volume.

3. I wish the relationship between these lines never


changed.

Granting these three wishes would be equivalent to


suspending the physical laws of our universe. Even a wide-
screen Disney genie would decline this opportunity.

Noise, Chance or random variation, attends every


measurement. The mythological Greek Sisyphus had a
better chance of rolling his rock to the top of his hill than a
manager has of making his monthly results fall exactly on a
hypothetical straight line.

Table 1 compares and contrasts wishful thinking with reality.


As numbers in the first column increase, numbers in the
second column increase by an exactly proportional amount.
This perfect linear relationship produces perfect predictions.
These are plotted in Figure 2. They get rave reviews in
management meetings. This sort of line is a sure sign that
shenanigans, rather than standards of evidence, are in use.


Table 1 Wishful thinking versus


reality. The ‘wish profits’ are
hypothetical straight-line predictions.
The ‘real profits’ are actual results.

Figure 3 shows the actual performance numbers on which


the linear relationship was based. There is, you guessed it,
a huge amount of variation. A single-number prediction is
useless without a statement of prediction error based on the
degree of variation in the process being predicted. We need to
see the profit signal and noise vectors.


Figure 2 Wishful thinking results


falling exactly on the straight-line
prediction earn rave reviews in
management meetings.

Figure 3 The straight-line predictions


were based on real profit data
containing a huge amount of variation.
A single-number prediction is useless
without a statement of prediction error
based on the degree of variation in the
process being predicted.

A persistent leadership “homily” suggests that idealized


targets “inspire” superior performance. We have observed
the opposite. Even the best of intentions cannot redeem a
patently false premise.

Arbitrary goals are products of wishful thinking. They are


exercises in futility. They demoralize and debilitate the people
assigned to achieve them. They waste time and money that
might otherwise find its way to the bottom line.


Prediction Practice

“Say Avona,” said Mary, “I have an idea. You would probably


call it a hypothesis. First I noticed there is quite a bit of
variation in our flight times. Then I noticed there is quite a
bit of variation around our target blade length. We already
know blade length is a key control variable. Tom told me
it would be difficult and expensive to put tighter controls
around our tolerance specifications. Finally, I noticed there is
also quite a bit of variation in our drop height.

“My hypothesis is that drop height could be used as another


control variable. This would be relatively easy to do. I have a
hunch we might even be able to predict the flight time from
the drop height.”

“You know physics, so your hunch has my attention,” said


Avona. Then, with a knowing smile, “But why would you
want to predict flight time from drop height?”

“You know perfectly well why! If we can accurately predict


performance we can anticipate the future. It would be like
knowing what the stock market is going to do tomorrow. We
could use that knowledge to make more profits. What is most
profitable is best.”

“Yes, but can you be a little more specific?” asked Avona.

“Well, if we could predict flight time from drop height, we


could compensate for a variation in blade length by making
an opposite, off-setting change in the drop height. This
way, we could hit our advertised flight times with much less
variation.”

“That’s a great idea,” said Avona. “Make it so.”

“Wait a minute,” said Mary. “This isn’t the same as the other
things you showed us. We aren’t trying to find the best way
of doing something. We’re trying to determine a relationship.
Can you give me a preview of how we’re going to do this?”

“With pleasure,” said Avona. “Let’s say your data looks


like this (Table 2). In problems that involve a relationship
between two variables, we have to know which is the

Table 2 The data matrix array for


three flight times paired with three
different drop heights.

independent variable and which is the dependent variable.


The dependent variable is the one we want to predict. The
letter Y is used to symbolize the dependent variable, which for us
is the flight time.

“The independent variable is the one we will use to make the


prediction. The letter X is used to symbolize the independent
variable, which for us is the drop height.”

“OK, but what’s this ‘coded drop height’ about?” asked Mary.

“The values -1, 0 and 1 are codes for low, medium and high
drop heights. At the minus setting I was sitting in my chair.
At zero I was standing. At 1 I got up on my desktop.

“I know it would look better if we put in the actual drop


heights, but it’s easier to explain if we use the codes. It’s also
easier to draw the picture and set up the spreadsheet template.
Of course, we don’t have to bother with the coding when we
use our statistical software. It takes care of everything.

“Anyway, here is the vector analysis from my spreadsheet


template for this practice problem (Table 3). We’re modeling
flight time as a linear function of drop height. If we had more
data, we might try something more elaborate.

“The only difference between this vector analysis and the


ones we did before is the way the profit signal vector gets
calculated. The best-fitting straight line is shown in Figure
4. Look closely and you can roughly see that the slope of the
predicted line in this case is 1.25. For our actual flight times
8.5, 9.5 and 11.0 the straight line predictions are 8.42, 9.67
and 10.92. Notice that 8.42 plus 1.25 equals 9.67. Also
notice that 9.67 plus 1.25 equals 10.92.


Table 3 Vector analysis for


fitting Y (flight time) as a linear
function of X (drop height).

“This means the forecast goes up 1.25 Y units for every coded
X unit. We get the profit signal vector by multiplying this
slope times the coded X data vector.
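
A minimal sketch in Python with numpy reproduces Avona’s numbers for the practice data, the three flight times 8.5, 9.5 and 11.0 at coded drop heights -1, 0 and 1:

import numpy as np

x = np.array([-1.0, 0.0, 1.0])          # coded drop heights
y = np.array([8.5, 9.5, 11.0])          # flight times in seconds

slope = np.sum(x * y) / np.sum(x * x)   # least-squares slope for centered coded x
average = np.full_like(y, y.mean())     # Y data average vector
signal = slope * x                      # profit signal vector
fitted = average + signal               # predicted Y vector
noise = y - fitted                      # noise vector

print(round(slope, 2))                  # 1.25
print(np.round(fitted, 2))              # [ 8.42  9.67 10.92]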

“Can you tell me whether or not the slope is statistically


significant?”

Figure 4 Best-fitting straight line for Y


as a function of coded X.

Mary thought for a moment, then said, “It doesn’t achieve the
‘clear and convincing’ standard of evidence, because the p-
value is greater than 0.05. It does achieve the ‘preponderance
of evidence standard’ because the p-value is smaller than
0.15.”

“Exactly,” Avona exclaimed. “Well done. Here is the drawing for this vector analysis (Figure 5).”

Figure 5 Picture of the vector analysis for fitting Y as a linear function of X. [The drawing shows the raw data vector (8.5, 9.5, 11.0), the coded X data vector, and the regression line in the shaded plane.]

“Notice that the profit signal vector is parallel to the coded


X data vector at the lower left. This is always true because the
profit signal vector is always equal to the coded X data vector
multiplied by the slope of the best-fitting line, 1.25 in this
case.

“If the slope is larger, the relationship between X and Y is


stronger, and the profit signal vector is longer. If the slope is
smaller, the relationship between X and Y is weaker, and the
profit signal vector is shorter.”

Now Mary had a question. “OK, but how do I use all this to
make a prediction?”

Avona responded, “Graphically, what we do is start at a coded
X value in Figure 4, go up to the fitted line, then over to the
raw Y data axis. The number on that axis is the prediction.
For example, if we had an X value of 0.5, the predicted Y
value would be somewhere between 10.0 and 10.5.

“We can be more exact if we are willing to deal with the


actual equation of the line:

Predicted flight time = 9.67 + 1.25 × (Coded drop height)

“9.67 is the average flight time in our practice data set, and
1.25 is the slope. If the coded drop height is 0.5, then we get:

Predicted flight time = 9.67 + 1.25 × (0.5)

= 9.67 + 0.63

= 10.30

“Applying this equation to the coded X values -1, 0 and 1 is


the same as adding together the Y data average vector and the
profit signal vector. The result of this is called the predicted Y
vector (Table 4).

“If you plotted the predicted Y values versus the coded X


values, they would fall exactly on the straight line in Figure
4.”

“Is the predicted Y vector visible in Figure 5?” asked Mary.

“Yes,” answered Avona. “It’s the vector in the shaded plane


labeled “Regression Line”. It goes from the point (0, 0, 0)
up to the point where the noise and profit signal vectors
intersect.”

Mary had one final question. “If we make a prediction, don’t


we have to state the prediction error based on the variation in
our data?”

“Right again,” said Avona. “But let’s save that for when you
get your real data. The statistical software will automatically
show you the prediction error.”


Table 4 Predicted Y vector from fitting


Y as a linear function of X.

Predicting Real Flight Times

“OK, Avona, I have my data now,” announced Mary as she


barged into Avona’s office.

“I see you bought your own copy of the statistical software,


too,” observed Avona. “Good job.”

“Yeah. I get four times as much work done in a quarter of the


time it would take with a spreadsheet. I’m giving Corrugated
Copters much better information and spending more time
with my family.

“Anyway, about my study. We did five tests at each of three


drop heights. I entered my data into the spreadsheet template
you gave me (Table 5). I used coded values -1, 0 and 1 for
the low, medium and high drop heights. The p-value is 0.000,
so there is a real relationship here, beyond a reasonable doubt.

“The overall standard deviation of the flight times is 0.288


seconds. The noise standard deviation is 0.056 seconds.
This is astonishing. We can eliminate 81% of our flight time
variation by controlling the drop height.”

“How in the world did you come up with 81 percent?”

“That was hard. I took a lucky guess. I used the last line in
Table 5 to puzzle it out. See the standard deviation of the
variation vector is 0.288. The standard deviation of the noise
vector is 0.056. Well, 0.056 is 19.4 percent of 0.288. It all
added up so I figured that is why you put the last line in your
spreadsheet template.
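
Mary’s 81 percent is nothing more than the ratio of the two standard deviations she quoted, as a quick Python check shows:

overall_sd = 0.288   # standard deviation of the variation vector (seconds)
noise_sd = 0.056     # standard deviation of the noise vector (seconds)

# Fraction of flight-time variation remaining after controlling drop height
remaining = noise_sd / overall_sd
print(round(100 * (1 - remaining)))   # about 81 percent eliminated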

“Impressive,” admitted Avona. “Great job, Mary.”

“Thank you. Now check me on this next thing. I think the


predictive equation is this:

Predicted flight time = 1.60 + 0.34 × (Coded drop height).

Is that right?”

Table 5 Vector analysis for fitting flight time (Y) as a linear function of coded drop height (X). The
raw data are Mary’s actual flight times minus the objective of 9 seconds.


“Yup,” said Avona. “You’re batting 1000 today, Mary. 1.60 is


the overall average of your Y data, and 0.34 is the slope of the
best-fitting line using coded X data. Your equation is right
on the money. And, given the reduction in variation you’ve
demonstrated, I mean ‘money’ in the literal sense.

“Yesterday, I said I would show you how to use the statistical


software to get a picture with predictions and prediction
limits. Let’s do that now.”

Avona clicked her mouse two or three times to produce the


graph in Figure 6. “This shows the data, the best-fitting
line, and the 95% prediction limits. When these limits are
narrower, your predictions are more accurate. When they are
wider, your predictions are less accurate.”

“Is there an easy way to calculate the limits?” asked Mary.

“Yes,” answered Avona. “The upper limit is approximately


two noise standard deviations above the predicted value, and
the lower limit is approximately two noise standard deviations
below the predicted value. Your noise standard deviation
is 0.056 seconds. Two times this is 0.112. So, with 95%
confidence, predictions from your equation will be accurate
to roughly plus or minus a tenth of a second. Not bad.”
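
A sketch of that calculation in Python, using the fitted equation and noise standard deviation quoted above (remember the predictions are flight time minus the 9-second objective):

def predict_with_limits(coded_drop_height, noise_sd=0.056):
    """Predicted flight time (minus the 9-second objective) with approximate
    95% limits of plus or minus two noise standard deviations."""
    predicted = 1.60 + 0.34 * coded_drop_height
    return predicted - 2 * noise_sd, predicted, predicted + 2 * noise_sd

print(predict_with_limits(0.0))   # roughly (1.49, 1.60, 1.71)
print(predict_with_limits(1.0))   # roughly (1.83, 1.94, 2.05)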

Figure 6 Picture of the best-fitting


straight line for predicting flight time
(Y) as a linear function of coded drop
height (X). The 95% prediction limits
are approximately two noise standard
deviations above and below the line.


Closing Arguments

Peter L. Bernstein, noted economist, economic advisor to


nations and multinational companies, and author of the
Business Week, New York Times, and USA Today best seller,
Against the Gods, The Remarkable Story of Risk, comments on
the value of prediction and regression.7

“Forecasting—long denigrated as a waste of time at best and


at worst a sin—became an absolute necessity in the course of
the seventeenth century for adventuresome entrepreneurs who
were willing to take the risk of shaping the future to their
own designs.

“Commonplace as it seems today, the development of business


forecasting in the late seventeenth century was a major
innovation.

“If nature sometimes fails to regress to the mean,


human activities, like sweet peas, will surely experience
discontinuities, and no risk management system will work
very well. Galton recognized that possibility and warned,
‘An Average is but a solitary fact, whereas, if a single other
fact be added to it, an entire Normal Scheme, which nearly
corresponds to the observed one, starts potentially into
existence.’” 8

Endnotes

1. Gould, Stephen Jay. The Mismeasure of Man. New York, W.W. Norton & Company, 1996. Page 272.

2. Galton’s complete works are available from a variety of library and Internet sources. This quote comes from http://www.mugu.com/galton/essays/1890-1899/galton-1890-nareview-kinship-and-correlation.html The origin of this web page is the comprehensive http://www.mugu.com/galton/start.html

3. http://www.mugu.com/galton/

4. http://www.fme.fujitsu.com/products/biometric/pdf/Find_FPS.pdf

5. http://www.fme.fujitsu.com/products/biometric/pdf/Find_FPS.pdf

6. http://www.wvu.edu/~bknc/BiometricResearchAgenda.pdf

7. Bernstein, Peter L. Against the Gods, The Remarkable Story of Risk. New York, John Wiley & Sons, 1996. Page 95.

8. Bernstein, Peter L. Against the Gods, The Remarkable Story of Risk. New York, John Wiley & Sons, 1996. Page 182.



Chapter 7

Sustaining
Results

“Stewardship entails honorable conduct in the
management of other people’s property. But, it is
broader than that. It also includes respect for the
people’s moral responsibilities. These responsibilities require a
constant demonstration of trustworthiness in economic and
personal conduct.”1 observed William G. Scott and David K.
Hart in Organizational Values in America.

The privileges of executive corporate leadership are paired


with responsibilities. Every corporate director and senior
level executive lives in the world of uncertainty. More than
profit is at stake when a manager begins the workday. Jobs
and the welfare of one’s community are on the line with every
significant decision.

Six Sigma products and services are powered by evidence-


based decisions. Evidence-based decisions reduce uncertainty.
They quantify financial and human risk in ways that can
be validated and replicated. They produce near perfect, Six
Sigma performance. Near perfect, Six Sigma performance
engenders trust. Customers confidently bet their lives, and the
lives of their loved ones, on Six Sigma performance.

Poor quality management decisions can and do injure our


world. Three Mile Island and Chernobyl are monumental
governmental management blunders. The Copper-
7 Intrauterine Device (IUD) and Thalidomide are
quintessential medical management mistakes.2 The after
effects of the Exxon Valdez wreck in Alaska, the Union
Carbide plant explosion at Bhopal, India, Hooker Chemical’s
Love Canal disaster at Niagara Falls, the sinking of Brazil’s
PetroBras platform and its resulting million-gallon oil spill,
all bear witness to the wide ranging effects of poor quality
decisions.

Evidence-based decisions provide a safety net that can help


protect us from poor quality judgments. Seventeen years ago,
Nobel Laureate Richard Feynman used an “Avona” model to
demonstrate in no uncertain terms that NASA scientists did
not understand the concept of correlation.3,4 It is now widely
acknowledged that Challenger might still be flying if NASA
managers had applied vector analysis to a data matrix in
1986. These powerful tools were widely available at the time.
Unfortunately, in 2003 it is clear that NASA managers are
still basing important decisions on spreadsheet calculations.

Six Sigma has shrunk, and is shrinking, our globe. As one


executive told us recently, “Our home office is Earth. The
staff meetings I go to look like the United Nations. We
emphasize diplomacy both inside and outside the company.
Our international relationships build the teamwork we need
to compete.”

This new level of thoughtfulness is a good thing. Managers


around the world now understand, in dollars and cents,
that how they treat a worker sewing a soccer ball in Pakistan
affects not only the outcome of the World Cup, but also the
political stability of the nations where they do business.

These are things to think about as we rejoin Corrugated


Copters. Tom, Dick, Mary and Avona have stabilized their
flight times. They know which way is best. Their challenge
now is to hold and gain market share. They need to sustain
their best practices and profits.

Evaluating Practices and Profits

“Avona,” said Mary, “Our customers are insisting that we


manufacture helicopters with virtually no variation in flight
time. They told me we have to have a Cpk of 1.5 or better, or
we are out as their main supplier. What on earth is a Cpk? Is it
an abbreviation for something?”

“No, it’s just another goofy Six Sigma symbol,” said Avona.

“What?”

“Just kidding. Seriously, Cpk is a numerical index that


quantifies how capable a process is of producing virtually
perfect quality output. A Cpk of 1.5 implies no more than 3
or 4 defective products or services per million delivered.”

“That level of quality is impossible,” said Dick, who just


wandered into the break room with a fresh steamed latte.

“Not at all,” chided Avona. “A Cpk of 1.5 is an accepted


international standard, at least for components and sub-
processes. In fact, companies that cannot or will not produce
that level of quality are getting edged out or even kicked out
of the marketplace.”

“You’re kidding.”

“No, I’m not. This is old news. Six Sigma is not just a passing
fad. It is an extremely disciplined way of competing for
market share. Let me show you how to calculate a Cpk value
from your data.”

“What? Cpk isn’t just the same old New Management


Equation?”

“Like most other things in Six Sigma, Cpk is based on the


New Management Equation. It is a function of the average
and the standard deviation. It puts an additional spin on these
by combining them with the Upper and Lower Specification
Limits.” Avona pulled out a piece of paper and wrote the
following expressions.


“Cpk is defined to be the smaller of two other numbers called


Cpl and Cpu. Cpl is the number of process standard deviations
between the process average and the Lower Specification
Limit, divided by 3. Get it? Process? Lower? Capability? Cpl ?
Don’t you just hate the way acronyms aren’t even arranged in
order? Cpu is the number of standard deviations between the
average and the Upper Specification Limit, divided by 3.”

“Why are they divided by 3?” asked Mary.

“It’s an arbitrary convention. Everyone uses it. We’re stuck


with it. Anyway, we want Cpk to be as large as possible. This
means we want both Cpl and Cpu to be as large as possible.”

“Maybe you could draw us a picture,” suggested Dick. “You


could even put a right triangle in it.”

Figure 1 A process with Cpk =


0.67. The average is 2 standard
deviations above the Lower
Specification Limit (LSL) and 4
standard deviations below the
Upper Specification Limit (USL).
The Cpl is 2/3 = 0.67 and the Cpu
is 4/3 = 1.33. Roughly 2.5% of
outcomes from this process will fall
below the Lower Specification Limit.

“Good idea, Dick. The right triangle actually gives us a


convenient place to put the standard deviation. Let’s say we
have a Lower Specification Limit (LSL) of 7.2 and an Upper
Specification Limit (USL) of 10.8. I’ll use a bell-shaped curve
to represent process variation. Here’s a process with an average
of 8.4 and a standard deviation of 0.6 (Figure 1). The
average is 2 standard deviations above LSL and 4 standard
deviations below USL. Therefore,

Cpl = 2/3 = 0.67


and

Cpu = 4/3 = 1.33

Cpk is the smaller of these two numbers, 0.67. This implies


that roughly 2.5% of outcomes from this process will fall
below LSL.”
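
The same arithmetic is easy to script. Here is a minimal sketch in Python; the tail percentage uses the normal (bell-shaped) curve from scipy, which is the usual assumption behind these rough defect-rate statements:

from scipy.stats import norm

def capability(mean, sd, lsl, usl):
    cpl = (mean - lsl) / (3 * sd)    # distance to the lower limit in 3-sigma units
    cpu = (usl - mean) / (3 * sd)    # distance to the upper limit in 3-sigma units
    cpk = min(cpl, cpu)
    # Expected fraction of outcomes falling outside the specification limits
    out = norm.cdf(lsl, mean, sd) + norm.sf(usl, mean, sd)
    return cpl, cpu, cpk, out

cpl, cpu, cpk, out = capability(mean=8.4, sd=0.6, lsl=7.2, usl=10.8)
print(round(cpl, 2), round(cpu, 2), round(cpk, 2), f"{out:.1%}")   # 0.67 1.33 0.67 2.3%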

“That’s not good,” observed Mary.

Avona drew another picture. “Here’s a process with an average


of 9.0 and a standard deviation of 0.6 (Figure 2). The
average is 3 standard deviations above LSL and 3 standard
deviations below USL. Therefore,

Cpl = 3/3 = 1.00

and

Cpu = 3/3 = 1.00

Cpk is the smaller of these two numbers, but in this case


since the process is perfectly centered, these numbers are the
same. So Cpk equals 1.00. This implies that roughly 0.3%

Figure 2 A process with Cpk = 1.00.


The average is 3 standard deviations
above LSL and 3 standard deviations
below USL. The Cpl is 3/3 = 1.00 and
the Cpu is 3/3 = 1.00. Roughly 0.3%
of units produced from this process
will fall below LSL or above USL.

of outcomes from this process will fall below LSL or above
USL.”

“That’s better,” observed Mary.

“But still not good enough for Six Sigma,” said Avona. She
drew a third picture. “Here’s a process with Cpk = 1.33. The
average is 9.6 and the standard deviation is 0.3 (Figure
3). The average is 8 standard deviations above LSL and 4
standard deviations below USL. Therefore,

Cpl = 8/3 = 2.67

and

Cpu = 4/3 = 1.33

This implies that roughly 32 outcomes per million will fall


above USL.”

Figure 3 A process with Cpk = 1.33.


The average is 8 standard deviations
above LSL and 4 standard deviations
below USL. The Cpl is 8/3 = 2.67
and the Cpu is 4/3 = 1.33. Roughly
32 outcomes per million will fall above
USL.

Avona drew a fourth picture. “Here’s a process with Cpk =


1.67. The average is 9.3 and the standard deviation is 0.3
(Figure 4). The average is 7 standard deviations above LSL
and 5 standard deviations below USL. Therefore,

Cpl = 7/3 = 2.33 and

Cpu = 5/3 = 1.67

This implies that roughly 287 outcomes per billion will fall
above USL.

“OK, now for your quiz.” Mary and Dick groaned. “What
would Cpk be if we moved the average to 9, right in the center
of the specification range?”

Figure 4 A process with Cpk = 1.67.


The average is 7 standard deviations
above LSL and 5 standard deviations
below USL. The Cpl is 7/3 = 2.33 and
the Cpu is 5/3 = 1.67. Roughly 287
outcomes per billion will fall above
USL.

Mary answered first. “Then the average would be 6 standard


deviations above LSL and six standard deviations below USL.
Cpl and Cpu would both be equal to 6/3, which is 2, so Cpk
would be 2.”

“Right on,” said Avona. “And FYI, a process with that level
of capability would produce no more than 2 or so defective
outcomes per billion.”

After a moment of silence, Dick said, “Now that’s something


to aspire to.”

Process Improvement Simulation

“Just for grins, let’s roll some dice,” suggested Avona. “I will
record them as you roll. Viva Las Vegas!”

After a few rolls, Mary said, “What a surprise! I’m getting


numbers between 2 and 12. Hmm, it looks like more of them
are in the middle of the range than at the extremes.”

She entered the numbers into her calculator. “The average is
about 7 and the standard deviation is a little over 2.

“Something must be wrong with my dice,” observed Dick.


“All I get are elevens. What’s up with that? Oh, wait a minute.
One of my dice has a five on every side and the other one has a
six on every side. Avona, where did you get these?”

“Wizards of the Coast,” answered Avona. “They have lots of


games based on probabilities and three dimensional reasoning.
By fixing the dice so you always get the same outcome, you
are playing with a Six Sigma process. Perfect elevens every
time. The only way you can get a different answer is to write
down the wrong number.”

“Let’s work out how many ways there are to get each possible
outcome with two regular dice,” said Dick (Figure 5).

Figure 5 Seven is the most probable


outcome when rolling two dice. Six
different combinations will produce a
seven, while only one combination will
produce either a 2 or a 12.
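
Dick’s tally is easy to reproduce with a few lines of Python:

from collections import Counter
from itertools import product

# Count how many of the 36 equally likely rolls give each total
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for total in sorted(counts):
    print(total, counts[total])   # 7 has six combinations; 2 and 12 have one each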

Tom chimed in, “I know a better way to present the outcomes


(Figure 6).”

“Don’t you always?”

“See, this way it looks sort of like a bell-shaped curve. I bet


that makes Avona happy.”

“It does,” said Avona. “In fact, it’s a nice segue into what I
wanted to show you. We can use the throwing of two or more


Figure 6 The frequency distribution of


outcomes when throwing two dice.

dice to simulate the evolution of process capability through


Six Sigma breakthrough projects. Each die will represent
a cause of defects. The sum of the dice will represent the
number of defects per helicopter.

“The average flight time for our current design is 11 seconds.


That means our average profit margin is $2 million. Let’s say
these defects cost an average of $100,000 each to repair. If a
helicopter has 20 defects, we break even. If it has more than
20 defects, we lose money. Therefore, our Upper Specification
Limit is 20 defects.”

“20 sounds like an awful lot of defects,” said Dick. “Can’t we


make it a lower number?”

“Work with me, Dick. It’s just a simulation. OK, let’s get
started. Our initial process involves four dice.”

For each simulation, one of them threw the four dice, one
of them called out the result, and one of them entered the
result into their data matrix statistical program. They traded
jobs once in a while. After completing 1000 simulations, they
decided they had had enough. Avona showed them how to do
a process capability analysis with two mouse clicks. (Figure
7).
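
A rough sketch of the same simulation in Python with numpy; because the rolls are random, the exact averages and capability numbers will wander a little from the team’s run:

import numpy as np

rng = np.random.default_rng()
USL = 20                                   # more than 20 defects means we lose money

# 1000 simulated helicopters; each of the 4 dice represents one cause of defects
defects = rng.integers(1, 7, size=(1000, 4)).sum(axis=1)

mean, sd = defects.mean(), defects.std(ddof=1)
cpu = (USL - mean) / (3 * sd)              # only an upper limit, so Cpu is the Cpk
print(round(mean, 1), round(sd, 1), round(cpu, 2))
print((defects > USL).mean())              # observed fraction of money-losing helicopters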

Figure 7 Process capability analysis


of the initial “four dice” process.


“We get only Cpu in this analysis because there is only an


upper specification limit. In situations like this, Cpu and
Cpk are the same thing. Using the bell-shaped curve with
‘Mean’ marked on it, my calibrated eyeball tells me the
average number of defects per unit is about 14. The standard
deviation is 3.4. Cpk is 0.56. As you can see, the number of
defects on 4.7% of helicopters will be ‘Above USL’. In other
words, 4.7% of them will have more than 20 defects, and we
will lose money.”

“We might also lose market share,” said Mary. “There’s no


guarantee we can catch all the defects before a helicopter
goes to a customer. It would be better to keep them from
happening in the first place.”

“You got it, girl,” said Avona. “Let’s assume now that we
have data-mined our process data base, using vector analysis
of course, prioritized the causes of defects, and successfully
eliminated one of the top causes. Our improved process
involves only three dice.”

After completing another 1000 simulations, Avona showed


them the capability analysis of the improved process (Figure
8).

Figure 8 Process capability analysis


of the new, improved “three dice”
process.

“Average defects per unit, using the calibrated eyeball method,


the mean marked on the small distribution curve and the
0-25 scale just below it in Figure 8, has gone from 14
to about 10. This is an average savings of $400,000 per
helicopter. The standard deviation is about 3. Cpk is 1.05. The
number of defects on 831 parts per million (PPM) will be

‘Above USL’. In other words, we’ll make money on all but one
helicopter in a thousand.”

“This does look a lot better than the old process,” said Tom.
“But what if the average drifts up over time?”

“That’s a very good question,” said Avona. “That’s exactly


why we can’t afford to be satisfied with a Cpk that is barely
over 1. We need to eliminate other causes of defects so that
upward drifts don’t result in yield loss. Let’s say we have done
that, and our process now involves only two dice.”

After they completed another 1000 simulations, Avona


showed them the capability analysis of the new process
(Figure 9).

Figure 9 Analysis of the ultra-capable


“two dice” process.

“Average defects per unit, using the calibrated eyeball method,


has gone from 10 to about 7. This is an additional savings of
$300,000 per helicopter. The standard deviation is about 2.5.
Cpk is 1.75. The number of defects on 78 parts per billion
(0.078 PPM) will be ‘Above USL’. In other words, we’ll make
money on all but one helicopter in 10 million.”

“That’s not all,” added Tom. “With an upper three-sigma


limit of about 14 as a control limit, we can detect upward
drifts before they cause yield loss.”

“Good point,” noted Avona. She did another simulation, a


small one this time. She produced a control chart showing

what would happen if the causes of defects in the “three dice”
process came back (Figure 10).

Figure 10 The first 40 units come


from the “two dice” process, the
last 10 come from the “three dice”
process. The Upper and Lower
Control Limits (UCL and LCL) are the
three-sigma limits for the “two dice”
process.

“A control chart uses the average and the three-sigma limits


to monitor a process over time,” explained Avona. “In my
simulation, the first 40 units came from our “two dice”
process. The last 10 units came from the “three dice” process.
Can you see what happened on the chart?”

“Yes,” said Dick. “The dots got a lot bigger.”

“Thank you, Dick. Actually, I did that myself to distinguish


the three-dice units from the two-dice ones. What about the
data in relation to the average and upper three-sigma limit?”

“Well,” said Mary, “the two-dice units were evenly distributed


around the average, and none of them were above the upper
three sigma limit. The three-dice units are all above the
average, and one of them is above the upper three-sigma
limit.”

“That’s right,” said Avona. “Those are the two most


important rules for interpreting a control chart. These are
signals that something has changed. So what you said before
was exactly right, Tom. We can use the average and the three-
sigma limits of the two-dice process to catch an upward drift
before it causes any yield loss.”

“Also, these rules give us operational definitions of when to


initiate troubleshooting,” said Tom. “Otherwise, everyone
would have a different interpretation of the same data. Hmm,
this reminds me of the ‘standards of evidence’ you’re always
talking about, Avona. Are control charts related in some way
to that?”

“Bingo!” exclaimed Avona. “Allow me to explain.”

“OK,” said Mary, “but can we take a break first?”

Monitoring Practices and Profits

After the break, Avona showed the team a table of financial


results (Table 1).

Table 1 Quarterly financial report


(thousands of dollars).

“In many companies, the Executive Committee agonizes


over numbers like these every quarter. They make bar charts
(Figure 11). They try to figure out what went wrong in
Quarter 5, and who to blame. They try to take credit for
Quarter 13, and come up with imaginative explanations.”


Figure 11 Quarterly financial results


(thousands of dollars).

“Don’t we do the same thing?” asked Mary.

“We used to,” answered Avona, “but not any more. Not since
Rotcev took over. He immediately insisted that we apply
standards of evidence everywhere, not just in manufacturing.”

“Rotcev is almost as enthusiastic about this stuff as you are,


Avona,” commented Dick. “But I didn’t realize you could use
it on financial data. I thought the accountants had their own
special ways of doing things.”

“They did,” said Avona. “That was the problem. They could
make the numbers say pretty much whatever our previous
CEO wanted them to say.”

“To do this right,” continued Avona, “we need to put the


numbers into a data matrix (Table 2). Then we can apply a
vector analysis.

“Looking only at totals is a big problem with traditional cost


accounting variance analysis. For example, if we look only at
quarterly totals, we lose all the information in the monthly
numbers.

“If we had weekly data, we would start with that. Looking


only at monthly or quarterly totals would lose all the
information in the week-to-week variations.

“With vector analysis, we use all the information in whatever


data we have. The analysis in Table 2 is exactly the same as if
we were comparing 15 ways of doing something. The profit


Table 2 Data matrix and vector


analysis for quarterly review of
monthly financial data.

signal vector contains all the information about differences
among the 15 quarters. The noise vector contains all the
information in the month-to-month variations.

“Here is the cornerstone of evidence for this vector analysis


(Figure 12). It gives the lengths of all the vectors.

Figure 12 The cornerstone of evidence for the vector analysis in Table 2. The numbers in parentheses are the lengths of the vectors. The three long vectors are not drawn to scale. [The drawing shows the RAW DATA vector (3356), the DATA AVERAGE vector, the VARIATION vector (185), the PROFIT SIGNAL vector (90) and the NOISE vector (162).]

“Who can tell me if there are any significant differences


among the 15 quarters?”

“There aren’t any,” Mary quickly replied. “The p-value is


0.792. It doesn’t meet any standard of evidence. The apparent
quarter-to-quarter changes are just noise, not signals.”

“That’s right,” said Avona. “Now, the F ratio basically


compares the degree of variability in the profit signal vector
to the degree of variability in the noise vector. We reached our
conclusion because the F ratio wasn’t large enough to achieve
a standard of evidence. In other words, the profit signal
variation wasn’t large enough compared to the noise variation.
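
The same F ratio and p-value can be checked with a one-way analysis of variance, treating each quarter as a group of three monthly numbers. The sketch below uses Python with scipy; the revenue figures are random placeholders, since Table 1 itself is not reproduced here:

import numpy as np
from scipy.stats import f_oneway

# Placeholder monthly revenues, 15 quarters of 3 months each
rng = np.random.default_rng(0)
quarters = [rng.normal(960, 45, size=3) for _ in range(15)]

F, p = f_oneway(*quarters)
print(round(F, 2), round(p, 3))   # a p-value above 0.05 means the quarter-to-quarter
                                  # changes do not meet any standard of evidence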

“We can confirm this visually by plotting the profit signal and
noise vectors together on a single graph (Figure 13).

“The solid line is the profit signal vector and the dotted line
is the noise vector. They are plotted in time sequence. The
overall degrees of variability are about the same.


Figure 13 The numbers in the profit


signal vector are plotted as the solid
line. Each of these is an average of
3 numbers in the variation vector.
To make the visual comparison
statistically valid, the numbers in the
noise vector were first divided by the
square root of 3. Don’t blame us, it’s
a Law of the Universe. The adjusted
noise numbers are plotted as the
dotted line.

“Creating a graphical comparison like this is a little trickier


than it looks. To make the comparison statistically valid, I had
to divide the numbers in the noise vector by the square root
of 3. This is because each number in the profit signal vector is
an average of 3 numbers in the variation vector. I know this is
confusing, but it’s a Law of the Universe.
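
A sketch of the adjustment in Python with numpy: the limits for subgroup averages use the noise standard deviation divided by the square root of the subgroup size (three months per quarter here). This is a minimal X-bar calculation, not a full-featured control chart routine.

import numpy as np

def xbar_limits(monthly, subgroup_size=3):
    """Center line and three-sigma limits for subgroup averages."""
    data = np.asarray(monthly, dtype=float).reshape(-1, subgroup_size)
    grand_mean = data.mean()
    # Pooled within-subgroup (noise) standard deviation
    noise_sd = np.sqrt(np.mean(data.var(axis=1, ddof=1)))
    sigma_of_means = noise_sd / np.sqrt(subgroup_size)
    return grand_mean - 3 * sigma_of_means, grand_mean, grand_mean + 3 * sigma_of_means

# Example with placeholder monthly numbers (five quarters)
print(xbar_limits([950, 980, 965, 1010, 955, 990, 940, 1000, 970, 985, 960, 975, 995, 945, 1005]))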

“Anyway, in 1924 a man named Walter Shewhart was trying


to come up with a good graphical method for analyzing data
over time when there is a natural or logical way of grouping
the data. For example, we grouped our raw monthly data by
calendar quarters. That made sense because the Executive
Committee reviews it on a quarterly basis. Shewhart called
these rational sub-groups.5

“Instead of plotting the profit signal vector and adjusted


noise vector on top of each other, Shewhart decided it would
be better to plot just the profit signal numbers, and use
horizontal lines to represent the upper and lower three-sigma
limits of the adjusted noise numbers. Also, he decided to add
the data average vector to the profit signal and noise vectors.
He felt this would be easier to interpret.

“In other words, he invented what we now call the X-bar control chart (Figure 14).

“This control chart tells us the same thing as the F ratio: the
quarter-to-quarter changes are just noise.”

“I have a question,” said Dick. “Do we have to do the vector


analysis all over again every quarter?”


Figure 14 The dots are the averages


of the monthly revenues in each
quarter, not the totals. The centerline
is the grand average of all the
monthly numbers. The Upper Control
Limit (UCL) is 3 noise standard
deviations above the average.
The Lower Control Limit (LCL) is 3
noise standard deviations below the
average.

“That’s a good question,” answered Avona. “Fortunately, the


answer is no. Once we have a good baseline, like we have in
this example, we hold the control limits constant and just plot
the new numbers as time goes by.”

“I guess the fundamental things you’ve taught us really do


apply,” said Dick.

“OK, here’s your quiz. What are two ‘events’ on this chart that
would indicate a real change of some kind?”

“A point outside the control limits,” said Tom.

“A bunch of points in a row above the center line,” said Mary.

“Right on both counts,” said Avona. “Remember, Mary, it


could also be a bunch of points below the center line. And, by
the way, the usual requirement is eight in a row for a statistical
signal.”6
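
Those two rules are simple enough to express as a small check, sketched here in Python (a fuller implementation would add the other standard run rules):

def control_chart_signals(points, center, ucl, lcl, run_length=8):
    """Flag points outside the control limits, and runs of `run_length`
    consecutive points on the same side of the center line."""
    signals = []
    for i, x in enumerate(points):
        if x > ucl or x < lcl:
            signals.append((i, "outside control limits"))
    side = [1 if x > center else -1 if x < center else 0 for x in points]
    for i in range(run_length - 1, len(points)):
        window = side[i - run_length + 1 : i + 1]
        if all(s == 1 for s in window) or all(s == -1 for s in window):
            signals.append((i, f"{run_length} in a row on one side of center"))
    return signals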

“This is great stuff,” said Mary. “But I’ve been wondering:


aren’t we still losing some of the information in the month-to-
month changes?”

“Excellent point,” said Avona. “Shewhart was aware of this


problem. His solution was to plot the standard deviations of
the subgroups on their own control chart (Figure 15). The
two charts together give us a complete picture of what’s going
on over time.”


Figure 15 The dots are the standard


deviations of the monthly revenues
in each quarter. The centerline is the
average of the standard deviations.
The Upper and Lower Control Limits
(UCL and LCL) are three-sigma limits
based on the standard deviation of
the standard deviations. Strange, but
true.

Taking Action

After the others left, Avona realized there was another basic
fact about control charts that she needed to teach them. It
wasn’t about how to set up the charts, or how to interpret
them. She felt that was pretty easy.

She knew from experience that control charts were all too
often used as “window dressing”. Maybe “wallpaper” is a
better analogy. In manufacturing at least, she knew that
control charts add real value only when they are used as a
basis for action.

She also knew that reacting to control chart signals was a


process, just like any other business activity. In order to add
value, the reaction process must be defined and documented.
It must be improved over time.

She had found the tools of Process Mapping to be ideal for


these tasks. In her experience, it worked best to have teams
of operators, supervisors, maintenance technicians, engineers
and managers develop the reaction plans together. She had a
reaction plan “skeleton” she always used to get them started
(Figure 16).

The question “Signal?” refers to one or more pre-defined


signals on one or more control charts. The charts and signals
are defined by the team that develops the plan. The term

Figure 16 A generic reaction plan


“skeleton” for a manufacturing or
service process.

“escalate” means to raise the level of the investigation by


bringing in someone with greater expertise. Ideally, the
manufacturing or service process is stopped until “Continue”
is reached. Figure 17 shows an actual example of a reaction
plan for a lot-based manufacturing process.

Figure 17 An example of a reaction


plan for a manufacturing process.

In this example, the team decided to confirm a control chart


signal by immediately taking a second sample from the
same lot. If the second sample does not show a signal, the

occurrence is documented and the lot moves on to the next
operation.

If the second sample does show a control chart signal, the


manufacturing process is put on hold while the Operator
goes through a pre-determined checklist. The checklists in a
reaction plan are determined by the team that develops the
plan. That is why it is so important that all vocations are
represented on the team: operator, supervisor, maintenance
technician, engineer, and managers.

If the operator solves the problem, the occurrence is


documented and the lot moves on to the next operation.
Otherwise, the supervisor is called in. It may be necessary to
bring in the engineer, or the maintenance technician, or even
the manager. The important point is that the manufacturing
process remains on hold until one of two things happens:

1. The problem is solved.

2. Someone of sufficiently high authority makes the


decision to resume manufacturing while the problem is
being worked on.

The keys to the success of reaction plans are:

(a) Orderly and consistent evidence-based response to


problems as they occur.

(b) Visibility of problems throughout the


organization, appropriate to their level of severity.

(c) Evidence-based decisions made at the appropriate


levels of responsibility throughout the organization.

A disciplined approach like this is a bitter pill at first.


Supervisors and managers object to the loss of production
time. After a few weeks or months, the same supervisors
and managers are singing the praises of their reaction plans.
Invariably, they have seen their unplanned downtime
plummet. Problems are being fixed right away, instead of
being ignored until they become catastrophes.

These short-term economic benefits are overshadowed by


long-term improvements in process capability. The old
“four dice” process gives way to the “three dice” process. We
think, “That’s great, but that’s it. It’s impossible to get any
better.” But we keep following our reaction plan, refining
it as we learn new things. One day we wake up and, to our
great astonishment, the impossible has happened. We find
ourselves with an ultra-capable “two dice” process. We find
our competitors using us as the benchmark.

You are asking yourself, is this really possible?

Was Pythagoras right about right triangles? Is the earth


spherical, and does it revolve around the sun? Does
multiplication work? Do gravity and electricity exist? Do
airplanes fly? Can you buy things on a computer and have
them delivered to your door? Are vectors and hyperspace real?

All the evidence we have says “yes”.

Closing Arguments

“The idea of control involves action for the purpose of


achieving a desired result.” Walter A. Shewhart. Statistical
Method from the Viewpoint of Quality Control, 1939.7

Endnotes

1. Scott, William G. and Hart, David K. Organizational Values in America. New Brunswick, Transaction Publishers, 1991. Page 139.

2. Committee on Quality of Healthcare in America. Kohn, Linda T., Corrigan, Janet M., Donaldson, Molla S., Editors. To Err is Human, Building a Safer Health System. Washington, D.C., National Academy Press, 2001.

3. http://www.student.math.uwaterloo.ca/~stat231/stat231_01_02/w02/section3/fi4.4.pdf and http://www.ralentz.com/old/space/feynman-report.html

4. http://www.uri.edu/artsci/ecn/mead/306a/Tuftegifs/Tufte3.html

5. Shewhart, Walter A. Economic Control of Quality of Manufactured Product. New York, D. Van Nostrand Company, Inc., 1931. Republished in 1980 by American Society for Quality Control.

6. AT&T Technologies. Statistical Quality Control Handbook, copyright 1956 by Western Electric. Copyright renewed by AT&T Technologies, Inc., 1984.

7. Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. New York, Dover Publications, Inc., 1986.



Chapter 8

The Three Rs

Education and training are the first steps in building an
organization founded on evidence-based decisions and
the New Management Equation. Knowledge and skill
necessarily change the nature of authority.1 Trust, decency and
respect replace fear and favor as social adhesives.2

American history provides an excellent road map for


redefining the 3 Rs—Reading, wRiting, and aRithmetic—to
Reading, wRiting, and vectoR analysis.

John Adams wrestled with evidence as we all do. “Facts


are stubborn things; and whatever may be our wishes, our
inclinations, or the dictates of our passions, they cannot alter
the state of facts and evidence.”3 Adams and his colleagues
were as passionate about intellectual liberty as they were about
freedom. “Liberty cannot be preserved without a general
knowledge among the people.”

Thomas Jefferson wrote not only to Adams, but to all of us


in his 1821 autobiography. “We thought that on this subject,
a systematical plan of general education should be proposed,
and I was requested to undertake it. I accordingly prepared
three bills for the Revisal, proposing three distinct grades of
education, teaching all classes.”4

Jefferson’s friend and ghostwriter got, characteristically, to


the point. “Education should be on the spot; and the best
method… I call for the education of one million and thirty
thousand children.” This was a bold proposal. Those of
us who enjoy the rare combined privilege of Untied States
© M. Daniel Sloan and Russell A. Boyles, All Rights Reserved, 2003
218 The Three Rs
citizenship and an American public education can thank
Thomas Paine.5 Paine’s outlandishly impractical investment
scheme turned out to be the bargain of the millennium.

We predict that in the new millennium, Paine’s good deal can


yield even better bottom line business results.

Six Sigma’s Hidden Factory

“I have been thinking about what you have taught us Avona,”


said Dick. “Obviously I had a hard time understanding those
spreadsheet tables. Once you showed me the Pareto chart that
rank-ordered the factors, my light finally went on.”

Roctev the CEO, Avona, Tom and Mary listened. They were
smiling.

“Thanks for not making fun of me when my lights were


out,” said Dick. “Numbers make me nervous. Then I get
embarrassed. Then I say dumb things I wish I could take
back.

“Anyway, I drew a flow diagram yesterday. That darned


picture kept me awake all night long. I was driving my wife
crazy tossing around. So I got up at 3 AM and came into
work.”

Roctev, Avona, Tom, and Mary looked at Dick’s map


(Figure 1).

“You are one brave guy,” complimented Roctev. “The last


time an employee told me I was full of baloney was when I
was a Vice President of Marketing.”

Everyone stared at Roctev.

“I can see exactly what you mean by my comfort zone. I


began getting uncomfortable in 1986. Before that, I had it
all. Big office. Big desk. Four phone lines. Two secretaries and
a million-dollar advertising budget. Heck, if I couldn’t make
myself look better than everyone else at the annual review, I
would have had a real problem.


Figure 1 The hidden factory of traditional Six Sigma. Projects are delayed and deferred, not because of cost accounting, finance or data analysis. Bold proposals can make people so uncomfortable that they would rather waste money than upset the status quo.

“One day I was working a night shift to ‘get close to my employees.’ I sat down next to a clerk. I believe she was 19. I do remember she was a sophomore at USC. I remember that because USC stood for the University of Southern Colorado, not the University of Southern California. She was none too pleased to have me pay her a social visit.
“So, after about 30 minutes of chit chat, and since the lobby
was vacant at one AM, she told me she had homework to do.
I tried to keep up the conversation by expressing an interest.
She shut her mathematics text and looked right at me.”

“You know, you guys in senior management don’t have a clue, do you?” she said.

“Ummm. What do you mean exactly?”

“Well I read that bull poop memo about your recent


management decision. You know the one with all the
numbers?”

“Yes. I do.”

“Well, anybody who has taken Statistics 101 at USC can tell
you don’t even know how to do an Analysis of Variance.”

“I had taken Statistics 101 as a foreign exchange student at


Baldwin-Wallace College in 1969. I never did learn what
Analysis of Variance meant, so I said, ‘Well actually I did take
that class but I got a C in it and that was a gift. Please show
me what you mean.’”

“Here, I will show you.”

“She pulled out a piece of graph paper, a calculator, a ruler,


and a pencil. She drew me a picture. Our layoff plan to save
money and our $11 million senior management data analysis
on the supposed need for a massive building program just got
an F. Her whole show took about five minutes. I thanked her
and excused myself.

“I figured if a 19-year-old could see through the faulty reasoning behind the decisions our CEO, CFO, COO and I were making, maybe the other 1,000 employees could too.
I did not sleep well. But, I did decide to confront my math
phobia. I learned how to draw control charts. My colleagues
chose to keep on bamboozling. Many of them still are. They
all got promoted.

“One thing led to another. Now I am a CEO. Instead of a 19-


year-old kid with spunk, I have a top-flight team. You.”

“Wow,” said Mary.

“Whew,” said Dick. “I thought you’d go seriously supersonic


if you thought I was saying you blame accounting and finance
instead of stepping up to the plate and just saying, ‘This
change scares the heck out of me.’”

Roctev looked at Dick, “So, you say we have this gigantic,


Six Sigma hidden factory of rework. Just for the sake of the
argument you thought we would have, let’s say your map is
true. My comfort zone is the problem that stops projects. As
CEO, I am the reason for Six Sigma project rework. What
should I do? What would you do?”

“I would start phasing out the Six Sigma bureaucracy.”

“What!” cried Tom and Mary who had just mounted and
framed their Black Belt certificates. Avona chuckled quietly
and looked at her shoelaces. Roctev nodded his head.

“Dick, you just earned your American Society for Quality


Six Sigma Black Belt certification. Do you mean to tell me
that you are proposing to give it all up for the good of the
company?” asked Roctev.

“Let me get this straight Dick,” said Tom with a stern look on
his face. “Are you saying that Black Belts aren’t needed?”

“No. No.” replied Dick. “You are an expert. You are a teacher.
We need experts and good teachers. But people respect you
and Mary because of what you know and do, not because of
your numbered certificates.

“All I am saying is, I think we need everybody. Everybody


has a brain. All the people we work with have imaginations.
Avona, her models, simulations, and the software have made
Six Sigma so simple, everybody can contribute.

“Six Sigma is simply the use of evidence-based decisions. That


idea is as old as Aristotle. All this Black Belt and Green Belt
stuff is overhead. It would be less costly, and more effective if
we called our Six Sigma program the Three Rs.”

“Huh?” said Mary.

“You know. We could have some fun with it. Reading,


wRiting, and vectoR analysis,” suggested Dick. “Maybe it
could be Reading, wRiting, and Refraction. Whatever. We
could just call Six Sigma ‘literacy’. Let’s hire people who
are literate, or who want to be literate. I even used a cube to
outline the three factor interaction.” (See Figure 2)

“Literacy?”

“Yes. Let’s call the way we work literacy and be done with it.”

“Let’s go get a latte, chai and biscotti,” said Avona. “I’m


buying.”

Figure 2 Literacy now refers to people who know how to read, write and vector-analyze numbers. (The figure shows a cube with its three axes labeled Reading, wRiting, and vectoR analysis.)

Our Proposal

A global workforce that is literate in the Three Rs of the New


Management Equation is an excellent value. It is far less costly
than the alternatives. But there is a cost.

“New arts destroy the old. See the investment of capital


in aqueducts, made useless by hydraulics; fortifications by
gunpowder; roads and canals, by railways; sails, by steam;
steam by electricity,” wrote Ralph Waldo Emerson.6

His observations ring true as we watch vacuum tubes made


almost useless by transistors, transistors by silicon chips;
palpation by Magnetic Resonance Imaging; auscultation
by ultrasound; poisonous purple foxglove seed remedies
by quality controlled digitalis; telegrams, by wireless
communication; typewriters by computer keyboards; the
cost accounting variance analysis by the Analysis of Variance;
spreadsheets by the data matrix software.

In 1992, while grappling with the hand calculations, ruler, pencil, and Xeroxed chart template required to produce a statistical process control chart, the then Chief Executive Officer of Northwest Hospital in Seattle criticized cost accounting by quoting Emerson. His daring Total Quality Management (TQM) observation remains intrepid today.7

“A foolish consistency is the hobgoblin of little minds, adored


by little statesmen and philosophers and divines.”

Though none of us were able to articulate a proposal for


improving the cost accounting variance analysis back then—
that being a vector analysis applied to a data matrix—we came
to discover that the rest of Emerson’s quote was prophetic.
Sometimes you just get lucky.

“With consistency a great soul has nothing to do. He may


well concern himself with a shadow on the wall. Speak
what you think in hard words and tomorrow speak what
to-morrow thinks in hard words again, though it contradict
every thing you said to-day— ‘Ah, so you are sure to be
misunderstood,’—Is it so bad then to be misunderstood?
Pythagoras was misunderstood, and Socrates, and Jesus, and
Luther, and Copernicus, and Galileo, and Newton and every
pure and wise spirit who ever took flesh. To be great is to be
misunderstood.”8

Each age, as Emerson pointed out, must write its own books.
The books of an older generation will not fit ours. Motorola’s
Six Sigma business initiative was designed at a time when
a dual 5.25-inch floppy disk drive IBM computer with an
amber screen was an executive luxury. Harvard Graphics bar
charts on a dot matrix printer were breakthrough technology.

When General Electric got a hold of Six Sigma, the Internet


and Windows 95 were new. 9600 Baud was a fast connection.
Time flies.

Can it be our own college students were babies in the eighties?


Great Caesar’s Ghost! We are old men. How can this be?
We have grey hair. There are bald spots on the back of our
heads. We are wearing progressive lens glasses. Our Les Paul,
Stratocaster and PRS guitars and Cry Baby Wah-Wah pedals
are antiques for sale on EBay. When did this happen? Do we
look as funky as a judo gi and as old as a Six Sigma acronym?

Yes. We do.

We must work, and work very hard, to stay young. We must


change with the times. We must learn, unlearn and relearn
how to do things. We must try to age gracefully.

We must negotiate and we, all of us, must get to Yes together.9

Endnotes

1. Wood, Gordon S. The Radicalism of the American Revolution. New York, Alfred A. Knopf. 1992. Page 189.

2. Wood, Gordon S. The Radicalism of the American Revolution. New York, Alfred A. Knopf. 1992. Page 189.

3. http://www.dropbears.com/b/broughsbooks/history/articles/john_adams_quotations.htm

4. Jefferson, Thomas. The Life and Selected Writings of Thomas Jefferson, edited by Adrienne Koch and William Peden. New York, Random House. 1944. Page 48.

5. Paine, Thomas. Collected Writings. New York, Library of America. Pages 630-633.

6. Emerson, Ralph Waldo. Circles, from The Portable Emerson, edited by Carl Bode in collaboration with Malcolm Cowley. New York, Penguin Books. Page 229.

7. Sloan, M. Daniel and Torpey, Jodi B. Success Stories in Lowering Health Care Costs by Improving Health Care Quality. Milwaukee, ASQ Quality Press. 1995. Pages 87-97.

8. Emerson, Ralph Waldo. Self-Reliance.

9. Fisher, Roger, and Ury, William. Getting to Yes, Negotiating Agreement Without Giving In. New York, Penguin Books. 1981.
Appendices

I. Glossary of Terms: Data Matrix, Vector Analysis and Evidence-based Decisions

ANOVA – Acronym for Analysis of Variance, Fisher’s general term for the various forms of vector analysis he developed.

Confidence level – Obtained by subtracting the p-value from the number 1, then multiplying by 100. It is a measure of the strength of evidence in the data against the null hypothesis. For example, a p-value of 0.03 corresponds to a confidence level of (1 - 0.03) × 100 = 97.

Cornerstone of Evidence – This is a generalized tetrahedron representing a vector analysis. Each of the four faces is a generalized right triangle. The six sides or edges represent the raw data vector and the five possible vector components of variation that can be broken out of any set of raw data.

Data matrix – An array of numbers or labels in rows and columns. Each row is an object, entity or event for which we have collected data. Each column is one of the variables we have measured or observed.

Data vector – A stack of numbers or labels treated as a single entity. A column in a data matrix is a vector. It is a point in n-dimensional space, where n is the number of rows in the data matrix.

DMAIC – This is an acronym for Define, Measure, Analyze, Improve and Control, which is the Six Sigma project cycle.

Factor – A controlled variable in a designed experiment.

F ratio – A measure of the strength of evidence in the data against the null hypothesis. A statistic proportional to the ratio of the squared length of the profit signal vector to the squared length of the noise vector.

New Management Equation – Our name for the Pythagorean Theorem.

Noise – The chance, normal, common, random, statistical variation found everywhere in Nature. It is a Law of the Universe.

P-value – The probability of getting, by chance alone, an F ratio as large as the one we got. A p-value less than 0.15 gives a ‘preponderance of evidence’ against the null hypothesis. A p-value less than 0.05 gives ‘clear and convincing’ evidence against the null hypothesis. A p-value less than 0.01 gives evidence ‘beyond a reasonable doubt’ against the null hypothesis.

Profit Signal™ – Quantifies and rank orders which factors impact any business, manufacturing, or service process. It is the vector at the bottom right-hand, forward corner of the tetrahedron. The ratio of the length of this vector to the length of the noise vector in a correct analysis yields the F ratio that measures the strength of evidence.

Profit signal vector – Same as profit signal.

Pythagorean Theorem – The square of the long side of a right triangle is equal to the sum of the squares of the other two sides. c² = a² + b².

Tetrahedron – A three-dimensional figure with four triangular faces and six edges.

Vector – An arrow that defines magnitude and direction, connecting one point in space with another.

Vector analysis – The process of breaking up a raw data vector into perpendicular vector components of variation.
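
To make these definitions concrete, here is a short worked example of our own; it is not part of the original glossary. It is a minimal Python sketch, assuming the numpy and scipy libraries are available, that applies a vector analysis to a tiny two-column data matrix: it splits the raw data vector into a profit signal vector and a noise vector, confirms that the squared lengths add up as the New Management Equation says they must, and then reports the F ratio, the p-value and the confidence level.

    import numpy as np
    from scipy import stats

    # Data matrix: each row is an event; one column is the factor setting,
    # the other is the measured response (the raw data vector).
    factor = np.array(["A", "A", "A", "B", "B", "B"])
    y = np.array([4.0, 5.0, 6.0, 8.0, 9.0, 10.0])

    grand_mean = y.mean()
    fitted = np.array([y[factor == g].mean() for g in factor])  # group mean for each row

    signal = fitted - grand_mean    # profit signal vector
    noise = y - fitted              # noise vector

    # New Management Equation (Pythagorean Theorem) in n dimensions:
    # squared length of (y - grand mean) = squared signal + squared noise.
    total_ss = np.sum((y - grand_mean) ** 2)
    signal_ss = np.sum(signal ** 2)
    noise_ss = np.sum(noise ** 2)
    assert np.isclose(total_ss, signal_ss + noise_ss)

    # F ratio: each squared length divided by its degrees of freedom
    # (two groups, so 1 for the signal and n - 2 for the noise).
    df_signal, df_noise = 1, len(y) - 2
    f_ratio = (signal_ss / df_signal) / (noise_ss / df_noise)
    p_value = stats.f.sf(f_ratio, df_signal, df_noise)
    confidence = (1 - p_value) * 100

    print(f_ratio, p_value, confidence)   # 24.0, roughly 0.008, roughly 99.2

An ANOVA platform in JMP or Minitab reports the same sums of squares, F ratio and p-value; the point of the sketch is only to show the geometry behind those numbers.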


II. The Business Bookshelf

1. Aristotle’s books on Logic, Eudemian Ethics, and Politics


are essential. These texts outline the sequential, Inductive/
Deductive cycle of the scientific method: Hypothesis,
Experiment, and Test Hypothesis. Aristotle’s cycle is the
foundation for all science and Walter Shewhart’s original Plan,
Do, Study, Act cycle and M. Daniel Sloan’s IDEA Cycle:
Induction, Deduction, Evaluation, Action.

Posterior Analytics suggests that the triangle signifies truth.


Eudemian Ethics details the links between a respect for the
individual, knowledge, the pursuit of happiness, virtue, and a
good social order.

2. David Hume’s A Treatise of Human Nature, 1739, emphasizes the importance of sequential perceptions. Hume’s
ideas, writing, and thinking reflect the British personality
in applied science. Bacon, Nightingale, Newton, Darwin,
Fisher, and Box are names that can be culturally linked to
Hume’s work.

3. The Declaration of Independence, 1776. This classic


document addresses life, liberty, the pursuit of happiness,
justice, and a good social order. Jefferson, Franklin,
Washington, Adams, and other American revolutionaries were
Hume’s contemporaries. They communicated. The study of
philosophy, science, and mathematics were integral to their
lives.

4. Immanuel Kant’s Critique of Pure Reason, 1781, tackles


the complexity of quality. The circular logic of knowing, the
quality of judgments, relationships, and modality are dealt
with in one difficult, challenging text. Einstein specifically
honored this work as an inspirational force in his work.

5. Albert Einstein’s little book, Relativity, 1917. Einstein


introduces the idea that “the evolution of an empirical science
is a continuous process of induction.” Dr. Einstein’s ideas
on time, the sequential order of perceptions, measurement,
and analysis are landmarks. He specifically describes his use
of probability, the Pythagorean Theorem, and the Cartesian
coordinate system. He provides a complete listing of applied
science thought leaders: Euclid, Galileo, Kepler, Descartes,
Gauss, Hume, and Kant.

6. Ronald A. Fisher’s works from 1913-1935. (A good


resource for finding them is Collected Papers, Volumes 1-5 J.H.
Bennet, Ed. The University of Adelaide, 1971-1974).

At age 25, Fisher’s 1915 Biometrika, 10, paper entitled,


“Frequency Distribution of the Values of the Correlation
Coefficient in Samples from an Indefinitely Large Population”
introduces the idea of using geometry to represent statistical
samples. The Pythagorean Theorem, or New Management Equation, is the generalization: it applies to samples of any size.

Fisher’s 1921 Metron paper, “On the Probable Error of a


Coefficient of Correlation Deduced from a Small Sample,”
explains the logarithmic transformation of the correlation
coefficient r that leads to a near normal distribution. He
presented a table tabulating the transformation for each value
of r.

Statistical Methods for Research Workers, 1924. This book


details the practical application of a circular, inductive and
deductive logic cycle. The thirteenth edition credits W.
Edwards Deming for the extension of the z Table to the 0.1
level of accuracy.

The Design of Experiments, 1935, is a phenomenal, seminal


work. “Inductive inference is the only process known to us by
which essentially new knowledge comes into the world.” The
importance of experimental observations must be connected
to the “precise, deductive reasoning” of Euclidean geometry.

7. Clarence Irving Lewis, Mind and the World Order, Outline


of a Theory of Knowledge. 1929. This book inspired Walter
Shewhart. The philosophy of conceptual pragmatism
led to the development of the field of Six Sigma quality
improvement.

8. Walter A. Shewhart’s Economic Control of Quality of


Manufactured Product, 1931. This book includes illustrations
and ideas from Fisher’s work and his own unique perspective
on the importance of sequential data analysis. Induction
precedes deduction.

“The Nature and Origin of Standards of Quality,” was written
in 1935 and was published in the January 1958 issue of The
Bell System Technical Journal. He describes the character of
the continuous improvement cycle as legislative, executive,
and judicial.

Statistical Method from the Viewpoint of Quality Control,


1939, is taken from a series of US Agricultural Department
lectures delivered at the invitation of W. Edwards Deming.
Pages 44 and 45 contain the graphic illustration of a
continuous improvement cycle: Hypothesis (Legislative in
nature), Experiment (Executive in character), Test Hypothesis
(Judicial). Induction precedes deduction.

9. W. Edwards Deming’s Some Theory on Sampling, 1950, is a


noteworthy historical book. It was directly affected by Fisher
and Shewhart’s work. The first chapter addresses the primary
importance of the design of an experiment. Deming details
the geometry of sample variances on page 62.

“On the Distinction between Enumerative and Analytic Surveys,” The American Statistical Association Journal, June 1953, comes directly from Some Theory on Sampling.
This article shows the futility of using random samples for
analyzing a dynamic process.

“On a Classification of the Problems of Statistical Inference,”


June 1942, Number 218, Volume 37, gives Deming’s vision
of a quality controlled health care system.

Out of the Crisis, 1986. Here are fourteen points to ponder for a good social order in the workplace. His PDCA improvement cycle dates to Aristotle. Deming cites the influence of Shewhart, Clarence Irving Lewis, and Fisher. It is curious to note that Deming’s 1951 understanding of the importance of a designed experiment and the economy/geometry of samples is absent from this work.

10. Ludwig von Bertalanffy’s General System Theory, 1968.


This is a best-of-class book on systems thinking.

11. George Box, William G. Hunter, and J. Stuart Hunter,


Statistics for Experimenters, An Introduction to Design, Data
Analysis, and Model Building, 1978. This is a master work of
applied science. The pictures R.A. Fisher imagined are drawn.
Many of the important algebraic expressions Fisher wrote are
translated. Somehow Fisher’s ideas are simplified. Induction
precedes deduction.

One of the book’s essential points, Fisher’s main point, is hidden from view: the Pythagorean Theorem provides the foundation for all standard statistical theory.

12. Steve deShazer’s Clues: Investigating Solutions in Brief


Therapy, 1988, is the only therapy model and/or psychological
theory we know of that was developed using probability
theory, inductive reasoning, and flow diagrams. This model
for rapid improvement works well in systems of any size. Mr.
deShazer formally opposes a focus on defects, defectives, and
problems. Rather one should focus on solutions and doing
more of what works.

13. Darrell Huff, How to Lie With Statistics. This is the


definitive 20th Century work on the Big Bamboozle.

14. Roger Fisher and William Ury, Getting to Yes, Negotiating


Agreement without Giving In. This is the handbook for
teaching people how to bring Six Sigma breakthroughs to
fruition.

III. Evidence-Based Decisions, Inc.
Six Sigma Black Belt/ Expert
16 Class Curriculum Outline

Black Belt Core Study Texts and Free On-Line Resource book

1. Profit Signals, How Evidence-based Decisions Power Six


Sigma Breakthroughs, M. Daniel Sloan and Russell A. Boyles
PhD, Evidence-based Decisions, Inc., Sloan Consulting and
Westview Analytics, 2003.

2. Getting to Yes, Negotiating Agreement Without Giving In by


Roger Fisher and William Ury. (1991) ISBN 0-14-015735-2

3. Getting Ready to Negotiate, The Getting to Yes Workbook by


Roger Fisher and Danny Ertel. (1995) ISBN 0-14-023531-0

4. How to Lie With Statistics, Darrell Huff, (1954), ISBN 0-393-31072

5. Learning to See, Lean Value Stream Mapping workbook. http://www.lean.org/Lean/Bookstore/ProductDetails.cfm?SelectedProductID=9

6. Engineering Statistics Handbook, Free PDF download on-line Internet Resource, http://www.itl.nist.gov/div898/handbook/index.htm

7. Optional Show Stopper: Paper Flight, Complete, easy to follow instructions for making 48 different models that fly, by Jack Botermans. (1984) ISBN 0-8050-0500-5

Must Read, Master Black Belt, Bedrock Classics:

1. Economic Control of Quality of Manufactured Product, W. A.


Shewhart. (1932) ASQ Quality Press, Milwaukee, Wisconsin.

2. Statistics for Experimenters, Box, Hunter and Hunter.(1978)


ISBN 0-471-09315-7

Software Recommendations: Superior software is essential to
breakthrough improvements and bottom line business results.
Our course is published for students using Adobe Acrobat
5.0. Therefore, the de facto standard, Portable Document
Format (PDF), is a requirement for printing, reading, note
taking and electronic file attachments.

JMP 5.0, http://www.jmpdiscovery.com/index.html. As of September 2003, we believe this vector analysis program is the best in class. It is capable of handling virtually all of the analysis work required in Six Sigma breakthrough projects. Another application, Minitab, is also available: http://www.minitab.com/ We happily accommodate customers who prefer this excellent application.

Microsoft Excel. Six Sigma leaders must know how to use


Excel and its add-ins.

Crystal Ball by Decisioneering. http://decisioneering.com


This multi-variate, financial simulation tool is superb for
enlisting and retaining finance leader support. This tool can
be an excellent guide for project selection.

Quality America’s Excel SPC-IV add-in, http://qualityamerica.com/. Ease-of-use and a short learning curve make this program desirable for some executive champions.

Our course is distinguished by the speed with which Black


Belt candidates produce bottom line business results. The
rigor and relevance of the course content are structured
around the proven Six Sigma DMAIC cycle: Define, Measure,
Analyze, Improve and Control.

Course content covers the American Society for Quality’s


(ASQ) Six Sigma Body of Knowledge and uses Bloom’s
taxonomy of knowledge:

Knowledge

Black Belt Experts must be able to recognize terminology,


definitions, ideas, principles and methods.

Comprehension

Experts must be able to understand tables, reports, diagrams,


and directions.

Application

Experts must be able to apply principles, methods, and


concepts on the job.

Analysis

Experts must be able to break down data and information.


Statistical reasoning, analysis, and computing literacy are key.

Synthesis

Experts must expose unseen and informative patterns.

Evaluation

Black Belt Experts must be able to make judgments regarding


the value of proposed ideas and solutions.

Black Belt Course Outline

1. Defining Six Sigma: Introduction, Overview, and History – A Six Sigma Gestalt
1.1. Learning Objectives: Theory and practice.
1.2. Introductions.
1.3. The 5-Minute PhD: Vector analysis applied to a data
matrix.
1.4. The Complete Six Sigma Tool Kit: Categorical Catapult Experiment: 2³ Designed Experiment (DOE), the ANOVA, Scatter Diagrams, Regression, Correlation, Histograms, Pareto charts, Control Charts, Inductive and Deductive reasoning.
1.5. Four Essentials in a thorough 6 Sigma Analysis. JMP 5.0 (or Minitab 13) software navigation is introduced.
1.5.1. Calculate the Mean. Recognize that the mode and median exist.
1.5.2. Calculate the Standard Deviation: s and sigma,
σ.
1.5.3. Calculate Improbability – The F ratio
1.5.4. Graph Data in meaningful ways that illustrate the
mean, standard deviation and probability information.
1.6. Six Sigma: History, philosophy, goals and models.
1.6.1. The scientific method: Hypothesis, Experiment,
Test Hypothesis.
1.6.2. PDSA or PDCA: Plan, Do, Study, Act or Plan,
Do, Check, Act
1.6.3. The IDEA cycle: Induction, Deduction,
Evaluation and Action.
1.6.4. DMAIC : Define, Measure, Analyze, Improve,
Control.
1.7. Standards of Evidence: Evidence-based Profitability
Principles. Vector Analysis applied to a Data Matrix
1.7.1. Analogy (1931-2003): Legal System Decisions
1.7.2. Analogy: Management System Decisions
1.7.3. Interactive Dialogue: Assessing evidence in your
corporate culture.
1.7.3.1. Where you are today?
1.7.3.2. Where you want to be in your future?
1.8. An Enterprise View: Suppliers, Inputs, Process,
Outputs and Customers
1.8.1. Y = f(X1, X2, …, Xn)
1.9. The Six Sigma Lucrative Projects Results Map
1.10. Lucrative Project Selection
1.10.1. Calculating the Priority Projects Using Excel
matrix
1.10.2. Selecting and Leveraging Projects
1.10.3. Brainstorming
1.10.4. S.M.A.R.T. projects
1.10.5. Specific, Measurable, Achievable, Relevant,
and Time Bounded.
1.10.6. Project Charters and Planning Tools
Gantt and Performance Evaluation and Review
Technique (PERT) Charts
1.11. Designed Experiment Homework. Every class
participant will complete his or her first breakthrough
project this evening. Results will be recorded and
analyzed using JMP or Minitab for class presentations
during in class 2. In class demonstrations are
mandatory.

2. Define: Organizational Responsibilities and Financial Six
Sigma.
2.1. Homework Experiment Presentations using software
2.1.1. Designed experimentation demonstrations from
home. Typically these are spread out through the entire
day. By the end of the day people have memorized
software keystrokes for either Minitab or JMP. Both are
easy to master. Both yield identical answers. They are as reliable as the sunrise and sunset.
2.1.2. Getting to Yes workbook reports.
2.1.3. How to Lie with Statistics reading assignments and
discussion.
2.2. Learning Objectives
2.3. The Continuous Catapult Experiment: 2³ Designed Experiment (DOE)
2.3.1. Accuracy and Precision.
2.3.2. Predicting the future with the Profiler.
2.4. Six Sigma Language, Leadership and Job Descriptions.
“Kickin’ the heck out of variation,” led to the martial arts
metaphor.
2.4.1. Executive
2.4.2. Champions
2.4.3. Master Black Belts
2.4.4. Black Belts
2.4.5. Green Belts
2.5. Linking Organizational Goals and Objectives to Six
Sigma
2.5.1 What is different about 6 Sigma and other problem
solving tools?
2.5.2. Closed and Open Loop Feedback Systems
2.5.3. SWOT analysis of Sub-optimizing systems. Class
dialogue on cultural norms and issues related to this topic.
2.5.3.1. Strengths
2.5.3.2. Weaknesses
2.5.3.3. Opportunities
2.5.3.4. Threats
2.6. The New Management Equation - Old Equation
Comparison
2.7. Profit Signals workshop to include Chief Financial Officers and/or Controllers
2.7.1. Crystal Ball budget building Decisioneering
Tutorial Review
2.7.2. Futura Apartments Tutorial
2.7.3. Vision Research Tutorial
2.7.4. Hands on corporate example and demonstration
2.8. Project Documentation: Data, analysis, and evidence
do not speak for themselves. Visualize and plan your
breakthrough project presentation.
2.8.1. Spreadsheets
2.8.2. Story Boards
2.8.3. Phased Reviewed
2.8.4. Management Reviews
2.8.5. Executive Team Presentations
2.8.6. Homework. Design, build, and fly paper airplanes
according to your experimental array with your team.
Begin building a Crystal Ball model related to potential
Six Sigma projects.

3. Defining Six Sigma Project Selection and Benchmarking


3.1. Paper Airplane Homework Presentations
3.1.1 A Complete Six Sigma Pilot Project – Synectic
Experiment Paper Flight, Complete, easy to follow
instructions for making 48 different models that fly, Jack
Botermans. (1984) ISBN 0-8050-0500-5
3.1.2. Debriefing, Analogies, and Analysis.
3.1.2.1. Intuitive and counter-intuitive solutions.
3.1.2.2. Iterative learning and fun.
3.1.3. Project Timelines
3.1.4. Statistical Software Application practice.
3.2. Learning Objectives
3.2.1. A Catapult 2⁵ DMAIC Experiment.
3.2.2. Predicting the Future with categorical and
continuous variables.
3.2.3. Profiler: Optimization and Desirability
3.3. The 5 Whys
3.4. Six Sigma is a Business Initiative NOT a quality
initiative. The American Society for Quality’s Black Belt test
is discussed. As of 2003, there were no questions related to
vector analysis or the data matrix. Consequently, we cover the
entire list of recommended tools.
3.5. Negotiation techniques for Success: Getting to Yes.
3.5.1. Practical Applications.
3.5.2. Wise, Efficient, Build Relationships and BATNA
3.5.3 10 Principles for Getting to Yes
3.5.4. Dialogue discussion.
3.6. SIPOC Diagrams (Supplier, Inputs, Process, Outputs,
and Customer)
3.6.1. Brainstorm and draw one.
3.6.2. S.M.A.R.T. Projects and the SIPOC diagram.

3.6.3. Specific, Measurable, Achievable, Relevant, and
Time Bounded.
3.7. Project Charters and Paper Work.
3.7.1. Process Characterization and Optimization.
3.7.2. Brainstorming Critical To Quality Flight
Standards
3.7.2.1. Voice of the Customer (VOC)
3.7.2.2. Ground Rules for Nominal Group Technique
3.8. Benchmarking – Process Elements and Boundaries.
DMAIC is what your customers expect to see. Frame your
reports accordingly.
3.8.1. Defining - Design for Six Sigma
3.8.2. Measurement – Performance Metrics and
Documentation
3.8.3. Analysis: Mean, Standard Deviation, Probability,
Graph
3.8.4. Improvement
3.8.5. Control
3.8.6. Internal Best Practices using the complete Six
Sigma tool kit.
3.8.7. Comparing Machines, Production Lines, Plants,
and Shifts
3.8.8. Plant Visits and interviews.
3.8.9. Literature Searches: Internet and Company.
3.8.10. Independent Evaluations and public Financial
Reports.
3.8.11. Product Tear Downs and published books.
3.9. Textbook DMAIC breakthrough Case Study
presentation.
3.10. Project homework and reading assignments set.

4. Defining Process and System Capabilities


4.1. Review of Homework and Reading
4.2. Learning Objectives
4.3. The Complete Six Sigma Tool Kit: Vector Analysis
Applied to a Data Matrix. Repetition for mastery using candy
M&M Sampling, Sorting, and Analyzing. Hands-on Define,
Measure, and Analyze Experiments. Practice with JMP 5.0
Software Application
4.3.1. Populations versus Samples
4.3.2. Operational Definitions – Critical to Quality
Characteristics
4.3.3. Flow Diagramming the Production Process
4.3.4. Sampling our population of candy.
4.3.5. Histograms
4.3.6. Pareto Charts
4.3.7. Control Charts
4.3.8. Scatter Diagrams and Correlation Coefficients
4.3.9. 2³ Designed Experiment: Comparing the value of
systematic observation with simple arithmetic counts.
Understanding the context of multiple variables is the key
to breakthrough improvement projects.
4.4. The DMAIC Breakthrough Chart
4.4.1. Juran’s Trilogy
4.4.2. Shewhart’s P-Chart
4.5. Defects Per Unit
4.5.1. Calculating Defects per Million (DPU)
Opportunities
4.5.2. Motorola’s classic, proof reading example.
4.6. Six Sigma Values.
4.7. Cp and Cpk (see the calculation sketch at the end of this class outline)
4.7.1. Practical Applications using Dice
4.7.2. JMP 5.0 or Minitab Calculation Practice
4.7.3. Confidence Interval introduction.
4.8. 2⁸ Helicopter Designed Experiment
4.8.1. Emphasis of key concept. Compare M&Ms Enumerative Sampling with two-level, eight factor DOE analytic sampling.
4.9. Project Selection Focused Homework on Process
Capability
4.9.1. Calculate and graph Cpk for all 16 copters.
4.9.2. Calculate and graph Cpk for select individual
copters.
4.10. Homework: Read Quality Function Deployment white
papers for report.
4.10.1. Project selection updates including Crystal Ball
model.
4.10.2. Present results of tools applied in daily work.
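
Class 4 ends with capability calculations (item 4.7). The sketch below is ours, not part of the course materials: a minimal Python example, assuming numpy and using made-up measurements and specification limits, of the Cp and Cpk arithmetic practiced with the dice and the software.

    import numpy as np

    # Hypothetical measurements and specification limits, for illustration only.
    x = np.array([10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7, 10.1])
    lsl, usl = 9.0, 11.0                # lower and upper specification limits

    mean = x.mean()
    s = x.std(ddof=1)                   # sample standard deviation

    cp = (usl - lsl) / (6 * s)          # potential capability: spec width over six sigma
    cpk = min(usl - mean, mean - lsl) / (3 * s)   # actual capability: penalized for an off-center mean

    print(round(cp, 2), round(cpk, 2))  # roughly 1.64 and 1.62 for these numbers

JMP 5.0 and Minitab report the same two indices in their capability analyses; the hand calculation only shows what the software is doing.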

5. Define: Negotiation, Quality Function Deployment, and Data Mining Training
5.1. Homework reports, presentations, and dialogue.
5.2. Learning Objectives: Vector Analysis applied to a data
matrix and Evidence-based decisions.
5.3. Observe Designed Experiments DMAIC
Demonstrations by Students
5.3.1. Mean, Standard Deviation, Probability, and
Graphed Results.
5.4. Helicopter 2³ Confirmation Experiment
5.4.1. Iterations and efficient learning.
5.4.2. How does this analogy apply to your work?
5.5. Change Agents and Team Leadership: Pythagoras, Aristotle, to Frederick Douglass and Harriet Tubman, to 2004.
5.5.1. Cultural Influences
5.5.2. Innovation Adoption Model
5.5.3. Diffusion of Innovation
5.5.4. Adoption Process
5.5.5. Force Field Analysis – Forces Fighting Change
5.5.6. Change Agent Methods
5.5.7. Understanding and overcoming Road blocks
5.5.8. Negotiation – Getting to YES.
5.5.9. Motivation
5.5.10. Communication
5.6. Building a House of Quality - One proven method of encouraging concurrent engineering.
5.6.1. Using an Excel Template
5.6.2. Functional Requirements and Robust Design
5.6.3. Design for X (DFX): Design Constraints,
design for manufacturability, design for test, design for
maintainability.
5.6.4. The Whats
5.6.5. The Hows
5.6.6. “Correlation matrix” Trade Offs
5.6.7. The Four Phases, The Four Houses of Quality.
5.6.8. KANO Model of Quality
5.7. Excel Data Sorting Function – A brief history of data
mining.
5.7.1. Orthogonal Arrays
5.7.2. Homogeneous Fields and Records, Columns
and Rows
5.7.3. 2³ DOE Data Mining Demonstration and
practice.
5.7.4. A correct vector analysis: Thorough 6 Sigma
Analysis: The average, standard deviation, probability, and
analytic graph.
5.8. Homework: Outline project selections for class
presentation.
5.8.1. Bring data in spreadsheet formatted for data (sorting)
mining practice.

6. Measuring Value: Rolled Throughput Yield Metrics, &
Costs of Quality
6.1. Homework presentations and review of data mining
strategy.
6.2. Learning Objectives
6.3. The 2⁴ Quincunx Machine Experiment for JMP or
Minitab practice.
6.3.1. Observe the machine.
6.3.2. DMAIC Comprehensive definition process to
determine variables and outcomes.
6.3.3. Central Limit Theorem Simulations using
the machine and Decisioneering’s computerized
demonstration model.
6.3.4. Uncovering the “Hidden Factory”
6.4. Drawing the Value Stream - Lean Flow Charting
Fundamentals
6.4.1. Red Bead Sampling Game – Drive Down Costs
6.4.2. Interactive role playing using a game of historical
significance.
6.5. Rolled Throughput Yield (RTY)
6.5.1. Poisson Computer Simulation on Defects per Unit
6.6. Thought Process Mapping
6.6.1. Categorical Thinking
6.6.2. Universal Standards of Measurement.
6.7. Critical to Quality Tree
6.7.1. Identifying Critical to Quality Characteristics
(CTQ)
6.7.2. Customer needs, Drivers, Quantified CTQ
6.7.3. Relevant to business in financial, quality, and
productivity terms.
6.8. Collecting, Sorting, Developing and Translating
Customer Information
6.8.1. Surveys: Telephone, mailing, interview
6.9. Brainstorming
6.10. Cause and Effect Diagrams
6.11. Affinity Diagram Experiment
6.12. Costs of Poor Quality
6.12.1. Internal Failures
6.12.2. External Failures
6.12.3. Appraisal Costs
6.12.4. Prevention Costs
6.12.4.1. Detailed walk through of an exemplary
Cost of Quality corporate report. Excel Spreadsheet
template available.

6.12.5. Quality Cost Statement by Product Line


6.12.6. Taguchi Loss Function Example
6.12.7. Phillip Crosby’s Rule of 3
6.13. Homework Focus on Quality Costs Project Results

7. Measure: Process Mapping


7.1. Introduction
7.1.1. Workshop Purpose and Agenda
7.1.2. Learning Objectives
7.1.3. Homework Reports
7.2 Process and System Concepts
7.2.1. Process Model (SIPOC)
7.2.2. Why use a process model?
7.2.3. Systems and Processes.
7.2.4. Systems Thinking
7.2.5. Definitions
7.2.6. Process Categories
7.2.7. Goals of Process Design
7.2.8. A Primary Objective
7.2.9. What makes a process reliable?
7.2.10. Global Process Requirements
7.3. Documenting Processes
7.3.1. Why be concerned with information?
7.3.2. What is this common process?
7.3.3. Why document a process?
7.3.4. Structure your information
7.3.5. Balance document needs
7.3.6. Document design rules
7.3.7. A documentation survey tool
7.4. Techniques for process mapping
7.4.1. What is process mapping?
7.4.2. Why use process maps (flow diagrams)?
7.4.3. The mapping method
7.4.4. Define the process
7.4.5. Process model revisited
7.4.6. A Process Definition Tool
7.4.7. Define the Process
7.4.8. What is your purpose?
7.4.9. Process Customers
7.4.10. Process Boundaries
7.4.11. Outline for process definition
7.4.12. Exercise in process definition
7.4.13. Flow charting the primary process
7.4.14. What is a parallel process?
7.4.15. Adopt and use standard symbols
7.4.16. Other useful symbols
7.4.17. Example
7.4.18. Writing good narrative
7.4.19. Exercise: Primary Process
7.4.20. Flow charting alternative paths
7.4.21. Example
7.4.22. Exercise: Alternative paths
7.4.23. Add control points
7.4.24. Example
7.4.25. The decision question
7.4.26. Controls: Some considerations
7.4.27. Exercise: Control Points
7.4.28. Responsibility matrix
7.4.29. Exercise: Define responsibilities
7.5. Using alternate formats for process mapping
7.5.1. Types of maps
7.5.2. Simple flow chart
7.5.3. Top-down flow chart
7.5.4. Cross-functional flow chart
7.5.6. PERT chart
7.5.7. Decision tree
7.5.8. Data flow diagram
7.5.9. Geography flow diagram
7.5.10. Standardized Process Chart
7.5.11. Exercise: Remap
7.5.12. Finish the flow chart
7.5.13. Characteristics of a good flow chart
7.5.14. Key implementation points
7.6. Using maps to improve and streamline processes
7.6.1. Goals of process analysis
7.6.2. Elimination targets: waste, rework, delays, reverse
loops, and needless complexity.
7.6.3. Technique #1: Value assessment
7.6.4. Technique #2: Standardize
7.6.5. Technique #3: Using the map
7.6.6. Technique #4: Early control
7.6.7. Technique #5: Prevention
7.6.8. Technique #6: Analyze inputs
7.6.9. A process analysis tool.
7.6.10. Homework

8. Measure: The Productive Team Member


8.1. Introduction
8.1.1 Workshop purpose and agenda
8.1.2. Learning objectives
8.1.3. Homework report (six sigma project progress)
8.2. Characteristics of Effective Teams
8.2.1. Box Of Stuff exercise
8.2.2. Teams vs Groups
8.2.3. What things the Team must Manage
8.2.4. Inputs for a Successful Team
8.2.5. Sense Of Urgency—Good Or Bad?
8.2.6. Outputs of a Successful Team
8.2.7. Stages of Team Development
8.2.8 (Murder Mystery exercise)
8.2.9 Four Stages of Team Development
8.2.9.1. Norms and Team Development
8.2.9.2. A Sample “Code of Cooperation”
8.2.9.3. Overcoming Hindrances to Team Performance
8.2.9.4. Circle In The Square exercise
8.2.9.5. Competition versus Cooperation
8.2.9.6. Signs of Team Trouble
8.2.9.7. Groupthink
8.2.9.8. Ground Rules for Consensus
8.2.9.8. Five Approaches To Getting Unstuck
8.2.9.9. Team Roles and Responsibilities
Meeting Management and Leader skills
8.2.9.10. Member role and responsibilities
8.2.9.11. Improving Communication
8.2.9.12. The Johari Window
8.2.9.13. Communication Model
8.2.9.14. Types of Feedback
8.2.9.15. Practicing Feedback
8.2.9.16. Building “I-Statements”
8.2.9.17. Communication Breakdown
8.2.9.18. How To Correct Bad Listening Habits
8.2.9.19. Barriers to Good Listening
8.2.9.11. Learning Style Inventory
8.2.9.12. Strategies for Managing Change
8.2.9.13. Principles of Large-System Change
8.2.9.14. Change vs. Transition
8.2.9.15. External Forces For Change
8.2.9.16. Internal Forces For Change
8.2.9.17. The Four Room Apartment
8.2.9.18. Improving Team Performance
8.2.9.19. Team Self-Evaluation
8.2.9.20. Close

8.2.9. Assign Homework (Six Sigma Project Progress using
appropriate tools) Visit http://www.fmeca.com/ Read as much as
you can prior to the next class.

9. Measuring the Process: Failure Mode Effects Analysis (FMEA) Workshop. Criticality is included and emphasized.
9.1. History.
9.2. Definitions and Acronyms.
9.3. Walk through of the entire FMEA process will include
group work tools and methods introduced in effective team
member class.
9.4. Review output of product.
9.5. Homework: Present project progress and estimated
dollar savings using tools.

10. Analyze: Exploring, Summarizing, and Predicting using data.
10.1. Homework Review – Focus on Project Financial
Results
10.2. Learning Objectives
10.2.1. Be able to give examples of continuous,
categorical, count, pass/fail, and life data.
10.2.2. Use correct graphics to summarize measurement
data.
10.2.3. Use software to explore databases.
10.2.4. Explain central limit theorem using coin tosses.
10.2.5. Fit a normal distribution to measurement data
and assess goodness of fit.
10.2.6. Review process capability concepts with working
exercise.
10.3. Review: Types of Data
10.3.1. Traditional taxonomy
10.3.1.1. Nominal, ordinal, interval, ratio
10.3.2. More useful modern taxonomy
10.3.2.1. Attribute = categorical = discrete = nominal
10.3.2.2. Ordinal
10.3.2.3. Continuous = measurement = parameter =
variable
10.3.2.4. Time to failure = life data
10.4. Collecting, Recording and Analyzing Measurement
Data
10.4.1. Real-world examples
10.5. Review of graphics for measurement data
10.5.1. Stem and leaf diagram
10.5.2. Frequency histogram and Cumulative
Distribution Function (CDF)
10.5.3. Boxplots
10.6. Review of descriptive statistics for measurement data
10.6.1. Minimum, maximum and range
10.6.2. Mean and standard deviation
10.6.3. Plus and minus three standard deviations
10.6.4. Central Limit Theorem, coin tosses and Process
Capability
10.7. JMP Data Exploration Exercises
10.7.1. Entering data in rows and columns
10.7.2. Generating descriptive statistics and graphics
10.7.3. Producing a report: Integrating with Microsoft
Word, Excel, and Power Point
10.8. Workshop: Wooden sticks, calipers, data entry, and
worker variation.
10.8.0.1. Yield calculations for one-sided specs
10.8.0.2. Yield calculations for two-sided specs
10.8.0.3. Cumulative or “rolled throughput” yield
(Review and Reinforcement)
10.8.0.4. Gauge Reproducibility and Repeatability
Studies and Practice
10.9. Homework Focused on Project Results

11. Analyze: Inductive Reasoning Part 1 – Quantifying uncertainty in measurement systems (Formerly known as Hypothesis Testing)
11.1. Homework Review Focused on Project Results
11.2. Learning Objectives
11.2.1. Explain relationships between processes and
populations.
11.2.2. Identify default statistical models for
measurement, pass/fail, count and life data.
11.2.3. Express real-world problems in terms of statistical
models and population parameters.
11.2.4. Use Confidence Intervals to characterize or test a
process in terms of mean, standard deviation, three sigma
limits, capability indices, fraction defective or reliability.
11.3. Measurement Systems
11.3.1. Definition of a measurement system
11.3.2. Population sampling
11.3.3. Process Sampling
11.3.4. Measurement objectives

11.4. Measurement Uncertainty
11.4.1. The Normal, Binomial, Hypergeometric, Poisson, and Weibull distribution models
11.4.2. Accuracy and precision (Review and
Reinforcement)
11.4.3. The role of calibration procedures
11.4.4. Repeatability and Reproducibility (R&R)
11.4.4.1 Repeatability: dependability of the gauge
11.4.4.2 Reproducibility: dependability of gauge
operators and environment
11.4.5. Examples and exercises: Calibration and
Calibration Control: Penny for your Thoughts workshop
exercise.
11.5. The Seven Habits of Highly Statistical People:
Quantifying Uncertainty.
11.5.1. Statistical Inference
11.5.3. The law of likelihood and likelihood function.
11.5.4. Interval Estimation
11.5.5. Confidence and Evidence
11.6. Characterizing and Testing exercises
11.6.1. The “one-sided” fallacy.
11.6.2. Pass/Fail
11.6.3. Interpreting Opinion Polls
11.6.4. Sample Size Calculations
11.7. Chi Square and t distributions
11.8. Homework Focused on Project Results

12. Analyze – Inductive Reasoning Part II


12.1. Homework Review Focus on Project Results
12.2. Learning Objectives
12.2.1. Recognize statistical problems when they occur,
and be able to classify them as testing an objective,
comparing processes, or relating variables.
12.2.2. Identify appropriate null hypotheses for testing an
objective, comparing processes, and relating variables.
12.2.3. Choose appropriate test procedures based on type
of problem and type of data.
12.2.4. Use p-values to interpret the results of statistical
tests.
12.2.5. Explain the difference between correlation and
regression.
12.3. Statistical Hypotheses and Process Hypotheses
12.3.1. The null hypothesis.
12.3.2. Fair coin tosses.
12.3.3. P-values
12.3.4. Z statistics and the Z transformation
12.3.5. P values from Z statistics
12.3.6. P values from z or Chi squared distributions.
12.4. ANOVA The geometry of analysis
12.4.1. Likelihood ratio
12.4.2. Degrees of Freedom
12.4.3. P Values from the F statistic
12.5. Relating variables.

13. Analyze - Model Building, Data Mining, and Linear Regression
13.1. Homework and Six Sigma Project Progress Review
13.2. Quantifying the Strength of Evidence
13.2.1. Hypothesis testing revisited
13.2.2. Law of likelihood in comparison problems
13.2.3. Confidence interval for a difference
13.2.4. P-values
13.2.4.1. Mathematical definition
13.2.4.2. Operational interpretation
13.3. Sample Size Calculations
13.3.1. Smallest difference of practical significance
13.3.2. Power of detection
13.3.3. Example: comparing two opinion polls
13.4. Pass-Fail Data
13.4.1. Likelihood ratio test for equality of two or more
Binomial proportions
13.4.2. Test for equality of two or more Binomial
proportions (valid only for large sample sizes)
13.4.3. z test for equality of two Binomial proportions
(valid only for large sample sizes)
13.5. Number of Defects
13.5.1. Likelihood ratio test for equality of two or more
Poisson means
13.5.2. Test for equality of two or more Poisson means
(valid only for large sample sizes)
13.5.3. z test for equality of two Poisson means (valid
only for large sample sizes)
13.6. Continuous Measurements
13.6.1. F test for equality of two Normal standard
deviations
13.6.2. t test for equality of two Normal means
13.6.3. F test for equality of two or more Normal
means (Analysis of Variance) (valid only if all standard
deviations are the same)

13.7. Life Data (Time to Failure)
13.7.1. Likelihood ratio test for equality of two or more
Weibull distributions
13.8. Chi-square tests
13.8.1. Interpreting the table of the Chi-square
distribution
13.8.2. Tests of association in contingency tables
13.9. Workshop: Pennies for Your Thought
13.10. Regression Analysis
13.10.1. Scatter Diagrams (Review and Reinforcement)
13.10.2. Correlation is not causation
13.10.3. Linear Regression Models
13.10.3.1. “All models are wrong, some are useful.”
13.10.3.2. Straight-line regression
13.10.3.3. Multiple regression
13.10.3.4 Polynomial regression
13.10.4. Fitting Regression Models
13.10.4.1. The least squares estimates
13.10.4.2. The RMS error
13.10.4.3. Testing for significance of predictor variables
13.10.4.4. Predicted mean values
13.10.4.5. Confidence intervals for predicted mean
values
13.10.4.6. Confidence intervals for predicted
individual value
13.10.5. Regression diagnostics
13.10.6. The dangers of R2
13.10.6.1. Testing for lack of fit
13.10.6.2. Residual plots
13.10.6.3. JMP exercises
13.10.7. Workshop: Pennies for Your Thought
13.10.8. Homework with Project Focus

14. Improve – Experimental Design and Analysis


14.1. Learning Objectives
14.1.1. Be able to explain the difference between
optimization and screening experiments.
14.1.2. Calculate sample sizes for optimization
experiments.
14.1.3. Create matrices for optimization experiments.
14.1.4. Analyze data from optimization experiments.
14.1.5. Interpret and apply results from optimization
experiments.
14.2. Homework Review Focused on Projects
14.3. Introduction to Experimentation
14.3.1 Why should I do experiments?
14.3.2 When should I do experiments?
14.4. Concepts and Definitions
14.4.1. DOE Terminology
14.4.2. Experimental Unit
14.4.3. Sample Size
14.4.4. Response
14.4.5. Factor
14.4.6. Level
14.4.7. Design Point
14.4.8. Design matrix
14.5. Types of factors
14.5.1. Continuous
14.5.2. Categorical
14.5.3. Control
14.5.4. Noise
14.6. Do not experiment with one factor at a time! (OFAT Review and Reinforcement)
14.7. Design principles
14.7.1. Bold strategy
14.7.2. Factorial structure
14.7.3. Control group
14.7.4. Replication
14.7.5. Randomization
14.7.6. Blocking
14.8. Experiments with All Factors at Two Levels
14.8.1. Examples
14.8.2. JMP Steps
14.8.3. Exercises
14.9. Basic Design Process
14.9.1 JMP Steps
14.9.2 Exercises
14.10. Screening Experiments
14.10.1. Examples, Modified Design Process
14.11. Workshop: The Funnel Process

15. Improve - Process Optimization and Control


15.1. Learning Objectives
15.1.1. Describe iterative strategy for experimentation
15.1.2. Perform multiple response analysis.
15.1.3. Calculate sample sizes for robust optimization
experiments.
15.1.4. Create matrices for robust optimization
experiments.

15.1.5. Analyze data from robust optimization
experiments
15.1.6. Understand common cause and special cause
variation.
15.1.7. Describe a Reaction plan to out of control
conditions.
15.2. Review of Designed Experiments Homework
15.3. The Process of Experimentation
15.3.1. The experimental cycle
15.3.2. Types of experiments
15.3.3. Strategies for experimentation
15.4. Statistical Modeling
15.4.1. Standard assumptions
15.4.2. The method of least squares
15.4.3. Models for continuous factors
15.4.4. Models for categorical factors
15.5. Statistical Testing
15.5.1. Testing model coefficients
15.5.2. Testing for lack of fit
15.5.3. Exercises
15.5.4. Predicted values and residuals
15.5.5. Exercises
15.6. Multi-level Optimization Experiments
15.7. Response surface analysis
15.7.1. Quadratic models for continuous factors
15.7.2. Continuous × categorical interactions
15.7.3. Example and JMP exercises
15.8. Design process
15.8.1. Quadratic models for continuous factors
15.8.2. Continuous × categorical interactions
15.8.3. Sample size calculations
15.8.4. Example and JMP exercises
15.9. Workshop: the Funnel Process using robust
optimization and quality control, process improvements
15.10. Homework: Design of Experiments Project Focus:
Report project results in DMAIC format for final class.

16. Control - Optimization Experiments and Statistical Process Control
16.1. Learning Objectives
16.1.1. Multiple response analysis.
16.1.2. Rational sub-grouping.
16.1.3. Establishing baselines
16.1.4. Monitoring low failure rates.
16.1.5. Multivariate statistical process control
16.1.6. Short-Run SPC
16.2. Review of Designed Experiments Homework
16.3. Multiple Response Optimization
16.3.1. Example
16.3.2. “Multiple responses” is the rule, not the exception
16.3.3. Optimizing one response at a time will not work
16.4. Desirability functions (see the sketch following this class outline)
16.4.1. The three types of response objective
16.4.2. Constructing a desirability function for each
response
16.4.3. Constructing the overall desirability
16.4.4. Maximizing the overall desirability
16.4.5. Optimizing over subsets of the design region
16.4.6. JMP exercises
16.5. Robust Optimization Experiments
16.5.1. The concept of robust optimization
16.5.1.1. Optimize the mean
16.5.1.2. Minimize the variance
16.5.1.3. Examples
16.5.2. Strategy for design of robust optimization
experiments
16.5.2.1. Identify key noise variables
16.5.2.2. Define noise factor
16.5.2.3. Include noise factor in the design
16.5.3. Strategy for analysis of robust optimization
experiments
16.5.3.1. Apply multiple response technique
16.5.3.2. Maximize overall desirability
16.5.3.3. Minimizes variability for a given mean
16.5.3.4. Seeks best combination of close-to-target
mean and low variability
16.5.4. Statistical Process Control as a mind set and
strategy.
16.5.4.1. Acceptance sampling and broken promises.
16.5.4.2. Hands-on SPC experiments and software practice. Workshop: the Funnel Process exercises. Summaries for quick reference.
16.5.5. Thought process for designing an experiment
16.5.6. More on Sample size calculations
16.6. Homework: Design of Experiments Project Focus: Report project results in DMAIC format.
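
For item 16.4, here is a minimal sketch of our own (not JMP’s implementation) of the desirability idea, written in Python with numpy; the responses, limits and targets are invented for illustration. Each response is mapped to a desirability between 0 and 1, and the overall desirability is the geometric mean of the individual values. Maximizing that single number over the design region is what item 16.4.4 refers to.

    import numpy as np

    def desirability_target(y, low, target, high):
        # Map a response to [0, 1]: 1 at the target, falling to 0 at the limits.
        if y <= low or y >= high:
            return 0.0
        if y <= target:
            return (y - low) / (target - low)
        return (high - y) / (high - target)

    # Two hypothetical responses observed at one design point.
    d1 = desirability_target(y=52.0, low=40.0, target=50.0, high=60.0)
    d2 = desirability_target(y=8.0, low=5.0, target=10.0, high=12.0)

    # Overall desirability: geometric mean of the individual desirabilities.
    overall = float(np.sqrt(d1 * d2))
    print(round(d1, 2), round(d2, 2), round(overall, 2))   # 0.8, 0.6, roughly 0.69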


IV. Profit Signals Production Notes

We consciously chose to demonstrate Senior Master Black Belt, Six Sigma-level knowledge and skills in every aspect of the production of this book. Our lean production system included two authors. When necessary, we retained the illustration services of expert contractors, just-in-time.

We produced the electronic versions of our book independently. Kinko’s prints the four-color cover, perfect-bound paperback version on demand in a pull-production system. We carry only the inventory we need for personal and corporate use.

The Internet, computing power, and software allowed us to complete the entire writing and production of the book in 90 days. This is the classic Six Sigma project time line. We began by creating the Profit Signals title on June 18. We completed the work in PDF format on September 11, 2003.

Though it was an entirely chance coincidence, the completion of our book on this day was an appropriate way to celebrate liberty, freedom of speech, equality, applied science, democratic values, art and the pursuit of happiness. Evidence-based decisions are as important to world peace as they are to prosperity.

The applications that played primary roles are as follows:

• Microsoft Word® 2000 and 2002 were our primary composition tools.

• JMP 5.0® statistical software, manufactured by SAS, was our favored analytic program. We also use Minitab with clients who have that standard.

• Microsoft Excel® was used for spreadsheet screen captures and some graphics. Using Excel for data matrix vector analysis shows the amount of work required before a spreadsheet behaves like a reliable, rules-driven software analysis application.

• Microsoft Internet Explorer® was the web browser we used for Internet research.

• Microsoft PowerPoint® was frequently used by Russell for first-draft technical drawings.

• Adobe Illustrator® 10 transformed all illustrations into EPS files for production.

• Adobe Photoshop® 7.0 was used for certain photographic and graphic illustrations.

• Adobe Acrobat® 6.0 Professional helped us disseminate copies for review.

• Adobe InDesign® 2.0.2 allowed us to design, lay out, and construct our book.

• Crystal Ball by Decisioneering® was the Excel add-in we used to make the spreadsheet behave like a data matrix.

• Process Model®, a data matrix-based flow-diagramming program, was used to create flow diagrams.

• Quality America’s Excel add-in, the Statistical Process Control program, was used to produce a control chart.

• Dell desktop and laptop computers, a Hewlett-Packard LaserJet 1300, and an HP OfficeJet v40xi inkjet printer produced hard copy for old-fashioned proofreading and review.

In our opinion, using a similar set of programs in their daily work is not only a reasonable expectation for Black Belts, Master Black Belts and Executive Champions; it is essential to Six Sigma-powered project breakthroughs.

Profound thanks are due to our wives, Lynne and Michelle, and our wonderful children, Austin and Molly. Patience is their virtue. We love them.

In addition to the entire Adobe products technical support team, four individuals went well beyond the call of duty as we produced Profit Signals. Jack Benham introduced us in July of 2002. Good on ya’ matey. Onwards and upwards. Without Jack’s vision, leadership and masterful management skill there would be no Profit Signals.

Cheryl Payseno, our friend, colleague, nurse, and former hospital administrator, volunteered her case study on Breaking the Time Barrier. She also encouraged us to tackle the cost-accounting variance and break-even thinking head-on with the Premise’s second illustration, Figure 2.

Our friend, colleague, and final-copy proofreader Bethany Quillinan stepped into the fray to help us see our words through yet another set of eyes. She did a Six Sigma-quality job on a pressure-packed deadline.

Finally, our friend Bill Moore, the President of MedCath, Incorporated, Hospital Division, volunteered invaluable editorial support. The specificity of his constructive criticisms and the solutions he proposed strengthened the quality of our work immeasurably.

So, “Thank you very, very, very much Lynne, Austin, Michelle, Molly, Jack, Cheryl, Bethany and Bill.”

Index

Adams, John 217


Aladdin 179
analogy 44, 113, 153, 154, 174
analysis 9, 10, 13, 21, 33, 51
Analysis of Variance 37, 53, 58, 222
ANOVA 37, 55, 88
Archimedes 46
Aristotle 152, 161, 175, 221

Bamboozle 56, 57
belt grinding 142
Bernstein, Peter L. 191
Black Belt 19, 37, 76, 79, 90
Box, George E.P. 115, 153
break-even analysis 11, 179

CABG 34, 136


Calder, Alexander 165
Case Studies 21, 117
Cohen, Bernard 91
Confidence Level 81, 132
control chart 110, 203, 223
cornerstone of evidence 10, 14, 92, 118, 208

correlation 177
Corrugated Copters 22, 23, 194
cost-accounting variance analysis 11, 12, 15, 21, 53, 119,
177, 179
Cost of Poor Quality 111
Cpk 114, 194
credulity 50
critical thinking 57
Critical to Quality 99
CTQ 99
cube 38
cynicism 118

Darwin, Charles 178


data matrix 9, 11, 14, 16, 18, 19, 20, 21, 22, 29
data matrix geometry 124
da Vinci, Leonardo 43
defects per million 92
degrees of freedom 65, 167, 168
Delusions 56
Design of Experiments 44, 48
differences 32, 53, 54, 74
Disney, Walt 47, 178
Disraeli 118
DMAIC 21, 96, 130

Einstein, Albert 43, 46, 91


Einthoven, Willem 33
EKG 33
emergency department 128
Emerson, Ralph Waldo 222
Euclid 46
evidence-based decision 10, 13, 18, 23
Executive Committee 209

Fads and Fallacies 85


Failure Mode Effects Analysis (FMEA) 112
feedback 136

Feigenbaum, Armand V. 110
Feynman, Richard P. 194
fields 59
fingerprints 178
Fisher, Ronald A. 43
Five-Minute PhD 20, 152

G. Charter Harrison 179


GAAP 55
Galileo 223
Galton, Francis 55, 178
Galvin, Robert 87
Gantt, Henry L. 97
Generalization 9, 19, 31, 178
generalization 9, 16
Generally Accepted Accounting Principles 57
General Electric 110, 223
George E.P. Box 21
Gosset, William 32
Gould, Stephen Jay 177
Guinness 32

Harrison, G. Charter 179


Hidden Factory 218
hidden factory 108, 111
Hill, Sir Austin Bradford 135
Huff, Darrell 52
Hunter, William 74
Hunter, J. Stuart 74
hyperspace 38, 39, 43, 60

Imagineering 31

JCAHO 128
Jefferson, Thomas 217

JMP 131
Joint Commission 128

Kaizen-blitz 145
Keats, John 50
knowledge 36, 39

law of the universe 9, 66, 81, 158


lean 108
Length Of Stay 130

Mackay, Charles 56
main effect 39
Marconi 43
math phobia 220
Matreshka 106
Maxwell, James Clerk 91
measurements 9, 19, 153
Michelangelo 43
Minitab 88
Motorola 223
multiplication 15

NASA 194
Netter, Frank 33
Newton, Isaac 50
New Management Equation 63, 161, 175
Normal distribution 70
n dimensions 31

OFAT 103

P-value 81
p-value 72
Paine, Thomas
Paper Bags 14
Pareto chart 127, 140, 171
perpendicular planes 41
PERT 97
Picasso, Pablo 37
predicted values 11, 44, 165
process capability 113
process maps 94
Profit Signals 44
Pythagoras 21
Pythagorean Theorem 13

quarterly review 207

reasoning 81
records 123
refraction 50
regression modeling 177
Ronald Fisher 9, 12
Rothamsted 32
Russian dolls 106

Sagan, Carl 162


sample size 59
sample standard deviation 64
scientific management 13
Sculpey Clay 10, 164
Shewhart, Walter A. 209
Simulation 100
SIPOC 155
Sisyphus 180
Six Sigma 18, 89
Six Sigma theory 19

Six Sigma tools 19
Smith, Bill 18
spreadsheet 80
spreadsheet analysis 14
standards of evidence 20, 21, 49, 160
Stories 50
straight-line prediction 181
straw man 76
strength of evidence 83

Taylor, Frederick W. 84, 103


tetrahedron 10, 152, 165
Themis 83
Three Rs 23
Transparency 13
Turrell, James 47
Twain, Mark 59

variation 10
vector 10, 60, 152
vector analysis 9, 10, 13, 70, 76, 158, 171, 208
