belongs to someone else, not me. Confirmation bias leads us to ignore information that clashes with our prior beliefs. The availability heuristic leads us to overestimate the probability of events that we can easily recall, at the expense of considering rare but more serious events. And plan continuation bias causes us to continue our original plans of action even when conditions change and those plans no longer make sense.

Finger pointing ("It's their job!") and reinventing the wheel.

offices of senior executives to professional middle managers. Yet, increasingly complex operations require information from these areas, whether in health and safety or quality, to flow to senior executives and to be integrated into broader strategic decision making.
Systems Strike Back. If not implemented carefully, traditional approaches that emphasize environmental, safety, and quality controls and warning systems can backfire in several ways. First, having too many controls can create an environment ripe for risk creep: the process by which an unacceptably risky practice gradually becomes acceptable. A behavior that deviates from rules and best practices (for example, failing to follow some of the more arduous steps of a complicated procedure, or giving approval to a decision that almost meets safety standards) rarely results in disaster the first time it occurs. Then, as deviance continues without consequences, it develops into an accepted norm.

It becomes a routine part of the job to sign off on maintenance work orders without actually checking if they were completed, or to perform a quality inspection without a second set of eyes. Not even harrowing near misses (cases when dumb luck intervenes to avert disaster) halt this process. So risk creep continues and day after day causes seemingly small errors without consequences, until, one day, the consequences become all too real.8

Second, environmental, safety, and quality systems can create a feeling of security that encourages more risk taking, a phenomenon known as risk compensation.9 Drivers are less cautious when their cars are equipped with anti-lock brakes. And the captain of the RMS Titanic (a ship thought unsinkable because of its watertight compartments and modern design) raced along despite warnings of icebergs ahead.

Advanced technology, as was the case with UCSF's pharmacy robot, often contributes to failures in the first place. And Yale sociologist Charles Perrow has identified many of the features that trigger failure in complex, tightly coupled systems. Redundancies and safety systems, Perrow argues, are the biggest single source of catastrophic failure in such systems. Finally, when it comes to maintaining vigilance, research from NASA's flight cognition lab on experienced airline pilots shows how difficult this is. Researchers argue that it is unrealistic to expect flight crews to maintain vigilant monitoring of normally reliable systems. Yet, in many industries, operators are frequently blamed for environmental, quality, and safety lapses even when the causes are systemic.

To make matters worse, a toxic culture in some organizations prevents system-level thinking from being incorporated into business practices. Too often, managers blame employees for errors and discourage them from raising concerns. This deprives companies of valuable information about what is effective on the ground and suppresses warning signs of potential system accidents.
Figure 1

victims and penalties to the U.S. Government.17

signals of failure is that senior leaders can drive big decisions about efficiency and corporate strategy. Upon taking the helm of Alcoa, for example, CEO Paul O'Neill made worker safety his priority. By doing so, he opened up channels from the factory floor to the C-suite, creating fluid communication lines within his large and widely distributed organization. Not only did his approach increase safety; it also increased employees' voice (their ability and willingness to speak up and raise concerns) and allowed for suggestions of all kinds to flow across the organization.15

Skeptics strengthen organizations by digging into issues and doggedly pursuing problems until they're understood and ultimately solved.
12 INTELEX TECHNOLOGIES INC. Gather, Predict, Change: How Smart Leaders Tame Risky Systems
Step Two: Develop Data-Based Solutions

It is vital to collect and analyze reliable data about the behavior of complex systems. We need good data about weak signals of failure, but merely warehousing such data isn't enough. We need to turn the resulting knowledge into practical solutions.

Generating Leading Risk Indicators

Even in risky industries, major accidents are rare, so leaders need to predict risks by understanding small failures and near misses. Take the example of Pablo Garcia, the UCSF patient who had a massive overdose that resulted in a seizure and respiratory arrest. For every dramatic outcome like Garcia's, there are dozens of adverse drug events, from innocuous mis-dosings to the delivery of the wrong medicine, with few visible consequences. But for luck, each of these incidents might have had dire consequences. By measuring even harmless drug events, rather than just patient outcomes, hospital managers get a better idea of how things might go wrong. The same opportunity exists in other industries, as most equipment malfunctions or safety near misses don't cause injuries, downtime, or quality issues. But they do provide data on what is going wrong, a valuable way to understand system-level complexity.

Though such data are fundamentally retrospective (in that they look at what has already happened), they can be used to generate leading indicators and help managers shift from a reactive approach to a proactive one. In the aviation world, NASA's Aviation Safety Reporting System collects reports of safety incidents and makes them available to researchers. It also publishes a monthly newsletter that helps pilots understand the errors and near misses reported by their peers, most of whom they will likely never meet. The newsletter is a powerful tool to heighten awareness of potential risks.

NASA codes ASRS reports to help understand emerging trends in safety. In the same way, managers should harness the power of big data whenever possible. Data on near misses and other variables, such as overtime hours, on-time deliveries, and defect rates, can help managers predict which plants or groups are operating effectively, and which might be on the path to serious issues. But most traditional approaches to EHSQ management silo data across discrete software packages, making it difficult to combine the data into a meaningful picture. And even if EHSQ managers can leverage data across silos, they don't necessarily have enough data or specialist personnel to build a useful predictive metric.

An ideal approach to learning from near misses would incorporate information from several sources: from a firm's own EHSQ data, from incidents that occurred in other firms, and even from issues that arise in other industries. But most EHSQ professionals don't have access to such data sources and, without specialized help, lack the robust analytics needed to unlock their value.
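The idea of combining near-miss counts with variables such as overtime hours and defect rates into a single leading indicator can be sketched in a few lines of code. This is a minimal illustration, not any particular vendor's method: the site names, figures, equal weighting of metrics, and the 0.5 review threshold are all hypothetical.

```python
# Sketch: turn retrospective EHSQ data into a simple leading risk indicator.
# All site names, figures, and the review threshold are hypothetical.
from statistics import mean, stdev

# Monthly data per site: (near misses, overtime hours, defect rate %)
sites = {
    "Plant A": (4, 310, 0.8),
    "Plant B": (11, 520, 2.1),
    "Plant C": (3, 290, 0.6),
    "Plant D": (9, 480, 1.7),
}

def z_scores(values):
    """How far each site sits from the peer average, in standard deviations."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s if s else 0.0 for v in values]

names = list(sites)
metrics = [z_scores(col) for col in zip(*sites.values())]  # one list per metric

# Composite indicator: average z-score across the three metrics.
# Sites well above their peers on several metrics at once get flagged.
scores = {}
for i, name in enumerate(names):
    scores[name] = mean(metric[i] for metric in metrics)
    flag = "REVIEW" if scores[name] > 0.5 else "ok"
    print(f"{name}: composite risk score {scores[name]:+.2f} [{flag}]")
```

A real system would weight and validate such metrics against far more data; the point is only that the siloed spreadsheets most firms already keep contain the raw material for a predictive signal.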
Using Structured Decision Tools

Structured decision tools (cognitive aids and organizational practices that reduce the influence of decision biases) complement leading indicators and design-based interventions. By directly combatting decision biases, these tools reduce the frequency of errors that trigger escalating failures in complex systems. They also help us avoid underestimating the risks that come from hard-to-observe interactions within our systems.

Checklists, for example, help compensate for the limits of human memory and attention by listing explicit steps for safety and product inspections. Pre-determined criteria for risks and quality help us avoid confirmation bias and plan continuation bias by calling attention to lapses in quality or increasingly risky practices. Likewise, running a so-called pre-mortem exercise (a technique that asks team members to imagine that a major failure has already occurred and to imagine the underlying causes) reduces overconfidence and opens our thinking to complex risks that we might otherwise ignore. And teams composed of internal and external experts not directly involved in a given project (like the Risk Review Boards at NASA's Jet Propulsion Laboratory) can catch insiders' decision errors and ensure that unacceptable risks are challenged and mitigated. (See sidebar for a list of resources on structured decision tools.)

Structured Decision Tools

Quality control for your big decisions: A 12-question checklist
Kahneman, D., D. Lovallo, and O. Sibony. 2011. Before You Make That Big Decision. Harvard Business Review, 89(6): 50-60.

The premortem: A technique to unearth hidden risks
Klein, G. 2007. Performing a Project Premortem. Harvard Business Review, 85(9): 18-19.

SPIES: A simple tool for making better forecasts
Moore, D., and U. Haran. 2014. A Simple Tool for Making Better Forecasts. Harvard Business Review (online).

An excellent introduction to the biases that affect managerial decisions
Bazerman, M. H., and D. Moore. 2013. Judgment in Managerial Decision Making. Wiley (8th edition).
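A checklist only compensates for the limits of memory and attention if it actually blocks the shortcut, such as signing off on work orders that were never checked. One way such a cognitive aid might be enforced in software is sketched below; the step names and the sign-off rule are hypothetical illustrations, not a description of any particular EHSQ system.

```python
# Sketch: a checklist that refuses sign-off until every explicit step has
# been confirmed, blocking the "routine sign-off without checking" shortcut.
# The step names and sign-off rule are hypothetical illustrations.

class InspectionChecklist:
    def __init__(self, steps):
        # Every step starts unconfirmed; there is deliberately no "confirm all".
        self.status = {step: False for step in steps}

    def confirm(self, step):
        if step not in self.status:
            raise KeyError(f"unknown step: {step}")
        self.status[step] = True

    def outstanding(self):
        return [step for step, done in self.status.items() if not done]

    def sign_off(self):
        missing = self.outstanding()
        if missing:
            raise RuntimeError(f"cannot sign off, steps outstanding: {missing}")
        return "signed off"

checklist = InspectionChecklist([
    "verify work order actually completed",
    "second inspector reviewed the work",
    "results within pre-determined risk criteria",
])
checklist.confirm("verify work order actually completed")
print(checklist.outstanding())  # the two unconfirmed steps remain listed
```

The design choice mirrors the text: pre-determined criteria are encoded up front, so an "almost meets standards" approval surfaces as an explicit, visible exception rather than a quiet habit.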
Many things stand in the way of leaders who hope to drive meaningful change in the EHSQ realm. First, employees are generally skeptical of change and often see such initiatives as transient efforts that will pass. They resist change, particularly top-down efforts. And merely directing people to report anomalies or change their routines isn't enough. Hours of speeches and meetings can't compare with the effectiveness of a set of practical, concrete tools that leverage the intrinsic motivations of employees.

When the values an organization articulates as a cornerstone of its culture diverge from the behavior of its leaders and managers, it is nearly impossible to create a positive climate. Few actions do more to devastate an organization's climate than retaliation against employees who speak up. Wells Fargo, for example, recently paid a fine of nearly $200 million and fired over five thousand low-level employees for opening unauthorized customer accounts. The bank set aggressive targets for its bankers to cross-sell different products to existing customers. Yet despite the relevance of three of Wells Fargo's stated values, "People as a competitive advantage," "Ethics," and "What's right for customers," managers retaliated against employees who spoke out about the unrealistic sales targets and fraudulent practices, often by firing them in a way that prevented them from securing future jobs in the financial sector.19

Fortunately, by using practical solutions that integrate with what actually motivates people, leaders can demonstrate their commitment to a positive EHSQ climate. Consider a system that captures and analyzes weak signals of failure. Such a system must take into account human nature. A mobile app, for example, can effectively and quickly capture data, and in many settings, employees already carry their cell phones with them.

A firm can learn from its own experiences, but it cannot draw on others' data. It has a small-sample problem. Companies can overcome this problem by working with a knowledge broker: an organization that spans many, otherwise disconnected industries and uses this position to synthesize data and insights across the boundaries of different sectors, markets, and firms. There is abundant evidence that knowledge brokers are in a unique position to recognize how existing knowledge in one industry might be used to create breakthroughs in another. From inventor Thomas Edison's groundbreaking research lab in Menlo Park to the powerhouse design-consulting firm IDEO, knowledge brokers turn insights from one domain into solutions in other industries.

Since its founding in the late 1970s, for example, IDEO has developed thousands of successful products by moving between over 40 different industries that are usually isolated from one another. IDEO designed a new water bottle by incorporating a leak-proof nozzle based on an existing solution used for shampoo bottles. It adopted a tracking mechanism for the original Apple computer mouse from the giant trackballs used in video game machines. And it created a surgical skin stapler by incorporating ideas from model airplane engines and office staplers.22 As CEO David Kelley put it, "Working with companies in such dissimilar industries as medical instruments, furniture, toys, and computers has given us a broad view of the latest technologies available and has taught us how to do quality product development and how to do it quickly and efficiently."

Because a knowledge broker stands at the intersection of distinct domains, it benefits from what sociologist Ron Burt called a vision advantage: it taps into distinct, non-redundant streams of information. Because it spans otherwise disconnected firms and industries, a knowledge broker can access much more data, and much richer data, than a single firm that monitors only its own processes and outcomes. By spanning industries from mining to health care, and from retail to manufacturing, a knowledge broker can access a broad diversity of data about weak signals of failure, leading indicators of problems, and validated solutions. And rather than reinventing the wheel that already exists in another industry, it can recombine existing ideas into new solutions that work across multiple industries. Thus knowledge brokers can help firms move from simply measuring their internal processes to implementing novel and broadly validated solutions.

Beyond the vision advantage, knowledge brokers have another crucial benefit: they are independent.

solutions, these may never come into contact with one another in a productive way. Unless, of course, the firm itself has knowledge brokers who can span traditional organizational boundaries and learn from the otherwise separate silos. Knowledge brokers can select, and then synthesize, the best ideas from a wide range of alternative perspectives
or punishment.25
What can mountain climbing teach us about air crashes and nuclear meltdowns?
Roberto, M. A. 2002. Lessons from Everest: The Interaction of Cognitive Bias,
Psychological Safety, and System Complexity. California Management Review, 45(1): 136-158.
http://bit.ly/1HWshOw
ANDRÁS TILCSIK is an Assistant Professor of Strategic Management at the Rotman School of Management at the University of Toronto, where he developed and teaches the award-winning MBA elective Catastrophic Failure in Organizations. His research has been covered widely in media outlets, including The New York Times, The Economist, Forbes, The Washington Post, The New Yorker, and Bloomberg Businessweek, and cited in testimonies to committees of the U.S. Congress. His most recent research explores the factors that contribute to catastrophic failure in the banking sector. As a Fellow of the Michael Lee-Chin Institute for Corporate Citizenship, he is studying corporate practices and decision-making processes that reduce the risk and impact of environmental disasters. He is a graduate of Harvard University (Ph.D., A.M., and A.B.). With Chris Clearfield, he is a founding member of the Rethink Risk Project.

CHRIS CLEARFIELD is a principal at System Logic, a Seattle-based research and consulting firm that helps organizations manage the risk of system failure. Chris has written about failure and technology for the popular science magazine Nautilus, the Harvard Kennedy School Review, The Guardian, Forbes, Project Syndicate, the Harvard Business Review blog, and in a memo to the U.S. House of Representatives. He has given talks at Columbia, Princeton, the New York City Office of Emergency Management, the Jet Propulsion Laboratory, the Hilltop Institute, New York's Metropolitan Transportation Authority, and many other venues. Chris is a licensed commercial pilot and a graduate of Harvard University (A.B.), where he studied physics and biology.
Copyright 2017
Intelex Technologies Inc.
intelex.com
@intelex
/intelextechnologies
/intelex-technologies-inc.
/intelexsoftware