
Gather, Predict, Change:

How Smart Leaders Tame Risky Systems

By András Tilcsik and Chris Clearfield

INTELEX TECHNOLOGIES INC. | 1 877 932 3747 | INTELEX.COM
Executive Summary
Organizations face an increasingly complex and unforgiving EHSQ environment. The cost of black swan events and the cumulative impact of minor failures can be staggering. Addressing this challenge is not merely an issue of avoiding losses or complying with regulations. Effectively managing the risks of complexity can improve a firm's reputation and create a substantial advantage over competitors. Yet, many companies are still relying on traditional tools to manage EHSQ risks, and these tools often fall short of meeting today's challenges.

We outline three steps to developing a proactive, forward-looking, system-level approach to managing EHSQ risks. First, managers need to systematically track the rich data generated by daily operations. Second, they should convert these data into solutions by shifting from a backward-looking picture of what has happened to a predictive understanding of risk. Third, they need to use practical tools to implement solutions based on this new knowledge and drive lasting organizational changes. At all three stages, knowledge brokering, the process of gathering and synthesizing insights across organizational and industry boundaries, helps leaders build more effective EHSQ capabilities in their organizations.

This three-step approach, supported by knowledge brokering, not only facilitates better management of EHSQ risks but also serves as a catalyst for innovation.

The Shifting Risk Landscape
On a calm July night in 2013, a long freight train sat parked on the main track in a tiny Quebec village. The train, five locomotives and more than seventy tanker cars with almost two million gallons of crude oil, was unmanned. The engineer had parked it for the night. A few minutes before 1 a.m., the train began to move. It rolled downhill for nearly eleven kilometers, picking up speed. The ground shook as it reached 65 miles per hour and then, near downtown Lac-Mégantic, a picturesque small town, it derailed. The tankers burst into flames and explosions ripped through the town center. The violent inferno killed 47 people and destroyed dozens of buildings. Though there is often a temptation to pinpoint one person to blame, the investigation revealed a complex series of errors that combined to cause the accident. "These are complex systems that are beyond the understanding of most of us," said Mark Lalonde, a risk expert familiar with the accident. "It's harder, it's messier and it's more confusing to understand the factors that came together to cause such a tragedy."

Just three weeks later, but far from Lac-Mégantic, a different kind of crisis unfolded. Pablo Garcia, a sixteen-year-old, was admitted to the medical center of the University of California, San Francisco (UCSF), one of the world's most technologically advanced hospitals, for a routine procedure. The night before the procedure, Pablo's doctor prescribed a dose of antibiotics to stave off infection. Pablo should have been given one double-strength pill. Instead, his nurse gave him 39 pills, an overdose of such a startling magnitude that, when it was discovered, no one knew what to do. Hours after the overdose, Pablo had a seizure and stopped breathing. He nearly died. An investigation found that, rather than provide protection against the error, UCSF's computerized prescription system, coupled with a pharmacy robot, actually cleared the way for the failure that poisoned Pablo. "We're learning that the magic of information technology, so familiar to us in the consumer world that it nearly seems normal, is far more elusive in the world of medicine," wrote Bob Wachter, a UCSF doctor who analyzed the case.

The same year saw significant failures in many other industries, too. Take the food industry, for example. Contaminated food in a primary school in India killed more than twenty children. Rat poison was found in cartons of lettuce in Germany. Several European countries reported nationwide contamination of milk with poisonous, cancer-causing chemicals. And scandals involving the presence of horsemeat, in what was supposed to be beef, revealed serious limits to the traceability of the food supply chain in the European Union. "The key issue is that the supply chain has become far too complex," said the CEO of a British supermarket chain. "And when you introduce complexity, you introduce risk."


During any recent year, far too many tragic examples highlight the same issue: the growing vulnerability of the increasingly complex systems that underlie our businesses, infrastructure, and economy. Recently, for example, the restaurant chain Chipotle faced food contamination issues that drove away consumers; technology powerhouse Samsung recalled its flagship smartphone due to battery-caused fires; and several major airlines canceled thousands of flights because of technology failures in reservation and flight dispatch systems. These issues haven't escaped the attention of business leaders. In a recent survey of hundreds of C-suite executives, nearly 60% agreed that the volume and complexity of their risks had increased significantly over the past half-decade.1

The cost of failing to address these risks can be staggering. Complex systems that blindsided executives were at the center of several mega-failures that imposed massive financial and reputational costs. The Deepwater Horizon accident, for example, cost BP more than $60 billion. In 2014, GM's ignition recall cost more than $4.1 billion in victim compensation, repair costs, and other expenses. The CFO of Chipotle estimated that its food-safety problems could cost it 7% of its entire customer base: customers who may never return. And the cost of Samsung's phone recall: more than five billion dollars and billions more in lost revenue.

But it is not just headline-grabbing catastrophes and black swan events that shape the modern risk landscape. Over time, even minor failures can add up to significant losses. Employers in the United States, for example, pay almost $1 billion each week in direct compensation costs for workplace injuries and illnesses.2 In hospitals, roughly 15% of the money spent on care is due to patient safety errors.3 And even small supply chain disruptions, like the temporary closure of a port, can cause major problems if they cause temporary stock-outs that frustrate customers.

Solving these challenges is not merely an issue of avoiding losses or complying with regulations. Effectively managing the risks of complexity can improve a firm's reputation and create an advantage over competitors. And new research, for example on the best-performing teams at Google, shows that the solutions to these challenges can even serve as a catalyst for innovation.

The Challenge of Increasingly Complex Systems
Though the above failures take place in different industries with different regulations, concerns, and cultures, they are linked by a similar set of underlying causes. In each failure, a series of small errors combined into a system accident.

Most of the time, our systems behave predictably and as intended: airlines fly safely, stock markets are stable, and lights turn on when we flip a switch. In relatively simple systems, incidents in the environmental, health, safety, and quality (EHSQ) domain arise from predictable disruptions with generally well-understood consequences. In such systems, when a small chemical spill happens, for example, there may be immediate mitigation and regulatory considerations. Similarly, if a safety feature breaks, it is usually obvious what the potential consequences might be: injuries, equipment downtime, or lost productivity, for example.

But as complexity increases, systems become less predictable. Complex systems are like elaborate webs: their parts are intricately linked and can influence and unexpectedly disrupt one another. And the state of a complex system is difficult to observe directly. We usually need to rely on indirect indicators to figure out what's going on.

Failure in complex systems comes from unexpected interactions between distinct small failures, like mechanical problems on an aging locomotive and the failure of an engineer to correctly run a brake test, as happened at Lac-Mégantic. Complexity creates more opportunities for these interactions and makes it more difficult for designers and operators to anticipate what things might go wrong, either because there are so many sources of failure that it's hard to prioritize, or because the nature of the system itself makes it difficult to understand what exactly is going on. And when parts of the system are tightly connected, a small trigger can easily push things over the edge, like a misstep by a backcountry skier on a mountainside ripe for an avalanche. In these cases, even when we do understand what's happening, we often can't stop things in time to prevent a full-blown failure. These systems are ripe for unexpected transitions from typical operations to the wildness of system failure.4

Complexity in modern organizations comes from many sources. First, businesses increasingly rely on technology in their day-to-day operations. While technology adds new capabilities, as in the UCSF case, it can also cause new problems and compound existing ones. Technological systems are often made to operate in ways for which they weren't originally designed, for example by interacting with external partners' systems or even legacy internal systems, requiring manual workarounds or layers of software patches. These systems also often lack robust backups. In the past few years, for example, American, United, Delta, and Southwest Airlines all experienced computer problems which, in the absence of an effective and readily usable backup system, grounded thousands of flights, cost the airlines hundreds of millions of dollars, and caused severe reputational damage. Even seemingly benign systems have the potential for catastrophe, as Target found when hackers gained access to its point-of-sale terminals through its HVAC vendor's computer network and caused massive losses for the retail giant.

EHSQ Leadership Summit
Second, as the organizations that use these complex technologies grow in size and scope, their structure and operations also become more complex. As a result, rigid silos develop. "Not my job" becomes "not our job," as each function, division, and department understands and tackles only its own piece of the business. Distinct organizational subcultures emerge, each with its distinct priorities and routines. Finger pointing ("It's their job!") and reinventing the wheel ("We already have a process for that!") are common symptoms. On paper, there might exist a neat organizational chart with clear lines for reporting and collaboration, but the actual organization is more like a fragmented, impenetrable system full of barriers and disconnected parts. This kind of complexity makes it especially difficult to develop EHSQ solutions, which often require cross-functional knowledge and monitoring. At GM, investigators identified these issues as a key contributor to the ignition switch failure. They called this tendency the "GM Salute," defined as "a crossing of the arms and pointing outward towards others, indicating that the responsibility belongs to someone else, not me."

Third, complexity often comes from changing regulations that don't always harmonize with the latest technological developments. In finance, for example, regulatory approaches to electronic trading often tie back to the traditional floor trading model instead of drawing from modern-day software best practices. And the offshore drilling rules in place when Deepwater Horizon exploded did not reflect the latest standards and best practices in the industry. Even well-intended rules, with additional requirements for compliance and auditing, increase complexity.5

Finally, changes in the external environment increase complexity. Climate change, for example, causes not just a shift in mean temperature but also a higher likelihood of extreme events like severe droughts and floods, all of which can affect businesses by disrupting supply chains, preventing employees from coming to work, and potentially destroying key sites. And competition pushes companies to trim operations even as they push the envelope of their capabilities. For example, even as it dug one of the deepest wells in history, BP skipped cement tests to save money and shave time off its million-dollar per day lease of Deepwater Horizon.

As a system becomes more complex, its managers and operators become more likely to make decisions that lead to mishaps.6 This is because the environment in which human cognition evolved was very different from today's complex world. That environment did not prepare us to manage today's systems; instead, it left us with hardwired cognitive biases. Confirmation bias, for example, makes us ignore information that clashes with our prior beliefs. The availability heuristic leads us to overestimate the probability of events that we can easily recall at the expense of considering rare but more serious events. And plan continuation bias causes us to continue our original plans of action even when conditions change and those plans become problematic.7

These fundamental limitations are an endless source of minor mistakes, oversights, and misunderstandings in nearly everything we do, but most of the time it doesn't matter. In relatively simple systems, like an assembly line that we can directly observe and control and, if necessary, shut down, these biases tend to be less dangerous. If things go wrong because of human error, we can quickly spot problems and intervene. But in many modern systems, the problems these hardwired biases cause are hard to detect and comprehend in real time, and their ripples travel far.

The Limits of Traditional Approaches
Missing the Forest for the Trees. Though the risk landscape is changing rapidly, many companies are still relying on traditional tools to manage risks and operations. These tools fall short when operating a complex and tightly coupled system.

As the practices around environmental, health, safety, and quality issues have matured, they have migrated from the offices of senior executives to professional middle managers. Yet, increasingly complex operations require information from these areas to flow to senior executives and to be integrated into broader strategic decision making.

There is a growing disconnect between traditional approaches to risk management and the challenges that arise when operating in a complex EHSQ environment. The measures that once served executives well in managing risk (instituting rules and controls, adding safety systems, and relying on alarms to alert operators of problems) are no longer sufficient to achieve measurable improvement. These traditional approaches to risk management focus on the trees but miss the forest. They manage some of the parts but not the sum of the parts.

Consider Transocean, the company that owned and operated the Deepwater Horizon oil rig. In many ways, Transocean had a strong safety culture at a tactical level, one that focused on reducing worker injury. But it lacked a strategic approach to system safety. On the morning of the deadly explosion that marked the start of BP's massive spill, the rig's boss presented a drill hand with a handsome silver watch, a reward for spotting a worn bolt during a recent safety inspection. At the same time, workers were frustrated with Transocean's approach to system safety. Only a few weeks before the explosion, they reported that drilling priorities "[took] precedence over planned maintenance," and that the safety culture on the rig "had little influence at Divisional or Corporate levels."

By emphasizing easy-to-measure items at the expense of a broader safety culture, Transocean produced statistics showing a steady fall in incidents, creating a convincing illusion of safety. But without connecting the dots back to system safety, the company missed the forest for the trees.

Indeed, the illusion of safety persisted even after the Deepwater Horizon accident. "Notwithstanding the tragic loss of life in the Gulf of Mexico, we achieved an exemplary statistical safety record as measured by our total recordable incident rate and total potential severity rate," read Transocean's annual report for 2010. "As measured by these standards, we recorded the best year in safety performance in our company's history, which is a reflection on our commitment to achieving an incident-free environment, all the time, everywhere." Though it was a year that saw the worst accident in the industry's history, traditional safety measures showed it was a great year. This is an important lesson about the dangers of choosing the wrong metrics. Whether in dealing with health and safety, quality, or environmental issues, measurement for measurement's sake is not the answer.

Systems Strike Back. If not implemented carefully, traditional approaches that emphasize environmental safety and quality controls and warning systems can backfire in several ways.

First, having too many controls can create an environment ripe for risk creep, the process by which an unacceptably risky practice gradually becomes acceptable. A behavior that deviates from rules and best practices (for example, failing to follow some of the more arduous steps of a complicated procedure or giving approval to a decision that almost meets safety standards) rarely results in disaster the first time it occurs. Then, as deviance continues without consequences, it develops into an accepted norm.

It becomes a routine part of the job to sign off on maintenance work orders without actually checking if they were completed, or to perform a quality inspection without a second set of eyes. Not even harrowing near misses, cases when dumb luck intervenes to avert disaster, halt this process. So risk creep continues and day after day causes seemingly small errors without consequences, until, one day, the consequences become all too real.8

Second, environmental, safety, and quality systems can create a feeling of security that encourages more risk taking, a phenomenon known as risk compensation.9 Drivers are less cautious when their cars are equipped with anti-lock brakes. And the captain of the RMS Titanic, a ship thought unsinkable because of its watertight compartments and modern design, raced along despite warnings of icebergs ahead.

Third, EHSQ systems might also contribute to alarm fatigue, causing critical alerts to be missed or ignored. For example, caregivers in UCSF's intensive care units deal with 2.5 million audible alarms each month, the majority of them false. In such an environment, the noise of false or trivial alarms overwhelms truly important warning signals.

So what's the solution? Conventional wisdom suggests that newer technologies, more extensive EHSQ measures, and more vigilant operators will solve these challenges, but research shows these approaches are insufficient in a complex and tightly coupled system. Advanced technology, as was the case with UCSF's pharmacy robot, often contributes to failures in the first place. And Yale sociologist Charles Perrow has identified many of the features that trigger failure in complex, tightly coupled systems. Redundancies and safety systems, Perrow argues, are the biggest single source of catastrophic failure in complex, tightly coupled systems. Finally, when it comes to maintaining vigilance, research from NASA's flight cognition lab on experienced airline pilots shows how difficult this is. Researchers argue that it is unrealistic to expect flight crews to be able to maintain vigilant monitoring of normally reliable systems. Yet, in many industries, operators are frequently blamed for environmental, quality, and safety lapses even when the causes are systemic.

To make matters worse, a toxic culture in some organizations prevents system-level thinking from being incorporated into business practices. Too often, managers blame employees for errors and discourage them from raising concerns. This deprives companies of valuable information about what is effective on the ground and suppresses warning signs of potential system accidents.
Solutions
Managing complex, tightly coupled systems requires a proactive, forward-looking, system-level approach. This helps managers better understand systemic risk and fosters an organizational culture that can identify and seize opportunities. Rather than focusing exclusively on compliance or the parts of a system, this approach considers the entire system and the possibility that small errors turn into major failures.

To take such an approach, managers need to systematically track, and learn as much as possible from, the rich data generated by daily operations. They then need to convert these data into solutions by shifting from a backward-looking picture of what has happened to a predictive understanding of risk. Next, managers should use practical tools to implement solutions based on this new knowledge and drive lasting organizational changes. At all three stages, knowledge brokering, the process of gathering and synthesizing insights across organizational and industry boundaries, helps leaders build more effective EHSQ capabilities in their organizations.

Figure 1

Step One: Track And Learn
Before most accidents, one or more insiders had serious reservations about the decisions or procedures in place, but either failed to share these concerns or did share them, only for those concerns to fall on deaf ears. The history of the weeks and months leading up to an incident is often a history of smaller failures, near misses, glaring irregularities, and other indications that something is amiss. But information about these symptoms usually resides with employees on the ground, who often feel uncomfortable disclosing errors, expressing dissenting views, and questioning established procedures.10

Experiments show that even when we succeed by dodging a bullet, when blind luck lets us avert disaster, we tend to assume that the process that led to it was fundamentally sound.11 Before the Deepwater Horizon explosion, for example, BP managers relied on their experience with earlier, successful projects when they disregarded information that they didn't have the right equipment on board to cement the well. "But, who cares, it's done, end of story, [we] will probably be fine and we'll get a good cement job," wrote a BP drilling engineer.12

When information about near misses makes its way up the ranks, managers might fail to diagnose it as a warning sign, and often with good reason. In complex organizations, some troubles on the ground (minor errors, lapses, and unusual but manageable events) are commonplace, even expected. That a single, particular small event was, in fact, a warning sign is often not obvious until after a disaster occurs and its logic is finally understood. But such irregularities tend to be reported haphazardly and are rarely analyzed rigorously, so managers might see some of the trees but won't know they are in a forest.

Track Anomalies

By systematically learning about errors, near misses, and anomalies across the organization, higher-level managers can piece together a mosaic of insights into potential meltdowns. But merely providing employees with a close-call reporting system is not enough. A sophisticated web form gathers no input if employees with valuable information never use it. For example, during our research, when we asked healthcare workers about error-reporting systems, we got knowing nods and confirmation that their hospital used such a system. But when we asked nurses when the last time they submitted a report was, many looked surprised, as if they didn't understand the question: "Oh, I never use it. I don't want to get fired!"

Implementation is key to reaping the potentially tremendous benefits of such systems. Imagine a worker who, while walking to her assembly station on the factory floor, notices a pool of liquid on the floor. First, it needs to be easy for her to report the problem. Second, the information that she captures (which ideally includes a photograph or rich description of the issue) needs to be quickly looked at by people who can direct someone to fix the problem, in this case, to clean up the spill. Third, such a fix needs to be implemented in a timely fashion so the reporter can see that her report made a difference.

Learn from Weak Signals of Failure

In addition to fixing the problem, managers need to realize that a report may merely be the tip of the iceberg. They need to dig into root causes. There may be other issues that need to be addressed beyond cleaning up our hypothetical spill. Is the spilled liquid hazardous in any way?

Did any go into a drainage system, which might necessitate a report to the local environmental regulator? Where did the spill come from: was it from a localized source, or is there a valve leaking somewhere? If from a valve, was it just left open accidentally, is there a manufacturing flaw that might cause additional valves to fail, or are maintenance workers accidentally over-tightening them? In getting to the bottom of things, leaders need to approach the issues without assigning blame. Otherwise, information about problems will stop coming.

Beyond fixing a problem, organizations need to audit whether such a fix had the intended effect. If maintenance workers were overtightening valves, did additional training fix the issue? Did it happen to cause other problems, like under-tightening a different set of valves? And looking back months later, did the issue stay fixed or did workers return to their previous patterns?

Research shows that incident reporting systems form an essential part of how decision makers "anomalize," that is, treat minor errors as potentially significant details rather than as normal, expected events.13 By digging in and investigating root causes, reporting systems can be an important way of consolidating siloed information from distinct parts of the system and forming a richer understanding of an organization's systems, and how they might go astray.14

Learning Where It Matters

Another key benefit of systematically learning from weak signals of failure is that senior leaders can drive big decisions about efficiency and corporate strategy. Upon taking the helm of Alcoa, for example, CEO Paul O'Neill made worker safety his priority. By doing so, he opened up channels from the factory floor to the C-suite, creating fluid communication lines within his large and widely distributed organization. Not only did his approach increase safety; it also increased employees' voice, their ability and willingness to speak up and raise concerns, and allowed for suggestions of all kinds to flow across the organization.15

This approach can reduce the dangers posed by complexity. Complex systems don't fail because of glaring errors. Individuals usually don't have all the puzzle pieces they need to figure out big problems on their own. Systematically learning from weak signals of failure is one way to open up communication channels to let nagging worry through. And by fostering organizational skepticism, leaders can empower workers, engineers, and managers to identify problems before they become costly headline-grabbing disasters. Skeptics strengthen organizations by digging into issues and doggedly pursuing problems until they're understood and ultimately solved.16

Contrast this approach with what occurred at General Motors before its ignition switch recall. Initially, engineers and managers within GM ignored reports that a GM engineer had designed a faulty ignition switch. But that engineer ignored the problem, and teams reviewing the issue failed to understand the safety implications. Later, even as fatalities mounted, senior leaders remained in the dark. GM's culture lacked skepticism, and its engineers paid little attention to these weak signals of failure. As a result, concerns didn't rise to the top quickly enough. As an outside investigator reported, "While the issue of the ignition switch passed through numerous hands at GM, from engineers to investigators to lawyers, nobody raised the problem to the highest levels of the company. As a result, those in the best position to demand quick answers did not know questions needed to be asked." Ultimately, because of such failures, GM's cars killed people, and the company paid out over a billion dollars in compensation to victims and penalties to the U.S. Government.17
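The reporting loop described in this step, easy capture, quick triage, a fix the reporter can see, and a later audit, can be sketched in code. This is a hypothetical illustration only: the record fields and helper functions below are invented for the spill example and are not any vendor's actual schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical sketch of a close-call report moving through the
# capture -> triage -> fix -> audit loop described in the text.
@dataclass
class NearMissReport:
    reporter: str
    description: str                     # ideally rich: what, where, a photo reference
    reported_on: date
    hazard_check: Optional[str] = None   # triage answer: is the substance hazardous?
    root_cause: Optional[str] = None     # found by investigation, not by the reporter
    fix_applied: Optional[str] = None
    audit_result: Optional[str] = None   # months later: did the fix hold?
    status: str = "open"

def triage(report: NearMissReport, hazard_check: str) -> None:
    """Step two: someone who can act reviews the report quickly."""
    report.hazard_check = hazard_check
    report.status = "triaged"

def close_with_fix(report: NearMissReport, root_cause: str, fix: str) -> None:
    """Step three: fix it promptly, so the reporter sees the report mattered."""
    report.root_cause = root_cause
    report.fix_applied = fix
    report.status = "fixed"

def audit(report: NearMissReport, result: str) -> None:
    """Follow-up: verify the fix held and did not create new problems."""
    report.audit_result = result
    report.status = "audited"

# The spill example from the text, end to end.
r = NearMissReport("assembly worker", "pool of liquid near station 4", date(2024, 3, 1))
triage(r, "non-hazardous coolant; no drain exposure")
close_with_fix(r, "leaking valve, over-tightened during maintenance",
               "valve replaced; torque guidance updated")
audit(r, "no recurrence after three months")
print(r.status)  # -> audited
```

The point of the sketch is the explicit `audit_result` field: a report is not finished when the spill is mopped up, but when a later check confirms the root-cause fix actually held.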
Step Two: Develop Data-Based Solutions
It is vital to collect and analyze reliable data about the behavior of complex systems. We need good data about weak signals of failure, but merely warehousing such data isn't enough. We need to turn the resulting knowledge into practical solutions.

Generating Leading Risk Indicators

Even in risky industries, major accidents are rare, so leaders need to predict risks by understanding small failures and near misses. Take the example of Pablo Garcia, the UCSF patient who had a massive overdose that resulted in a seizure and respiratory arrest. For every dramatic outcome like Garcia's, there are dozens of adverse drug events (from innocuous mis-dosings to the delivery of the wrong medicine) with few visible consequences. But for luck, each of these incidents might have had dire consequences. By measuring even harmless drug events, rather than just patient outcomes, hospital managers get a better idea of how things might go wrong. The same opportunity exists in other industries, as most equipment malfunctions or safety near misses don't cause injuries, downtimes, or quality issues. But they do provide data on what is going wrong, a valuable way to understand system-level complexity.

Though such data are fundamentally retrospective (in that they look at what has already happened), they can be used to generate leading indicators and help managers shift from a reactive approach to a proactive one. In the aviation world, NASA's Aviation Safety Reporting System collects reports of safety incidents and makes them available to researchers. They also publish a monthly newsletter that helps pilots understand the errors and near misses reported by their peers, most of whom they will likely never meet. The newsletter is a powerful tool to heighten awareness of potential risks.

NASA codes ASRS reports to help understand emerging trends in safety. In the same way, managers should harness the power of big data whenever possible. Data on near misses and other variables, such as overtime hours, on-time deliveries, and defect rates, can help managers predict what plants or groups are operating effectively, and which might be on the path to serious issues. But most traditional approaches to EHSQ management silo data across discrete software packages, making it difficult to combine the data into a meaningful picture. And even if EHSQ managers can leverage data across silos, they don't necessarily have enough data or specialist personnel to build a useful predictive metric.

An ideal approach to learning from near misses would incorporate information from several sources: from a firm's own EHSQ data, from incidents that occurred in other firms, and even from issues that arise in other industries. But most EHSQ professionals don't have access to such data sources and, without specialized help, lack the robust analytics needed to unlock their value.

Figure 2

Leading risk indicators can provide valuable insights into the risk of different sites or operations. Such indicators should be built using an analytical approach that integrates near misses and other variables with measured outcomes. Ideally, the model itself should be developed using data gathered from different sites, divisions, companies, and even industries. Then, using the data from a specific site or operation, the model is used to generate risk indicators for outcomes like worker injuries or quality issues.
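To make this kind of analysis concrete, here is a minimal sketch of one way a leading risk indicator could be built: a simple logistic regression that relates near-miss counts, overtime hours, and defect rates to whether an injury was recorded, and then scores new sites. All of the data, feature choices, and scaling constants below are hypothetical illustrations rather than a recommended model; a real indicator would require far more data, validation, and domain expertise.

```python
import math

# Hypothetical monthly observations per site:
# (near_misses, overtime_hours, defect_rate) -> injury occurred (1) or not (0).
# In practice these would come from pooled EHSQ data across sites and firms.
DATA = [
    ((2, 80, 0.01), 0), ((3, 95, 0.02), 0), ((1, 60, 0.01), 0),
    ((8, 160, 0.06), 1), ((9, 170, 0.07), 1), ((7, 150, 0.05), 1),
    ((4, 100, 0.02), 0), ((10, 180, 0.08), 1), ((2, 70, 0.01), 0),
    ((6, 140, 0.05), 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.01, epochs=5000):
    """Fit a logistic regression by plain stochastic gradient descent."""
    # Rough feature scaling so a single learning rate works for all inputs.
    scaled = [([x[0] / 10, x[1] / 200, x[2] / 0.1], y) for x, y in data]
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for xs, y in scaled:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, xs)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, xs)]
            b -= lr * err
    return w, b

def risk_score(w, b, near_misses, overtime, defect_rate):
    """Probability-like leading indicator for a site's injury risk."""
    xs = [near_misses / 10, overtime / 200, defect_rate / 0.1]
    return sigmoid(sum(wi * xi for wi, xi in zip(w, xs)) + b)

w, b = train(DATA)
# Score two hypothetical sites: a quiet one and one drifting toward trouble.
low = risk_score(w, b, near_misses=2, overtime=75, defect_rate=0.01)
high = risk_score(w, b, near_misses=9, overtime=175, defect_rate=0.07)
print(f"low-signal site risk:  {low:.2f}")
print(f"high-signal site risk: {high:.2f}")
```

With a score like this in hand, managers can rank sites by predicted risk and investigate the high scorers before an injury or quality failure actually occurs.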

Using Structured Decision Tools

Structured decision tools (cognitive aids and organizational practices that reduce the influence of decision biases) complement leading indicators and design-based interventions. By directly combatting decision biases, these tools reduce the frequency of errors that trigger escalating failures in complex systems. They also help us avoid underestimating the risks that come from hard-to-observe interactions within our systems.

Checklists, for example, help compensate for the limits of human memory and attention by listing explicit steps for safety and product inspections. Pre-determined criteria for risks and quality help us avoid confirmation bias and plan continuation bias by calling attention to lapses in quality or increasingly risky practices. Likewise, running a so-called premortem exercise (a technique that asks team members to imagine that a major failure has already occurred and to identify its underlying causes) reduces overconfidence and opens our thinking to complex risks that we might otherwise ignore. And teams composed of internal and external experts not directly involved in a given project, like the Risk Review Boards at NASA's Jet Propulsion Laboratory, can catch insiders' decision errors and ensure that unacceptable risks are challenged and mitigated. (See the sidebar for a list of resources on structured decision tools.)

Structured Decision Tools

Quality control for your big decisions: A 12-question checklist
Kahneman, D., D. Lovallo, and O. Sibony. 2011. "Before You Make That Big Decision." Harvard Business Review, 89(6): 50-60.

The premortem: A technique to unearth hidden risks
Klein, G. 2007. "Performing a Project Premortem." Harvard Business Review, 85(9): 18-19.

SPIES: A simple tool for making better forecasts
Moore, D., and U. Haran. 2014. "A Simple Tool for Making Better Forecasts." Harvard Business Review (online).

An excellent introduction to the biases that affect managerial decisions
Bazerman, M. H., and D. Moore. 2013. Judgment in Managerial Decision Making. Wiley (8th edition).

Step Three: Use Practical Tools to Implement Change

The mere presence of systems for identifying, tracking, and fixing problems, however, is not enough. Research shows that the failure to leverage the value of learning from weak signals of failure is rooted in an organization's culture.18

Many things stand in the way of leaders who hope to drive meaningful change in the EHSQ realm. First, employees are generally skeptical of change and often see such initiatives as transient efforts that will pass. They resist change, particularly top-down efforts. And merely directing people to report anomalies or change their routines isn't enough. Hours of speeches and meetings can't compare with the effectiveness of a set of practical, concrete tools that leverage the intrinsic motivations of employees.

When the values an organization articulates as a cornerstone of its culture diverge from the behavior of its leaders and managers, it is nearly impossible to create a positive climate. Few actions do more to devastate an organization's climate than silencing employees' concerns or writing off their observations as irrelevant. Managers can't just say that they value speaking up; they have to demonstrate their commitment to safety and quality.

Wells Fargo, for example, recently paid a fine of nearly $200 million and fired over five thousand low-level employees for opening unauthorized customer accounts. The bank had set aggressive targets for its bankers to cross-sell different products to existing customers. Yet despite the relevance of three of Wells Fargo's stated values ("People as a competitive advantage," "Ethics," and "What's right for customers"), managers retaliated against employees who spoke out about the unrealistic sales targets and fraudulent practices, often by firing them in a way that prevented them from securing future jobs in the financial sector.19

Fortunately, by using practical solutions that align with what actually motivates people, leaders can demonstrate their commitment to a positive EHSQ climate. Consider a system that captures and analyzes weak signals of failure. Instead of a paper suggestion box or a toll-free number where an employee might leave a voicemail (and possibly never get a response), a team might use a solution that takes human nature into account. A mobile app, for example, can capture data quickly and effectively, and in many settings employees already carry their cell phones with them. By using such a system, managers can also demonstrate that they're closing the loop on problems by notifying employees that they're gathering more information or that a fix is in the works. Even better, managers might gamify near-miss reporting by keeping a leaderboard to highlight active reporters and randomly giving out bite-sized rewards (like small gift certificates) to individuals or teams working to strengthen the organizational climate.

Whether solutions like these take hold in an organization depends on how leaders treat those who speak up. When employees raise a plausible and potentially serious concern but turn out to be wrong, are they scolded or praised? As Bob Wachter, the UCSF doctor who analyzed Pablo Garcia's case, put it, "The measure of a safe organization is not whether a person who makes a great catch gets a thank-you note from the CEO. Rather, it's whether the person who decides to stop the line still receives that note . . . when there wasn't an error."20

Consider the case of the US Navy's nuclear-powered supercarrier, the USS Carl Vinson, which measures over 300 meters in length, displaces over 100,000 tons of water, and runs some of the most complex and dangerous aircraft operations in the world. A veritable floating city, it has a crew of over six thousand, including pilots, cooks, sailors, and, of course, the deck crews that launch and recover aircraft while the ship moves at over 30 knots on the open sea.

One day, during flight operations, a seaman told the Carl Vinson's Air Boss, the officer in charge of flight operations, that he had lost a tool somewhere on the deck. In the parlance of air operations, the lost tool "fouled" the deck, so the Air Boss immediately halted all landings and diverted dozens of planes to land-based runways. After a meticulous deck search, the crew eventually found the lost tool.

Losing a tool on an aircraft carrier has potentially fatal consequences (as it could destroy the engine of a landing aircraft) and high costs (it is expensive to divert planes and interrupt fleet operations). So what was the fate of the careless seaman? Was he chewed out by his superiors and transferred to a posting in the frigid wilds of Alaska? Quite the opposite: the next day, his commanding officer held a formal ceremony on the carrier's deck to celebrate the seaman for his bravery in reporting the missing tool.

By systematically capturing information in a way that resonates with people's routines, managers can get good data on who is aligned with organizational goals. And by publicly celebrating the results, they can reinforce safety, quality, and environmental best practices in a positive and consistent way. The good news is that you don't need to be the captain of an aircraft carrier to make a strong EHSQ culture an organizational centerpiece.

Simply shrinking traditional EHSQ tools (for example, by merely creating mobile versions of paper forms) is not enough to answer today's challenges. The real opportunity lies in using new technologies to change mindsets from "It's not my job" to "It's everyone's job." A native app, available to every worker, can help build a proactive, forward-looking, system-level approach to managing EHSQ risks. Such an approach harnesses valuable input from employees and contractors about near misses, hazards, unsafe behaviors, and quality concerns. But it also closes the loop on learning and makes line employees the main beneficiaries of just-in-time safety and quality guidance in their work environments. And it's even better when those insights come not just from one siloed site within a firm, but from benchmarks and data on best practices across entire industries.
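The reporting loop described above (capture reports, close the loop with visible status updates, and gamify participation with a leaderboard and random rewards) can be sketched in a few lines. The data model, statuses, and reward logic below are illustrative assumptions, not a description of any particular product.

```python
import collections
import dataclasses
import random

@dataclasses.dataclass
class Report:
    """A single near-miss or hazard report (illustrative fields)."""
    reporter: str
    description: str
    status: str = "received"  # received -> investigating -> resolved

class NearMissLog:
    def __init__(self):
        self.reports = []

    def submit(self, reporter, description):
        report = Report(reporter, description)
        self.reports.append(report)
        return report

    def update(self, report, status):
        """Close the loop: every status change is made visible to the reporter."""
        report.status = status
        print(f"Notify {report.reporter}: '{report.description}' is now {status}")

    def leaderboard(self):
        """Rank reporters by volume to highlight active participants."""
        counts = collections.Counter(r.reporter for r in self.reports)
        return counts.most_common()

    def pick_reward_winner(self, rng=random):
        """Random bite-sized rewards: every report is a lottery ticket."""
        return rng.choice(self.reports).reporter if self.reports else None

log = NearMissLog()
log.submit("alice", "tool left near press")
floor_report = log.submit("bob", "slippery floor in bay 2")
log.submit("alice", "guard rail loose")
log.update(floor_report, "investigating")
log.update(floor_report, "resolved")
print(log.leaderboard())  # alice has 2 reports, bob has 1
```

Even a toy model like this makes the cultural point concrete: the reporter hears back at every step, and participation, not blame, is what gets surfaced and rewarded.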

Knowledge Brokering

Knowledge Brokers as External Partners

Organizations have traditionally pursued the above goals (learning from weak signals, developing data-driven solutions, and driving sustainable change) on their own. But the results are often disappointing. Within a single firm, there is only a limited amount of available data on weak signals, solutions, and change initiatives. A firm can learn from its own experiences, but it cannot draw on others' data. It has a small-sample problem. Companies can overcome this problem by working with a knowledge broker: an organization that spans many otherwise disconnected industries and uses this position to synthesize data and insights across the boundaries of different sectors, markets, and firms. There is abundant evidence that knowledge brokers are in a unique position to recognize how existing knowledge in one industry might be used to create breakthroughs in another. From inventor Thomas Edison's groundbreaking research lab in Menlo Park to the powerhouse design-consulting firm IDEO, knowledge brokers turn insights from one domain into solutions in other industries.

Since its founding in the late 1970s, for example, IDEO has developed thousands of successful products by moving between over 40 different industries that are usually isolated from one another. IDEO designed a new water bottle by incorporating a leak-proof nozzle based on an existing solution used for shampoo bottles. It adapted a tracking mechanism for the original Apple computer mouse from the giant trackballs used in video game machines. And it created a surgical skin stapler by incorporating ideas from model airplane engines and office staplers.22 As CEO David Kelley put it, "Working with companies in such dissimilar industries as medical instruments, furniture, toys, and computers has given us a broad view of the latest technologies available and has taught us how to do quality product development and how to do it quickly and efficiently."

Because a knowledge broker stands at the intersection of distinct domains, it benefits from what sociologist Ron Burt called a "vision advantage": it taps into distinct, non-redundant streams of information.23 Because a knowledge broker spans otherwise disconnected firms and industries, it can access much more data, and much richer data, than a single firm that monitors only its own processes and outcomes. By spanning industries from mining to health care, and from retail to manufacturing, a knowledge broker can access a broad diversity of data about weak signals of failure, leading indicators of problems, and validated solutions. And rather than reinventing the wheel that already exists in another industry, it can recombine existing ideas into new solutions that work across multiple industries. Thus knowledge brokers can help firms move from simply measuring their internal processes to implementing novel and broadly validated solutions.

Beyond their vision advantage, knowledge brokers have another crucial benefit: they are independent and not too deeply embedded in any given industry. Research shows that such an external, impartial position allows knowledge brokers to learn more effectively and to analyze data and propose solutions more objectively.

Big data is helpful but not itself a solution to complex problems. And cognitive biases and dysfunctional group norms can prevent organizations from effectively using even the richest and most reliable data, as happened with the public health officials who monitored water quality in Flint, Michigan. Third-party knowledge brokers, however, are not bound by the most convenient conclusions or ones that conform to taken-for-granted industry norms and managers' prior beliefs. Their independent position allows them to follow the data wherever it might take them and avoid the decision biases that otherwise creep into our analysis. Thus, partnering with a knowledge broker can give a firm not only better data but also a more objective analysis, one that can stop costly problems in their tracks, before they impact the bottom line or show up on the front page.

Internal Knowledge Brokers

The principle of knowledge brokering also applies to learning within a company. In a complex organization, information and ideas are scattered across different teams and departments and often remain stuck in silos. Though different functions develop distinct perspectives and solutions, these may never come into contact with one another in a productive way. Unless, of course, the firm itself has knowledge brokers who can span traditional organizational boundaries and learn from the otherwise separate silos. Knowledge brokers can select, and then synthesize, the best ideas from a wide range of alternative perspectives in an organization. And that's how new solutions emerge. Research in a large American electronics company, for example, has shown that the best ideas for improving the firm's supply chain came from managers who had regular discussions with people in different silos. Conversely, managers who didn't cross boundaries tended to come up with unhelpful ideas: narrow, nitpicky complaints and vague proposals that didn't solve real problems. Good ideas came from knowledge brokers.24

But in this case, as in many other organizations, the knowledge brokers didn't have a formal mandate, and their role wasn't organizationally supported. They were just regular supply chain managers whose personal networks happened to span disparate groups. Yet their success in generating solutions suggests there is an opportunity for organizations to facilitate learning by appointing a high-level executive as Chief Knowledge Broker: someone whose job is to span boundaries between silos and then synthesize disparate perspectives into solutions for the entire organization. In the EHSQ domain, which often requires cross-functional solutions and monitoring, creating such positions might be especially valuable.
Managing Complexity: Challenge and Opportunity

Taking the above steps to address the risks of complexity is more than just an EHSQ risk mitigation program. These steps can help executives accomplish several objectives beyond more effective risk management.

Developing an organizational climate that empowers employees to report and openly share their concerns, for example, is not only a helpful safeguard against operational risks but also a catalyst for innovation. For instance, a multi-year research study (code-named Project Aristotle) examined hundreds of teams at Google and found that a key factor distinguishing the most creative teams was that their members felt they could speak up and express doubts or disagreements without the threat of rejection or punishment.25

Likewise, decades of research shows that a knowledge broker's position at the intersection of otherwise disconnected domains is a critical source of innovative ideas.26 And a recent McKinsey & Company study of more than 1,000 strategic decisions found that techniques to reduce decision biases mitigate risk and increase investment returns.27 Research shows that the approaches we outline for managing EHSQ risks can also improve teams, organizations, and decision making more broadly. Organizations that adopt these approaches will become better at learning, innovation, and decision making, thus increasing their opportunities for growth even as they reduce their exposure to systemic risks.

Further Reading

How paying attention to failure can pay dividends
Tinsley, C. H., R. L. Dillon, and P. M. Madsen. 2011. "How to Avoid Catastrophe." Harvard Business Review, 89(4): 90-97. http://bit.ly/1HWqPf7

What can mountain climbing teach us about air crashes and nuclear meltdowns?
Roberto, M. A. 2002. "Lessons from Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity." California Management Review, 45(1): 136-158. http://bit.ly/1HWshOw

A systemic view: The role of complex interactions and tight coupling
Perrow, C. 2011. Normal Accidents: Living with High-Risk Technologies. Princeton University Press. http://amzn.to/1JdQQrK

Building resilient organizations that effectively manage unexpected crises
Weick, K. E., and K. M. Sutcliffe. 2007. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. John Wiley & Sons. http://amzn.to/1JvMR6A

Ideas in practice: A tour of the nuclear weapons sausage factory
Schlosser, E. 2013. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. Penguin Group US. http://amzn.to/1OfHb7q

How small errors combine into major catastrophes
Weick, K. E. 1990. "The Vulnerable System: An Analysis of the Tenerife Air Disaster." Journal of Management, 16(3): 571-593. http://bit.ly/1OI6tpR

Author Biographies

ANDRÁS TILCSIK is an Assistant Professor of Strategic Management at the Rotman School of Management at the University of Toronto, where he developed and teaches the award-winning MBA elective Catastrophic Failure in Organizations. His research has been covered widely in media outlets, including The New York Times, The Economist, Forbes, The Washington Post, The New Yorker, and Bloomberg Businessweek, and has been cited in testimonies to committees of the U.S. Congress. His most recent research explores the factors that contribute to catastrophic failure in the banking sector. As a Fellow of the Michael Lee-Chin Institute for Corporate Citizenship, he is studying corporate practices and decision-making processes that reduce the risk and impact of environmental disasters. He is a graduate of Harvard University (Ph.D., A.M., and A.B.). With Chris Clearfield, he is a founding member of the Rethink Risk Project.

CHRIS CLEARFIELD is a principal at System Logic, a Seattle-based research and consulting firm that helps organizations manage the risk of system failure. Chris has written about failure and technology for the popular science magazine Nautilus, the Harvard Kennedy School Review, The Guardian, Forbes, Project Syndicate, and the Harvard Business Review blog, and in a memo to the U.S. House of Representatives. He has given talks at Columbia, Princeton, the New York City Office of Emergency Management, NASA's Jet Propulsion Laboratory, the Hilltop Institute, New York's Metropolitan Transportation Authority, and many other venues. Chris is a licensed commercial pilot and a graduate of Harvard University (A.B.), where he studied physics and biology.

End Notes

1. M. Beasley, B. Branson, and B. Hancock, "2015 Report on the Current State of Enterprise Risk Management: Update on Trends and Opportunities," North Carolina State Enterprise Risk Management Initiative, available at https://erm.ncsu.edu/library/article/current-state-erm-2015
2. United States Department of Labor, Occupational Safety and Health Administration, "Business Case for Health and Safety," https://www.osha.gov/dcsp/products/topics/businesscase/costs.html
3. Jackson, T. One Dollar in Seven: Scoping the Economics of Patient Safety. Ottawa, ON, Canada: Canadian Patient Safety Institute (2009).
4. Perrow, Charles. Normal Accidents: Living with High-Risk Technologies. Princeton University Press, 2011.
5. Clearfield, Christopher, András Tilcsik, and Benjamin Berman. "Preventing Crashes: Lessons for the SEC from the Airline Industry." Harvard Kennedy School Review (2015). http://bit.ly/1OfBaI7
6. Weick, Karl E. "The Vulnerable System: An Analysis of the Tenerife Air Disaster." Journal of Management 16, no. 3 (1990): 571-593.
7. Kahneman, Daniel. Thinking, Fast and Slow. Macmillan, 2011; and Bazerman, M. H., and D. Moore. Judgment in Managerial Decision Making (7th ed.). New York: John Wiley & Sons, 2008.
8. Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press, 1997.
9. M. Aschenbrenner and B. Biehl, "Improved Safety Through Improved Technical Measures? Empirical Studies Regarding Risk Compensation Processes in Relation to Anti-lock Braking Systems," in R.M. Trimpop and G.J.S. Wilde, editors, Challenges to Accident Prevention: The Issue of Risk Compensation Behavior (Groningen, the Netherlands: Styx Publications, 1994).
10. Edmondson, A. "Psychological Safety and Learning Behavior in Work Teams," Administrative Science Quarterly 44, no. 2 (1999): 350-383; and M.A. Roberto, "Lessons from Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity," California Management Review 45, no. 1 (2002): 136-158.
11. Tinsley, Catherine H., Robin L. Dillon, and Peter M. Madsen. "How to Avoid Catastrophe." Harvard Business Review 89, no. 4 (2011): 90-97.
12. Graham, B., W. K. Reilly, F. Beinecke, D. F. Boesch, T. D. Garcia, C. A. Murray, and F. Ulmer. Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling. Report to the President, National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling (2011).
13. K.M. Sutcliffe and M.K. Christianson, "Managing for the Unexpected," University of Michigan, Ross School of Business, Executive White Paper Series (2013), available at http://positiveorgs.bus.umich.edu/wp-content/uploads/managing_unexpected_sutcliffe.pdf (last accessed June 20, 2016).
14. Roberto, Michael A., Richard M.J. Bohmer, and Amy C. Edmondson. "Facing Ambiguous Threats." Harvard Business Review 84, no. 11 (2006): 106.
15. Duhigg, Charles. The Power of Habit: Why We Do What We Do in Life and Business. Random House, 2012.
16. Perrow, Charles. "Organizing to Reduce the Vulnerabilities of Complexity." Journal of Contingencies and Crisis Management 7, no. 3 (1999): 150-155.
17. Valukas, Anton R. Report to Board of Directors of General Motors Company Regarding Ignition Switch Recalls. Jenner & Block, Tech. Rep. (2014).
18. Dillon, Robin L., Catherine H. Tinsley, Peter M. Madsen, and Edward W. Rogers. "Organizational Correctives for Improving Recognition of Near-Miss Events." Journal of Management 42, no. 3 (2016): 671-697.
19. Arnold, Chris. "Fired Wells Fargo Employees Allege Attempts to Blow the Whistle." All Things Considered, National Public Radio, October 14, 2016.
20. Wachter, Robert. "How to Make Hospital Tech Much, Much Safer." Backchannel (2015). http://bit.ly/2g7r4uf
21. Landau, Martin, and Donald Chisholm. "The Arrogance of Optimism: Notes on Failure-Avoidance Management." Journal of Contingencies and Crisis Management 3, no. 2 (1995): 67-80.
22. Hargadon, Andrew, and Robert I. Sutton. "Technology Brokering and Innovation in a Product Development Firm." Administrative Science Quarterly (1997): 716-749.
23. Burt, Ronald S. "Structural Holes and Good Ideas." American Journal of Sociology 110, no. 2 (2004): 349-399.
24. Burt, Ronald S. "Structural Holes and Good Ideas." American Journal of Sociology 110, no. 2 (2004): 349-399.
25. Duhigg, C. 2016. "What Google Learned from Its Quest to Build the Perfect Team." The New York Times, https://www.nytimes.com/2016/02/28/magazine/what-google-learned-from-its-quest-to-build-the-perfect-team.html
26. Burt, Ronald S. Brokerage and Closure: An Introduction to Social Capital. Oxford University Press, 2005.
27. D. Lovallo and O. Sibony, "The Case for Behavioral Strategy," McKinsey Quarterly (March 2010): 30-43.


Copyright 2017
Intelex Technologies Inc.

intelex.com

1 877 932 3747


intelex@intelex.com
