
LA-UR-04-0385

Adversarial Safety Analysis:
Borrowing the Methods of Security Vulnerability Assessments

Roger G. Johnston, Ph.D., CPP
Vulnerability Assessment Team
Los Alamos National Laboratory

Abstract

Introduction: Safety and security share numerous attributes. The author, who heads the (Security)
Vulnerability Assessment Team at Los Alamos National Laboratory, therefore argues that
techniques used to optimize security might be useful for optimizing safety. Optimizing Security:
There are 3 main ways to attempt to improve security—security surveys, risk assessment (or
“design basis threat”), and vulnerability assessments. The last of these is usually the most effective.
Safety Analogs: Vulnerability assessment techniques used to improve security can be applied to
safety analysis—even though safety is not ordinarily viewed as having malicious adversaries (other
than hazards involving deliberate sabotage). Thinking like a malicious adversary can nevertheless
have benefits in identifying safety vulnerabilities. Suggestions: The attributes of an effective safety
vulnerability assessment are discussed, and recommendations are offered for how such an
adversarial assessment might work. Conclusion: A safety vulnerability assessment can potentially
provide new insights, a fresh and vivid perspective on safety hazards, and increased safety
awareness.

Introduction

Safety and security have a lot in common. They both deal with probabilities and risk, and are both
intrinsically preventative in focus. Both need to be dealt with in a proactive manner, but both often
end up (in the real world) being handled reactively—typically with considerable finger-pointing,
retaliation, recrimination, and hysteria after incidents occur, especially in large organizations. Both
safety and security are often viewed by employees as impediments to productivity. Both can be
seriously hampered by unimaginative managers, reluctant employees, poor communication,
organizational inertia, and excessive bureaucracy. Optimizing either safety or security requires
dealing with complex cost/benefit analyses, subtle matters of human and organizational psychology,
and difficult issues of how to set priorities. Poor implementation of either safety or security
measures can seriously impact an organization’s productivity, its economics and reputation, and the
well-being and morale of its employees.

We have conducted a large number of analyses of physical security in the Vulnerability Assessment
Team at Los Alamos National Laboratory (LANL, 2003). This paper raises the question of whether
the type of adversarial analysis we use for security vulnerability assessments might be useful for
analyzing safety vulnerabilities. The underlying idea is that techniques borrowed from one field can sometimes be useful in another, especially when the two fields share similar attributes.

Optimizing Security

There are traditionally 3 main ways to improve security:

1. Security Survey (Broder, 1999). This is a type of walk-around exercise. The security manager
wanders the spaces and looks for problems, often with a checklist in hand. Security surveys are
useful because they catch obvious mistakes, such as a hole in the fence, an unlocked door, or a
guard asleep at his/her station. Security surveys, however, do not usually result in profound
security improvements because they do not encourage creative thinking.

2. Risk Assessment, sometimes called “Design Basis Threat” (Garcia, 2001; Roper, 1999). In
simplistic terms, this involves security managers thinking about the bad things that could happen,
and then considering what they will do to mitigate those risks. Likelihood and Consequences are
considered, and Vulnerabilities are given relative priorities. This is a useful approach for security
but it often fails to result in dramatic security improvements. Why is this? In my experience, it is
because the security people doing the analysis are often unimaginative. They tend to focus only on
past security incidents, ignoring changing circumstances and unfamiliar rare-event risks that may be
far more dangerous. More serious, however, is the fact that they usually have entirely the wrong
mindset. The security risk assessors are thinking about things from the perspective of the "good
guys", i.e., people who desperately do not want there to be security problems. As a result—human
nature being what it is—security risk assessors often see what they want to see (that everything is
secure), not necessarily what they need to see.

3. Vulnerability Assessment (Johnston and Garcia, 2003). In a security vulnerability assessment, unlike the above techniques, we quit being the good guys and pretend to be the bad guys. This requires a significant mental coordinate transformation. We try to get into the heads of the bad guys, think like them, and eagerly look for security weaknesses and vulnerabilities to exploit. We actually want to be troublemakers in our assessments, unlike the non-evil (but unimaginative) security managers typically involved in security surveys and risk assessments. Because we want to find problems, we do.

Safety Analogs

In the field of safety, security techniques 1 and 2 above have obvious analogs. The standard safety
“walkaround” is similar to the security survey (#1). “What if?” safety exercises, or more formal safety risk assessments, are like #2. On the surface, however, there wouldn’t appear to be a good
match for #3 (vulnerability assessments) because there usually isn’t a nefarious adversary for safety
—ignoring deliberate sabotage. [Deliberate sabotage is more properly thought of as a security issue
rather than a safety matter. It is likely that most organizations underestimate or even ignore the
insider security threat (Johnston and Bremer Maerli, 2003).]

It may nevertheless be possible to have an adversarial vulnerability assessment for safety. The trick
is to quit thinking like people who don't want there to be safety incidents, and start thinking like
people (the “bad guys”) who wish for injuries, death, environmental harm, and damage to the
organization. With that mindset, new safety hazards may suddenly become apparent—or at least
we can think about safety from a fresh perspective.

Another potential advantage, at least initially, of this kind of backwards thinking about safety is the
novelty and shock value. This approach stands in stark contrast to the standard, insipid “think
safety” slogans used in most organizations. Many organizations also encourage employees to think
about “what if?” hazard scenarios. But it is psychologically quite different to mentally strive for
non-safety, to enthusiastically envision scenarios involving injury or death for ourselves or co-
workers. This is a much more proactive, dynamic, vivid, and personal approach to thinking about
safety vulnerabilities than waiting around for “what if?” questions to randomly pop into one’s head.

Moreover, as suggested in the Introduction, safety incidents often generate considerable political
and career damage to individual employees, supervisors, and managers. The motivation for our
imaginary evil bad guys might also include the desire to see a much admired and respected co-
worker, supervisor, or manager get in career trouble as a result of a safety incident.

An additional reason that this type of adversarial safety analysis may have psychological value to an
organization is that the existence of “bad guys”—even if imaginary—can help to unify employees
behind safety. Nothing unites people like a common enemy, even if imaginary.

Suggestions for Conducting an Adversarial Safety Vulnerability Assessment

An adversarial safety vulnerability assessment should involve first understanding the operations,
facilities, and employees that are being assessed. The next step is to identify potential safety
vulnerabilities through brainstorming and analysis. This is followed by evaluating and prioritizing
the potential vulnerabilities. Finally, we devise practical countermeasures to the safety
vulnerabilities.
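
The four steps above can be supported by very simple bookkeeping. The sketch below (in Python) is purely illustrative and not part of any prescribed methodology; the record fields, the 1-to-5 rating scales, and the prioritization rule are hypothetical choices for how identified vulnerabilities, their relative priorities, and proposed countermeasures might be captured.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SafetyVulnerability:
        """One potential safety vulnerability identified by the 'bad guy' assessors."""
        scenario: str        # how the imagined incident unfolds
        likelihood: int      # 1 (rare) to 5 (almost certain); hypothetical scale
        consequence: int     # 1 (minor) to 5 (fatality or major damage); hypothetical scale
        countermeasures: List[str] = field(default_factory=list)

        @property
        def priority(self) -> int:
            # One simple way to rank: likelihood times consequence.
            return self.likelihood * self.consequence

    def prioritize(vulnerabilities: List[SafetyVulnerability]) -> List[SafetyVulnerability]:
        """Order the brainstormed vulnerabilities so the highest-priority ones are addressed first."""
        return sorted(vulnerabilities, key=lambda v: v.priority, reverse=True)

    # Made-up example findings:
    findings = [
        SafetyVulnerability("Ladder used on a wet loading-dock floor", likelihood=4, consequence=3),
        SafetyVulnerability("Solvent stored next to a grinding station", likelihood=2, consequence=5),
    ]
    for v in prioritize(findings):
        print(v.priority, v.scenario)

Whatever form the bookkeeping takes, the point of the exercise remains the adversarial search for scenarios, not the arithmetic; the scores merely help decide which countermeasures to devise first.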

This process requires having the proper assessment personnel. Outsiders will often be useful since
they may have fewer conflicts of interest. [One of the reasons that security risk assessments are
often unsuccessful is that the people conducting the assessment are the same ones providing the
security services, and thus don’t want there to be security problems. After all, their egos,
reputations, and performance appraisals are on the line (Johnston and Garcia, 2003).] On the other
hand, outsiders may have a poor understanding of the realities and unique characteristics of a given
organization. In many cases, it might be prudent to form a safety vulnerability assessment team
consisting of both insiders and outsiders. The insiders must include some of the people conducting
the operations being evaluated.

The best assessment personnel will be clever, creative, hands-on people with a history of thinking
outside the box. Troublemakers, loophole finders, rule benders, smart alecks, renegades, and
hackers—the very people who should make us nervous with regard to daily safety (or security)
concerns—are exactly the types of individuals that should be part of the adversarial assessment
team. They will instinctively be able to spot hazards and potential mischief that other, less jaded
individuals miss.

In many cases, it will not be practical to assemble a formal adversarial vulnerability assessment
team. Instead, regular employees can be asked to assess their own working environment, but to do
so as “bad guys”. In getting employees to think like “bad guys”, organizations should exploit the
existence of any readily identifiable adversaries, such as a competing company or a troublesome
governmental auditing agency. Employees may find it much easier to think like bad guys if they
picture themselves as being these “villains”.

Employees engaged in adversarial safety vulnerability assessments must never be subject to retaliation (or fear that they might be) for finding potential safety problems. “Shooting the messenger” is a common problem for security vulnerability assessors (Johnston and Garcia, 2003); it must be avoided for safety assessments.

For an adversarial safety vulnerability assessment, we probably do not want to consider deliberate
sabotage by employees or outsiders. Sabotage is more appropriately thought of as a security issue,
rather than a safety concern. Thus, one employee deliberately hitting another over the head with a
pipe wrench (for example) is not a safety scenario that needs to be considered in this type of
assessment. Deliberately tampering with equipment is another act of sabotage that is more of a
security issue than a safety one.

In most cases, safety incidents caused by a single mistake or failure should be considered first,
followed by more complex scenarios that require multiple contingencies.
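
As a purely hypothetical illustration of this ordering (the scenarios and contingency counts below are invented), tagging each brainstormed scenario with the number of independent failures it requires lets the review proceed from simple to complex with a single sort:

    from typing import List, Tuple

    # Each entry: (scenario description, number of independent failures or contingencies required).
    scenarios: List[Tuple[str, int]] = [
        ("Forklift backs over a pedestrian at a blind corner", 1),
        ("Backup generator fails during an outage while the exhaust fan is already broken", 2),
        ("Unlabeled chemical container is mistaken for cleaning solution", 1),
        ("Interlock fails, the alarm battery is dead, and the area is unattended", 3),
    ]

    # Review single-mistake scenarios first, then progressively more complex ones.
    for description, contingencies in sorted(scenarios, key=lambda s: s[1]):
        print(f"{contingencies} contingency(ies): {description}")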

Note that in a security vulnerability assessment, the assessors attempt to envision (or even
demonstrate) concrete actions that bad guys can take in order to accomplish their nefarious
objectives. The bad guys in the proposed safety adversarial analysis, however, are more passive
(because we are leaving out deliberate sabotage), though just as malevolent. They are nefarious
observers who fervently hope for safety incidents to occur, for employees to get hurt or killed, and
for employees, managers, and supervisors to get in trouble as a result. The “bad guy” assessors
should gleefully attempt to identify possible ways these things might happen, but they do not
picture themselves actually taking deliberate actions to make safety incidents occur. That falls into
the category of sabotage.

It is particularly important not to misunderstand the word “adversarial”. It is one thing for safety
assessors to think like “bad guys” as part of a mental construct to assist in discovering safety
vulnerabilities. It is quite another matter for those same safety assessors to behave in a belligerent
manner, or to use the safety assessment process (or its resulting recommendations) as a weapon.
Attempts to unnecessarily stop or interfere with work, threaten and harass employees, institute
useless paperwork and bureaucracy, waste resources, or otherwise harm the organization are acts of
sabotage, not safety optimization.

Effective brainstorming is critical. The vulnerability assessors need to be encouraged to think creatively, even recklessly, and to have fun with their “villainous” analysis. Assessors must feel free to offer ideas (at least initially) without objections, criticisms, or value judgments from other team members. It should be permissible to consider safety incidents that involve, for example, flying monkeys, Elvis impersonators, or space aliens; doing so encourages unconventional thinking. Only at a later stage, when brainstorming is largely complete, will the possible scenarios need to be critically evaluated, then either dismissed or else modified into something more probable.

It is essential throughout the process to maintain enthusiasm for finding mechanisms that can cause
injury, death, trouble, destruction, and chaos. The goal is to think evil, not think safety. Success
means finding ways for safety to fail, not seeking to be reassured that everything is fine. Indeed, an
adversarial safety assessment that finds no new safety vulnerabilities is a waste of time. Safety
vulnerabilities always exist. Finding none simply means that the process has failed and should be
redone correctly, ideally with different personnel who will do the job more competently.

Assessors should be sure to consider the psychological status of employees in evaluating safety
vulnerabilities. Neither safety nor security will be optimal under conditions involving high stress
levels, widespread disgruntlement, and/or low employee morale (Johnston and Bremer Maerli, 2003).

The adversarial safety vulnerability assessment considered here requires a certain glib suspension of the serious way in which safety is traditionally treated. If managers are not careful, however,
this could be misinterpreted by employees. Employees need to be convinced that the organization
really does take safety seriously, and does not want employees to get hurt or employees to get in
trouble over safety incidents. It must be made clear that the adversarial safety assessment is a kind
of role-playing exercise (or tool) for putting people in a dramatically different mental framework in
hopes of gaining fresh insights into safety hazards.

Conclusion

This paper presents what may be an unconventional way to think about and to analyze safety. It
borrows from proven techniques for conducting effective security vulnerability assessments based
on thinking like a malicious adversary. While security is all about neutralizing adversaries, safety is
not usually thought of in those terms. Nevertheless, it can be argued that there may be some benefit
to thinking of safety from the perspective of a malevolent observer. If nothing else, rooting for
injuries, death, damage, and general mayhem provides a novel, even shocking way to think about
safety that has the potential for bringing fresh insight and enhanced safety awareness. It may also help employees to rally around safety by anthropomorphizing safety hazards in the persona of the “bad guys”.

Acknowledgment and Disclaimer

Janie Enter provided useful comments. The views expressed in this paper are those of the author
and should not necessarily be ascribed to Los Alamos National Laboratory or the United States
Department of Energy.

References

Broder, J. (1999). Risk analysis and the security survey. Boston, MA: Butterworth-Heinemann.

Garcia, M.L. (2001). The design and evaluation of physical protection systems. Boston, MA:
Butterworth-Heinemann.

Johnston, R.G. and Bremer Maerli, M. (2003). The negative consequences of ambiguous
‘safeguards’ terminology. Proceedings of the Institute for Nuclear Materials Management (INMM)
44th Annual Meeting, July 13-17, Phoenix, AZ.

Johnston, R.G. and Garcia, A.R.E. (2003). Effective vulnerability assessments for physical
security devices, systems, and programs. Österreich Militärische Zeitschrift (Austrian Military
Journal), Special Edition on Nuclear Material Protection, February 2003, 51-55.

LANL Vulnerability Assessment Team. (2003). VAT Home Page: http://pearl1.lanl.gov/seals/default.htm.

Roper, C. (1999). Risk assessment for security professionals. Boston, MA: Butterworth-Heinemann.
