Unintended
Consequences of
Information Technology in
Healthcare
A Review of the Literature
Christopher Kiess
5/4/2009
Table of Contents
Abstract
Introduction
Methods
    Table 1: Literature Search Methods
    Table 2: Literature Search Criteria
Background
    1.1 History of Unintended Consequences
    1.2 Unintended Consequences in Healthcare Introduction
    1.3 Developing a Taxonomy of Unintended Consequences
    Table 3: Types of Unintended Consequences
    Table 4: Extent and Importance of Unintended Consequences
    1.4 Increased Mortality as an Unintended Consequence
    1.5 Medication Errors as an Unintended Consequence
Conclusion
Abstract
Introduction
Heetebry, 2004; Motulsky, Winslade, Tamblyn, & Sicotte, 2008). Much has been
written concerning specific obstacles related to the implementation of health
information technology (HIT). These include staff resistance to implementation
(Ward, Stevens, Brentnall, & Briddon, 2008), communication problems between
the physician and patient as a result of technology (Makoul, Curry, & Tang, 2001;
Teutsch, 2003) and workarounds as a result of design (Halbesleben, D. S.
Wakefield, & B. J. Wakefield, 2008). These obstacles have been referred to a
number of times in the medical literature as "unintended consequences" (Ash,
Berg, & Coiera, 2004; Ash, Sittig, Campbell, Guappone, & Dykstra, 2006, 2007;
Ash, Sittig, Dykstra, Campbell, & Guappone, 2007, 2009; Ash, Sittig, Dykstra, et
al., 2007; Ash, Sittig, Poon, et al., 2007; Campbell, Sittig, Ash, Guappone, &
Dykstra, 2006; Harrison, Koppel, & Bar-Lev, 2007).
A recent study published in the New England Journal of Medicine cited
physician resistance as one of the top barriers to the implementation of
electronic-records systems in hospitals (Jha et al., 2009). Physician and staff
resistance are primary barriers to the implementation of technology (Ward et al.,
2008) and facilitators of unintended consequences in the form of workarounds
(Halbesleben et al., 2008). Unintended consequences and failures in the
implementation of technology in hospitals can, in part, be attributed to the lack
of a sociotechnical approach (Coiera, 2003, 2007; Ward et al., 2008).
This paper examines the unintended consequences of technology in the
healthcare environment through a review of the literature and provides
recommendations for moving forward in addressing the issues presented.
Methods
An extensive literature search was performed using both keywords and the
controlled vocabulary of each database to develop a body of literature
representing the primary areas of interest. Five primary databases were chosen,
and a set of keywords was developed for searching (see Table 1). Compound
keywords were searched as phrases, with field limits set to title and abstract in
all searches, broadening to wider search sets when satisfactory results were not
found. Compound searches were performed in which one search set was
coupled with a subsequent search using Boolean logic, exploiting the keyword
search for maximum results. Selected articles had their bibliographies mined for
further resources, which were then added to the initial set. Keyword adjustment
was an iterative process throughout the search. The controlled vocabulary of
each database was also exploited for all relevant terms, with subsequent mining
of the bibliographies of those resources.
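Purely as an illustration, the compound keyword strategy described above can be sketched in a few lines of code. The field tags and query syntax below are hypothetical placeholders, not the actual interfaces of the databases used in this review.

```python
# Sketch of the compound Boolean search strategy: each compound keyword
# is searched as an exact phrase, limited to the title and abstract
# fields, and the per-phrase search sets are then coupled with AND.
# Field names and syntax are illustrative assumptions only.

PHRASES = ["unintended consequences", "health information technology"]
FIELDS = ["title", "abstract"]

def phrase_clauses(phrase):
    """Build one field-limited clause per field for an exact phrase."""
    return [f'{field}:"{phrase}"' for field in FIELDS]

def compound_query(phrases):
    """Couple per-phrase search sets with Boolean logic: a phrase may
    match in title OR abstract, and all phrases must match (AND)."""
    sets = ["(" + " OR ".join(phrase_clauses(p)) + ")" for p in phrases]
    return " AND ".join(sets)

query = compound_query(PHRASES)
print(query)
```

If a query built this way returns too few results, the field limits can be dropped to broaden the search set, mirroring the iterative keyword adjustment described above.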
The criteria for selecting articles were divided by topic, with different criteria
applied to each (see Table 2). Because the literature in this area is relatively
sparse, the criteria were kept minimal to obtain maximum results and inclusion in
the review. Moreover, no criteria were set per technology; that is, no stipulations
were made as to whether the literature related to CDS, CPOE, EHR, EMR, HIS,
LIS, or IT in general. The interest of this review was unintended consequences,
and the general criterion was simply that the subject matter relate to technology
in healthcare settings. The literature search for unintended consequences
resulted in 14 articles being accepted for analysis, representing 7 studies and 4
reviews.
Background
based but rather an action taken when one or more alternatives existed. Under
Merton's theory, there can be positive or negative unintended consequences as
a result of our actions. Merton outlined five primary causes of unintended
consequences:
Prior to the work of Coiera and Ash, there was relatively little discussion of
unintended consequences in the healthcare literature (Ash et al., 2009; Bates et
al., 1999; Patterson, Cook, & Render, 2002). The earliest article found was a
1998 analysis of the benefits and detriments of electronic medical records
(Silverman, 1998). The article cites two primary problems with electronic medical
records: lack of privacy and the costs associated with implementation and
upkeep. This latter category is reported on in later works by Ash and her
colleagues (Campbell et al., 2006; Ash et al., 2009; Ash, Sittig, Poon, et al.,
2007; Ash, Sittig, Dykstra, et al., 2007). With the exception of the first sentence,
Silverman did not specifically use the term "unintended consequences" in his
analysis or maintain a primary focus on it.
Negative consequences resulting from new technology implementation are often
reported in the literature without the specific terminology "unintended
consequences" or a focus on the distinct sociologies related to those
consequences. Wachter used the term "unforeseen consequences" in his
general assessment of computerization in healthcare but maintained a focus on
quality and safety from an administrative viewpoint. McDonald described near
misses in an article outlining the potential hazards of bar-code administration in
patient misidentification. There are a number of articles correlating adverse drug
events with the computerization of processes (Han et al., 2005; McAlearney,
Chisolm, Schweikhart, Medow, & Kelleher, 2007; Nebeker, Hoffman, Weir,
Bennett, & Hurdle, 2005). However, these works have not discussed unintended
consequences as a concept in and of itself.
results largely in a set of descriptive categories. These categories are essential
for understanding the effects of technology, both good and bad. But there is
even less in the literature in the way of suggestions for avoiding these
consequences (Ash, Fournier, Stavri, & Dykstra, 2003; Ash et al., 2007; Coiera,
2007). It is imperative to continue researching and categorizing unintended
consequences as they occur in healthcare environments and to further our
exploration of suggested solutions.
Ash et al. first explored unintended consequences as part of research spanning
sites in the United States, Europe, and Australia. The results were the first
attempt at categorizing the errors resulting from the unintended consequences
of implementing technology. The original intent of the study was to gather
qualitative data in institutions using Patient Care Information Systems. In
gathering and analyzing the data, the observers began to discover patterns
indicating the possibility of errors occurring within these systems, or attitudes
reflecting this knowledge (Ash et al., 2004). This initiated a series of related
studies to both analyze the existing data from new perspectives and obtain more
data (Ash et al., 2004, 2006, 2007, 2009; Ash, Sittig, Dykstra, et al., 2007; Ash,
Sittig, Poon, et al., 2007; Campbell et al., 2006). The body of work Ash and her
colleagues have produced has provided insight into:
retrieval of information held in the system and errors in communication and
coordinating patient care (Ash et al., 2004). Both types of errors were further
broken down into subcategories. One described problems with the
human-computer interface (wrong-patient orders, juxtaposition errors) in systems
that had not been designed with a complex, "interruptive" environment in mind.
Hospitals are environments where interruptions are common, and systems must
be designed accordingly. Cognitive overload and shifts in cognitive patterns due
to restructuring of the charting process were another finding. Structuring the
information often forces the physician to enter comments in a certain way and in
a certain field. Physicians were found to be frustrated with pre-populated fields
that allowed no modification. Other works have shown that physicians cannot
troubleshoot and diagnose in the same fashion as before, since the information
is presented differently and thus interpreted differently (Harrison et al., 2007).
Ash et al. described this phenomenon as a "loss of overview," in which the
physician can no longer get the big picture. Misunderstanding the complexity of
the work was a third subcategory, described as seeing the work as completed in
linear fashion rather than as an interactive hub of activities. This can lead to
problems in the processes already in place, which are exacerbated by an
inflexible system. The fourth and final subcategory refers to the change in
communication patterns among workers. Entering an order in a system does not
effectively communicate that order to anyone other than whoever receives it on
the other end. This means a nurse working with a doctor may not know a
medication was ordered, or another physician could conceivably enter a
duplicate order. But the communication (or feedback) from the system can also
prove frustrating for the end user. Alerts, for example, can overwhelm the user
and be ineffective in prompting new or improved behavior. These categories
were eventually fleshed out further, with added analysis and data gathering, to
form a taxonomy representing nine types of unintended consequences (see
Table 3).
Table 3: Types of Unintended Consequences

More/New Work Issues: multiple passwords; responding to alerts; entering
required information or more detailed information; extra time
Workflow Issues: system "re-orders" the workflow; HCI problems;
inconsistencies between system and policy/procedures
Never Ending Demands: more space required for computers; persistent
upgrades; screen space not large enough; perpetual training; maintenance
types and the overall level of importance of each type to hospitals with
implemented CPOE. A total of 176 full interviews were conducted via telephone;
the results can be seen in Table 4. Rated highest in importance were system
demands, communication, and workflow issues; rated lowest were shifts in
power and new types of errors. Notably, there did not appear to be any
correlation between the length of time each hospital had owned its CPOE
system and unintended consequences. Ash et al. also noted that there were
both positive and negative unintended consequences involved in implementing
CPOE, and that hospitals can either work to avoid the negative unintended
consequences or simply accept them as part of developing a new system.
As part of the same series of studies, Ash and her colleagues were also able to
compile data on unintended consequences in Clinical Decision Support systems
(Ash et al., 2007). Two primary categories were derived from these data:
consequences related to the content of the system and consequences related to
its presentation. Those related to content were a shift in roles and
responsibilities, the currency of the content, and wrong or misleading content.
Those related to presentation included rigidity of the system (the inability to tailor
certain procedures or notifications), alert fatigue (developed from too many
alerts), and sources of potential error such as auto-complete fields and paper
routing issues. Three primary recommendations followed from this study's
analysis. To address the currency of the system and its content, it was
suggested that a knowledge management structure be developed with
interdisciplinary participation; in this way, a knowledge base can be built to
address many of the content-related problems. In relation to presentation
problems, the recommendations were twofold: implement a taxonomy designed
to mediate the number of automated alerts, and determine which fields need
structured data and which do not. The need to capture structured data often
results in rigidity in the input of information. When this structure is not needed, it
can be removed (or not added), allowing more flexibility for the end user.
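To make the alert-mediation recommendation concrete, one minimal sketch is a filter that always fires critical alerts but throttles lower-severity ones once recent alerts have saturated the user's attention. The severity tiers, window size, and budget below are illustrative assumptions, not drawn from any actual CDS product or from the study's taxonomy.

```python
# Minimal sketch of tiered alert mediation to curb alert fatigue.
# Severity tiers, thresholds, and names are illustrative assumptions.

from collections import deque

CRITICAL, WARNING, INFO = 3, 2, 1  # hypothetical severity taxonomy

class AlertMediator:
    def __init__(self, window=10, low_severity_budget=3):
        self.recent = deque(maxlen=window)   # last N fire/suppress decisions
        self.low_severity_budget = low_severity_budget

    def should_fire(self, severity):
        """Always fire critical alerts; suppress lower tiers once the
        recent window already contains the budgeted number of fired alerts."""
        if severity == CRITICAL:
            fire = True
        else:
            fire = sum(self.recent) < self.low_severity_budget
        self.recent.append(1 if fire else 0)
        return fire

mediator = AlertMediator(window=10, low_severity_budget=3)
results = [mediator.should_fire(INFO) for _ in range(5)]
print(results)  # first three INFO alerts fire, then throttling kicks in
```

The design choice mirrors the recommendation: rather than removing low-value alerts entirely, a severity taxonomy lets the system keep safety-critical interruptions intact while rationing the rest.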
would recognize the differences within organizations and ensure that future
informaticians are educated in these approaches.
Conclusion
The above represents a series of moving parts that must synchronize and work
towards a common goal. In designing systems, the complexity of a hospital is
often not appreciated, and it is evident we must adjust our approach in order to
build better systems and give our physicians and nurses tools that help rather
than hinder their work. If we suppose that a hospital is much like an ecology, we
might understand how best to approach the problems outlined in this review.
Island ecology is a subject that has fascinated scientists for years and has
been the subject of science writer David Quammen on a number of occasions
(Quammen, 1997, 1998). Island ecology is often equated with the term "insular
biogeography," and islands pose particular challenges in that species (both plant
and animal) are much more vulnerable to extinction because of their isolation
from other ecologies. The dodo is perhaps the most famous example of a
species that became extinct as a result of its island habitat. The same concept
has been explored in relation to the division of state parks in the United States.
The 20th century saw many state parks split to allow logging companies
passage or to make room for travel, and as a result species in those parks have
gone extinct. To cite an example, the Bridger Mountains and the Crazy
Mountains both lost their grizzly populations once they had been isolated by
development projects (Quammen, 1999). The same concept has occurred in the
Amazonian forest in what has been termed "ecosystem decay," a result of
fragmenting the forest (Laurance et al., 2002). The more complex and connected
an ecosystem is, the better its chance of survival. If we were to see hospitals as
complex ecosystems in which connections between humans and systems (as
well as between humans and humans) must remain intact, we would begin to
understand just how complex the system is and how our interventions can harm
rather than help.
Chaos theory has likewise proved useful in evaluating the environment of
organizations (Thiétart & Forgues, 1995); it is of value in understanding chaotic
environments where change occurs rapidly. Both the interconnectivity of
hospitals and their chaotic nature must be understood and addressed prior to
intervention.
A sociotechnical approach has been discussed in the literature in relation
to unintended consequences (Harrison, Henriksen, & Hughes, 2007; Harrison et
al., 2007) and is worthy of pursuit. History has shown that a frequent response
to problems with technology has been to develop new technologies to address
the existing issues (Norman & Dunaeff, 1994; Norman, 2002, 2007). However,
this patchwork approach ignores the social interaction between humans and
technology. To best develop these interactions, we must approach the problems
that are social in nature and understand how they affect the technology we have
developed, through ethnographies, interviews, data analysis, and observation.
We must then learn how to build systems in coalition with those who will use
them, through participatory design, collaboration in system development, and
iterative design processes.
The study of unintended consequences in healthcare technology is a new
field of exploration, represented by only a small number of studies. These
studies, however, are providing a taxonomy of both practical examples and
categories of the different types of unintended consequences. This provides a
foundation from which we can begin to solve the problems we see where
technology and humans interact. Research must continue in this field, and we
must move forward in developing systems to define and outline how we will
approach the implementation of technology from sociological, technical, and
sociotechnical perspectives.
References

Agrawal, A., & Mayo-Smith, M. F. (2004). Adherence to computerized clinical
Agrawal, A., & Wu, W. Y. (2009). Reducing medication errors and improving
Ammenwerth, E., Talmon, J., Ash, J. S., Bates, D. W., Beuscart-Zéphir, M.,
Ash, J. S., Berg, M., & Coiera, E. (2004). Some unintended consequences of
Ash, J. S., Fournier, L., Stavri, P. Z., & Dykstra, R. (2003). Principles for a 36-40.
Ash, J. S., Sittig, D. F., Campbell, E., Guappone, K., & Dykstra, R. H. (2006). An
Ash, J. S., Sittig, D. F., Campbell, E. M., Guappone, K. P., & Dykstra, R. H.
Ash, J. S., Sittig, D. F., Dykstra, R., Campbell, E., & Guappone, K. (2007).
Ash, J. S., Sittig, D. F., Dykstra, R., Campbell, E., & Guappone, K. (2009). The
Ash, J. S., Sittig, D. F., Dykstra, R. H., Guappone, K., Carpenter, J. D., &
Ash, J. S., Sittig, D. F., Poon, E. G., Guappone, K., Campbell, E., & Dykstra, R.
Bails, D., Clayton, K., Roy, K., & Cantor, M. N. (2008). Implementing online
Bates, D. W., Kuperman, G. J., Rittenberg, E., Teich, J. M., Fiskio, J., Ma'luf, N.,
Campbell, E. M., Sittig, D. F., Ash, J. S., Guappone, K. P., & Dykstra, R. H.
Chaudhry, B., Wang, J., Wu, S., Maglione, M., Mojica, W., Roth, E., et al. (2006). 742-52.
Publication. 103.
Cors, W. K. (n.d.). Physician executives must leap with the frog. Accountability
for safety and quality ultimately lie with the doctors in charge. Physician
Del Beccaro, M. A., Jeffries, H. E., Eisenberg, M. A., & Harry, E. D. (2006).
Computerized provider order entry implementation: no association with 295.
Doebbeling, B. N., Chou, A. F., & Tierney, W. M. (2006). Priorities and strategies
Han, Y. Y., Carcillo, J. A., Venkataraman, S. T., Clark, R. S. B., Watson, R. S.,
Harrison, M. I., Henriksen, K., & Hughes, R. G. (2007). Improving the health care
analysis. Journal of the American Medical Informatics Association: JAMIA,
14(5), 542-549.
Hatcher, M., & Heetebry, I. (2004). Information technology in the future of health
Institute of Medicine (U.S.). (2001). Crossing the Quality Chasm: A New Health
System for the 21st Century (p. 337). Washington, D.C.: National Academy
Press.
Jha, A. K., DesRoches, C. M., Campbell, E. G., Donelan, K., Rao, S. R., Ferris,
Laurance, W. F., Lovejoy, T. E., Vasconcelos, H. L., Bruna, E. M., Didham, R. K., 618.
Makoul, G., Curry, R. H., & Tang, P. C. (2001). The use of electronic medical
McAlearney, A. S., Chisolm, D. J., Schweikhart, S., Medow, M. A., & Kelleher, K.
(2007). The story behind the story: physician skepticism about relying on
Chicago Press.
Motulsky, A., Winslade, N., Tamblyn, R., & Sicotte, C. (2008). The impact of
Nebeker, J. R., Hoffman, J. M., Weir, C. R., Bennett, C. L., & Hurdle, J. F. (2005).
Norman, D., & Dunaeff, T. (1994). Things That Make Us Smart: Defending
Norman, D. A. (2005). Emotional Design: Why We Love (or Hate) Everyday
Patterson, E. S., Cook, R. I., & Render, M. L. (2002). Improving patient safety by
Shekelle, P. G., Morton, S. C., & Keeler, E. B. (2006). Costs and benefits of (132), 1.
Sittig, D. F., Ash, J. S., Zhang, J., Osheroff, J. A., & Shabot, M. M. (2006). 118(2), 797-801.
Smith, A. (1977). An Inquiry into the Nature and Causes of the Wealth of Nations
America, 87(5), 1115-1145.
To Err Is Human: Building a Safer Health System. (2000). (p. 287). Washington,
Ward, R., Stevens, C., Brentnall, P., & Briddon, J. (2008). The attitudes of health