Computer Systems Quality and Compliance discusses the quality and compliance
aspects of computer systems and aims to be useful to practitioners in these areas.
We intend this column to be a useful resource for daily work applications.
Reader comments, questions, and suggestions are needed to help us fulfill our
objective for this column. Case studies illustrating computer systems quality and
compliance issues by readers are most welcome. Please send your comments and
suggestions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com.
KEY POINTS
The following key points are discussed in this article:
- Systems design is the process or art of defining the architecture, components, modules, interfaces, and data for a system
- System design should consider the entire system lifecycle to properly manage costs and compliance
- System changes, maintenance, and future expansion or other organizational changes should be part of system design
- The role of quality is often compromised in system design in favor of project cost and timing
- Security issues, both external and internal, are an important consideration
- System designers must consider the needs of the quality area in system design and must actively solicit their input
- Quality unit personnel, in turn, must carefully consider their needs and clearly communicate these needs to system designers
- Do not underestimate the cost and time impact of even the smallest change.
INTRODUCTION
My six-year-old daughter is often fascinated by things that fascinate me. On the cover of a book that I had asked her to bring to me was a picture of a kettle with the spout and the handle on the same side. She studied the picture for a moment and then reported carefully, "That is not a very good design!" I was delighted by her discernment. It was easy for her to understand the intended use and to see that this kettle would not work very well.
How often do we fail to have these insights when designing GXP computer systems? More often than we'd like to admit. Pressures mount to do more with less, hit timelines, show return on investment, and meet commitments. These are all admirable things, and senior managers should push system designers and project managers to contribute to the business by thoughtfully executing against those mandates. At the same time, those very same project teams need to keep stakeholders informed about the technical debt they are accumulating. If teams make decisions to sacrifice quality or maintainability in order to meet those demands, technical debt is incurred. The payment on technical debt, like personal debt, has a cost that can be felt for a long time. The recurring costs of technical debt are far greater than the cost of addressing the issue now.
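The pay-now-or-pay-later tradeoff can be made concrete with a back-of-the-envelope model. The sketch below is illustrative only; the one-time fix cost, annual carrying cost, and system life are assumed figures, not numbers from this column.

```python
# Back-of-the-envelope comparison: pay down a piece of technical debt once,
# or keep paying its recurring "interest" for the life of the system.
# All figures below are illustrative assumptions, not data from this column.

def lifetime_cost(one_time_fix: float, annual_carrying_cost: float,
                  years: int, fix_now: bool) -> float:
    """Total cost over the system's life for either strategy."""
    if fix_now:
        return one_time_fix                  # pay once, nothing recurring
    return annual_carrying_cost * years      # pay the "interest" every year

FIX_NOW = 20_000   # assumed one-time cost to do it right (and revalidate once)
CARRYING = 15_000  # assumed yearly cost of living with the shortcut
LIFE = 5           # assumed system life in years

print(lifetime_cost(FIX_NOW, CARRYING, LIFE, fix_now=True))   # 20000
print(lifetime_cost(FIX_NOW, CARRYING, LIFE, fix_now=False))  # 75000
```

Under these assumed figures, the shortcut costs nearly four times as much over five years; the same arithmetic underlies the password and storage examples that follow.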
The more likely it is that changes to a system will occur, the more important it is to understand the long-term cost of those changes. Elements of a system that are subject to higher velocities of change are the best candidates for analysis. This column will explore some common tradeoffs that lead to technical debt.

FIX ONE, BREAK TWO
One small example that can lead to technical debt is hard coding a variable that, by its very name, we know will change over time, to save a few days' development time. This might be a password (a common security mistake) or some configuration setting like the name of a database server. It is easy to hard code such a thing to save time, but because the likelihood of change is high, the cost of this shortcut is high. This is true for two reasons. One is that a validation process must be re-executed; the other is the risk that something else might get inadvertently changed or that there is some unintended consequence. This is commonly called the "fix one, break two" syndrome. In short, it is a change that leads to technical debt.
A password mistake is a perfect example. Good security requires frequent changing of passwords. If a password is hard coded, then a new version of the software (called a release) is required to update the password. For a validated system, this will result in an even larger cost the organization will pay over and over again. If the organization does not change the password to avoid this cost, it has traded good information security practice to pay the technical debt and has also accepted a 21 CFR Part 11 compliance risk. Assuming the system has a reasonable life of five years, the technical debt per year of not making the password easy to change is either poor security and a compliance risk or the cost of two or more releases per year over five years. Besides the recurring costs of the releases, the organization will also assume the risks related to releasing and validating the application. Surely it would be more efficient to handle the password correctly in the first place. Pay now or pay a lot more later.

MANAGEMENT OF CHANGE
Understanding the concept of change velocity is important for any system, but even more so for validated systems. Specific strategies need to be in place for dealing with varying rates of change. What is the best way to manage these varying rates of change? What are the costs associated with the changes, and how should an organization manage them? Vendor software may move at one speed. Internally developed customizations probably move at another rate until the system matures, but may accelerate if business processes change. Microsoft patches its operating system monthly, commercial application vendors might patch quarterly, and an analytical chemist might not change a calculation for years; it makes sense to separate these. Often, in looking at production or deployment phase plans, there is a one-size-fits-all approach. This often leads to something that is impractical or worse.
Upfront planning to develop specific strategies to handle different change velocities, and to understand the risks associated with those changes, helps significantly in developing cost-effective plans that look at the system over time. Focusing on lifecycle cost planning will minimize the technical debt of the deployed application.

Changing A Password
For example, systems that have passwords that are used infrequently are going to result in passwords that expire or are forgotten by users. What is the strategy for managing this? Let the help desk do password resets manually by routing a ticket to the database administrator? That's the most expensive solution. Write a tool so that the help desk can do resets themselves? This is a better approach. Add a self-service feature in the application? This is the best approach. Knowing what to do requires some planning and time up front. Imagine a 1,000-user system and assume 30% of users will need one password reset a year (an optimistic estimate). Suppose each help desk call costs $50 by the time the security administrator changes the password, and the system is in use for five years. The organization will spend at least $75,000 on tickets alone. This is more than it would cost to implement a self-service "I forgot my password" feature. This model doesn't even consider any impact to the business, such as the inability to release a lot while an engineer is locked out, so the total technical debt could be much higher.

Changing A Storage System
Another example is the case of an electronic record storage system. Let's use some numbers to illustrate the point. To make the math easy, let's assume that a basic validated system costs $1 million and has a 10-year life. The team reports that they need an extra $100,000 to address an archiving feature, or the system will outgrow the storage system early in the system's expected life. The extra money is deemed too expensive. The project was already spending every dime, so the decision is to address it later. Over time, business needs change slightly as the business becomes paperless, and in five years the system is critically low on storage. A new project is proposed to add the archiving feature. Because this is a validated system and now contains five years of electronic records, it will take a full release and sufficient testing to show that the records are archived correctly. Let's say the team can do this for $500,000 and delivers it robustly on time. But now the last five years of the system depreciation cost twice as much. Would the $100,000 in initial project costs have been worth saving $400,000? This is the kind of technical debt that needs to be managed thoughtfully at the beginning.

SYSTEMS DESIGN
Systems design is the process or art of defining the architecture, components, modules, interfaces, and data for a system to satisfy specified requirements. Today, more than ever, system design must be cost effective. Today's economic conditions require full lifecycle cost to be factored into decisions. It is not uncommon for the maintenance phase to prove more costly than the implementation phase. The maintenance phase is often not considered or analyzed, but it is a counterforce to getting cost out of the business. The proverb of the frog sitting in water with the temperature slowly going from cool to boiling is a good reminder. The frog doesn't notice the heat because the rate of rise is slow, but in the end he is cooked. From the preceding examples we can clearly see that understanding and managing technical debt can have a profound impact on GXP computer systems and allow us to jump out while the water is cool.
Anytime we are asking the organization to pay more or take more time in the implementation phase, we have to articulate the value proposition. That proposition will be the benefit of addressing a lifecycle cost now vs. assuming the recurring cost and risk over time. Few teams are getting a blank check in today's environment. How does a team explain the value proposition? Some points are obvious; some are not so obvious. Most decision makers want to be rational and make wise decisions for their organizations. In order to support fact-based decision-making, teams must tally the technical debt and make sure that decision makers understand what they are buying on credit, a sort of fair-disclosure doctrine of GXP system development costs. It must be expressed in business terms, identifying clearly what the costs and the benefits are. Numbers and specific examples that support business decision-making are critical for influence. It cannot be expressed in technical geek-speak.

THE STOOL HAS FOUR LEGS
Yet another type of technical debt comes from assuming that the quality of a system is simply something that exists at some constant level. Often this happens when quality is assumed, taken off the table with statements like "we never compromise on quality." Traditional project management paradigms articulate that there are three legs (i.e., scope, resources, and time), but with a wink we all know there are really four: quality does not simply exist. Quality is often traded to make the other three. If teams and their sponsors agree right up front that quality is not a magic property that appears in a system, but is something that is designed in, then the stage is set for initial planning and subsequent discussions about trade-offs and tuning to ensure that all four variables have a place at the table. When quality is simply assumed, bad things can happen, and they usually show up in the form of technical debt.
In this author's experience, most organizations have strong formal and informal mechanisms to ensure project costs do not exceed the plan. And for good reason, as the system development community has accumulated few headlines for on-time, on-budget, on-scope, and on-quality success. Technical teams need to do a better job of expressing the quality trade-offs in business terms and identifying risk factors that the business can understand. Telling a business leader "we need more time to fine-tune the user interface or make usability changes" is hard to relate to a business impact. Stating that "there are data that suggest one in five users makes errors that could result in erroneous filings to a governing body, and here are the errors" is something that can be processed in the business risk management and review framework.
Thus, in order to have a fact-based dialog, decision makers need to be involved up front with competent system designers who understand both how to get things done and how to consider what the organization will pay over time. These pay-me-now-or-pay-me-later time bombs are not just measures of technical acumen. They are also indicators of business savvy. Business leaders need trusted technical leaders who can help get cost out of the business not just by excelling at technical execution, but also by understanding how to speak to the business.
If a team understands its customers, it can implement in a cost-effective way. For example, enabling users to add reports using validated features can avoid more costly-to-deliver and harder-to-schedule IT releases. In this author's experience, it is rare to see those trade-offs surface up front. Most senior business leaders would rather know up front that they'll get all the reports they asked for in the validated system, but that anything else will be another costly release. Most would like the chance to ask if there is a way to avoid those costly releases.
When designing for maintainability, the concept of change velocity comes up again. In this author's experience, there are many tightly coupled or interfaced systems that should be loosely coupled. Tight coupling occurs when one module or system relies on another module or system so strongly that a small change in one requires an implementation change in the other. The following is an example of tight system coupling: System A needs to view System B's records. To make things fast, the B team sends the A team source code from their system. A implements B's code and the organization is happy. Any time a user of A needs a B record, they can get it. Later, B adds another record type, and users of A still need to see it. But now both A and B have to release anytime there is a change. Not good. Pay a lot later.
What is the correct solution? B could have implemented a service for A: "show me a record." With a little thought, something as simple as show-me-the-record-for-this-ID could be implemented. Then A and B are loosely coupled, so one system can be changed without the need to change the other. The cost-effective paradigm is to make tight coupling rare. It might cost a little more up front, but it will save a lot later.
This can pay back in more ways than one. Not only can an organization avoid extra release costs, it can also improve uptime, as now only one system need be taken offline to make an upgrade.

PLANNING FOR THE FUTURE
Understanding how the user community is expected to change, and the probable impacts on electronic data, can have a dramatic impact on lifecycle costs. Does the system need to support a business acquisition plan? If so, this could dramatically affect the user count and make one design appropriate or inappropriate by altering scalability needs. Will more than one geographic location be using the system? If so, will data consolidation be required? Knowing the answers to questions like these may not only affect system architecture, vendor selection, and technology selection, it may also require the addition of a data warehouse to meet reporting needs. Often, fixing things like these later becomes massively expensive when compared to enabling the system for scalability up front. Often, senior leaders will make different choices if they have the data and facts to allow good decision support. Skipping these steps frequently leads to unanticipated costs and can undermine the technical team's credibility.
Security is often addressed as an afterthought. Sometimes teams work hard to get the system to work, then say, "let's make it secure." At this point it is too late. Security, like quality, needs to be designed in, and requirements should be stated clearly up front. The requirements need to be clear and related to risks. Often, GXP systems are closed systems on internal networks and not subject to skilled, determined attackers. But insider threats are real and the most prevalent. These threats run the gamut from disgruntled-employee sabotage, to someone correcting their mistakes to avoid reprimand, to misappropriation of intellectual property.
Some systems in the life sciences sector may also contain protected health information and may be subject to government regulation, most notably the Health Insurance Portability and Accountability Act. Understanding the risks, vulnerabilities, and countermeasures is important in system design, and it is most cost effective as part of design, as opposed to later. Failure to plan for this often creates expensive and time-consuming redaction programs.

IMPLICATIONS FOR COMPLIANCE
Compliance personnel should always be part of computer systems design activities, the fourth leg of the stool. They can provide valuable input regarding quality requirements that will minimize future costs and system downtime. When the quality area is overlooked, future changes to the system will surely be needed, and these future changes equate to additional costs, downtime, and potential problems affecting other systems. The quality area must also be mindful of the importance of its input. The quality area must carefully consider its needs and must clearly communicate these needs to the systems designers; do not underestimate the cost and time impact of even the smallest change.

CONCLUSION
Good software design is complex. These are just a few examples of how shorting the initial planning and implementation can result in significant downstream costs. Business owners of systems and budget decision makers should set clear expectations that, while certain budget and schedule goals are in place, system designers are expected to provide solid information related to lifecycle costs. That information can be used to reach the best decisions for managing technical debt and cost effectiveness.

ABOUT THE AUTHOR
Robert Smith is an application technical lead responsible for quality systems software development at Abbott Vascular. Prior to this, he was Sr. Director, Engineering at Symantec Corporation, where he was responsible for developing enterprise client, host, and server based corporate security products as well as the Symantec and Norton LiveUpdate offering. Robert has 25 years of software development experience, including VC start-ups funded by The Mayfield Fund, Granite Capital, and Wasatch Venture Fund, and holds CISSP and PMP credentials. Robert can be reached at robert.smithii@av.abbott.com.

Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau has 25 years of experience and increasing responsibility in the pharmaceutical and medical device industries, spanning the areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached at barbara.nollau@av.abbott.com.
SUMMARY
An illustrative incident at a pharmaceutical company, representative of actual events, is discussed. The incident involves software control of a drug dispensing system in pharmaceutical manufacturing. An error in the amount of drug weighed occurred. The investigation identified several problem areas. Lessons learned, areas of concern, questions to be asked, and actions to be taken are discussed.
INTRODUCTION
The following discusses an illustrative incident at Pharma154, a fictitious pharmaceutical company that makes the global commercial supply of Pinkoswill, a potent drug product. Because this drug product contains a potent active ingredient, weighing the correct amount of drug in the manufacturing process is critical.
Personnel involved in the incident include the following:
- Alex, vice president of regulatory affairs
- Bob, vice president of information technology
- Annie, software development manager
- Alicia, software contractor
- Sam, systems test lead
- Salli, system administrator
- Manufacturing engineers and operators.
While the incident, company, drug product, and personnel involved are
contrived, the following is representative of actual events for which the US
Food and Drug Administration has issued warning letters.
THE INCIDENT
"I need you here. Now!" exclaimed Alex, the VP of regulatory affairs at Pharma154.
"Alex, are you crazy? It's Sunday. It's 5:00 AM," slurred Bob, Pharma154's vice president of IT.
"Bob, listen, there are three reported hospitalizations tied to Pinkoswill. They are all in critical condition. Surveillance is coming in now; we think there may be others. We expect the FDA to be here Monday morning. This is serious," Alex explained coolly.
Bob started to wake up. "What does this have to do with IT anyway?"
Alex said, "We are not sure. Something has gone wrong. The labs say the dosage in the suspected lots is almost four times spec. We have got to figure this out."
"Look Alex, this is clearly some manufacturing problem. I have a life. If something points to IT, then call me. Otherwise, I have things to do. OK?"
They all wondered how that could be. Annie said, "I remember some problem a long time ago about boot order and the USB interface to the scale." They decided to reboot everything. They turned off the computer system and the USB hubs. Someone said, "Let's turn off everything." They did that too. Sam wondered aloud if there was some protocol for restarting.
One of the manufacturing engineers on the other lines offered to help. He told them the order in which to turn everything back on. They did, and now the software read 46.75 pounds, just like the scale. Sam said, "This is not good."
"Why?" asked Annie. "Everything is working fine now."
Sam said, "Let's just try a few things. What is this other USB cable for?"
The manufacturing engineer informed them that it controlled the hopper shape knife gate valve. They all laughed. "The what?" sang the software team almost in unison. The engineer explained, "It controls how much of each ingredient goes into the mixer. It opens until the right weight is in the mixer and then closes."
Alicia spoke up, "I wrote the code for that. The valve is closed. I send a command to open it, then when the weight rises above the spec, I send the close command."
"What happens if it stays open?" asked Sam. The manufacturing engineer explained that would ruin the batch, and the incorrect mix would be caught at the post-mixing weight station.
Sam pulled the bottle out of the pre-mix hopper and put it in the post-mix hopper. It weighed 46.75 pounds on the scale and the software. They all agreed that made sense.
Sam asked the engineer if he could unplug and re-plug the cables. "Sure," he told them, "the techs do that sometimes if the valves need maintenance." So Sam unplugged the USB-controlled hopper shape knife gate valve and plugged it back in. The room was very, very quiet.
The software displayed a strange error message. Salli commented, "That's odd. It says 'Unit test parameters exceeded, using default test values. Click OK to continue.' That's not any error message I have ever seen before. The wording makes it seem like some default or testing mode."
The engineer said, "We've seen that a few times after valve maintenance, but we usually reboot everything."
Sam clicked the OK button. The scale went blank, and then the software and the scale both reported 41.25 pounds. You could hear a pin drop.
Annie asked, "What goes in the mixer first?"
The engineer replied, "The active ingredient. We don't want to add anything else unless that weight is accurate. It cuts down on scrap. That stuff costs like a thousand times more than everything else that goes in. We got a process validated to reclaim it a few years back."
"So if the scale was doing what we see now, the valve would let in a lot of the drug?" asked Annie.
"Yes," the engineer replied. "That's why we weigh it a second time. Only the exact recipe will produce the correct post-mix weight. We have that down to a science."
Alicia was the first to see it. "The scale error is constant. Both scales were off by exactly the same amount."
And though they all thought it, Annie was the first to say it: "We have a serious problem. A real serious problem! We have got to tell Bob."
The team informed Bob of the situation, who then contacted the VP of regulatory affairs.
"Alex, this is Bob. We have a problem. My team found a situation. It appears that if there is some maintenance performed on the line, a real problem can occur. I am no chemist, but I think something like five pounds of extra drug might give some people a real bad day."
As would be expected, FDA investigated the Pharma154 situation. The FDA-483 the company received from FDA was not kind. A warning letter was expected to follow. The possible fines assessed could be astronomical. The lawsuits the company may incur will probably be worse.

INVESTIGATION
During the corrective action and preventive action (CAPA) investigation, the following items were documented by outside investigators:
- Software developers were not practicing version control. Software and associated source code files were not kept in a repository. This is in stark conflict with the International Society for Pharmaceutical Engineering (ISPE)'s GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. This lack of appropriate software version control was a direct contributor to the event
- The company lacked a formal procedure for deploying baselines from a controlled repository. This allowed personnel to retrieve software from a backup that was not controlled or cataloged, and then allowed the use of that software in a production system
- The lack of a software version control tool and corresponding processes allowed a unit test Dynamic Link Library (DLL), which is a way to deploy software so it can be used by other software, to be used in production. The unit test scale interface DLL was written in such a way that it provided its expected values if the scale encountered an error.

The investigators interviewed the former software developer. He reported that the manufacturing engineers and he were in dispute regarding the reliability of the scale firmware (firmware is software that has been committed to a chip in hardware). He believed the scale firmware was not in control. He reported his concerns and was told to work around the problem. He created code that simply ignored a malfunctioning scale and supplied the parent program with historical successful values. This allowed the system development to proceed without dependency on the scale. Evidence was found in various bug reports that this software engineer reported these problems. It appears, in part, that his release from the project was due to his reporting of poor controls.
The scale firmware was also not version-controlled. This allowed scales on the new line to have old firmware put into production. This firmware had a defect that, in certain conditions, like the ones triggered by hopper shape knife gate valve maintenance, caused the scale to recalibrate. The original developer attached the new firmware to the bug report, but that report was closed after his departure. Due to the lack of version control and formal procedures to control the validation and deployment process, incorrect and unsuitable versions were deployed.
The result of the inadequate software version control, deployment practices, and hardware/firmware version control was that approximately five pounds of the Pinkoswill active ingredient were added to the three affected lots. Company chemists and lab personnel acknowledge that this is, at a minimum, a serious overdose risk to patients.
The three lots were able to escape into the supply chain because the lot-sampling plan was incorrectly constructed due to a side effect of the test software. When in testing mode, the problem DLL did not send lot information to be included in the lot sampling plan. Although the lot sampling plan was a validated approach and relied on a risk-based analysis, that analysis did not identify any configuration management risks or failure of the scale system to properly function. The failure to identify and manage risks associated with configuration management fails to comply with the regulations.
The investigators noted that, per the regulations, the company had an obligation to prevent mix-ups. The lack of management controls and adherence to basic controls around software and firmware versioning fell below minimum standards for industry.

LESSONS LEARNED AND AREAS OF CONCERN
In most life-sciences organizations, management comes from scientific, sales, finance, or other non-software or system development backgrounds. As a result, these organizations often do not have adequate system development controls in place. There are also many times when organizations do not see themselves as needing to practice software and system development at anything more than "it seems to work." Where does your organization fall?
Software and systems have become pervasive in organizations, from controlling quality systems to production lines to devices instrumental in patient care. Failures in systems and associated controls can and do lead to patient risks. Does your company have adequate tools, controls, and management review?
Systems today are very complex. Much of the software and systems are assembled by contractors that often leave when the project ends. Is there a change control record? Is there a version history with an accounting of all the changes? This is extremely important. It is important to know when changes are made and why. In the story presented in this article, Alicia was given a piece of software. She did not know where it came from, who wrote it, why, or when. It was test software, but only the departed contractor knew that. Alicia had no knowledge of the bug in the scale firmware, and due to pressure, the working system was released with a test software component that simply reported to the parent program a weight it was programmed to return if the scale firmware had an error.

Software Version Control
If the Pharma154 Company had software version control and was using it properly, this scenario would have been prevented. Software version control provides key benefits that comply with good automated manufacturing practices (GAMP). These include the following:
- Frequent check-in and checkout (daily) of work. This provides clear visibility and accounting around who made changes and when
- Good process ties check-ins to a stimulus (i.e., requirement, work instruction, bug, or task)
- Labeling (i.e., production version, test version, development version)
- A central and controlled repository where all software or firmware is stored.

Computer Systems Do Not Always Work
Companies today need to recognize that computer systems range from your smartphone to lab equipment to manufacturing control systems. As these devices have become pervasive, there is a tendency to just assume they work and work together. In many cases, they do not. Bugs exist, incompatibilities exist, and often the formal structure of good version control and software/system best practices is not in place on internal projects. Some organizations confuse software development lifecycles (SDLCs) with software development best practices. However, most SDLCs are focused on an artifact trail to satisfy regulation rather than on ensuring best or essential practices are in place. Organizations need both: a sound SDLC that ensures key steps and artifacts are executed appropriately, and methods and procedures to ensure essential practices are in place and practiced.
This can be particularly true when non-software and system development professionals are running projects. Today, there are many tool kits from leading vendors that allow users with no formal training in system development to create powerful and complex systems; others in the organization then usually delight because "it works." However, there are real risks in life sciences if those systems get used for quality or manufacturing purposes, as bugs, version problems, or validation leakages (i.e., intended or actual use cases …
… hypothetical, it is representative of real life. There have been FDA warning letters issued for the lack of these very controls and processes. These are essential and foundational processes that every organization needs to make sure are in place and functioning to stay out of the headlines and away from 483s, warning letters,
that do not traverse the full validation cycle but end and recalls. GXP
up in use) may affect safety or efficacy of processes,
devices, or drugs. ARTICLE ACRONYM LISTING
DLL Dynamic Link Library
CONCERNS AND ACTIONS FDA US Food and Drug Administration
Companies should take a good look at the software and GAMP Good Automated Manufacturing Practice
firmware systems they have in place and what the as- IT Information Technology
sociated regulations are in regards to those systems. SDLC Software-Development-Lifecycles
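The version-control practices listed above lend themselves to simple automation. As an illustrative sketch only (the article names no tool, and the `REQ-`/`BUG-`/`TASK-` identifier formats are hypothetical), a check-in could be rejected unless its message cites the stimulus that prompted it:

```python
import re

# Hypothetical stimulus ID formats (requirement, bug, or task); the
# article requires a stimulus but does not prescribe a naming scheme.
STIMULUS_ID = re.compile(r"\b(REQ|BUG|TASK)-\d+\b")

def check_in_allowed(commit_message: str) -> bool:
    """Accept a check-in only if its message cites at least one
    stimulus, tying the change to a requirement, bug, or task."""
    return bool(STIMULUS_ID.search(commit_message))
```

Wired into a repository as a commit-message hook, a rule like this supports the clear accounting of who changed what, and why, that the GAMP-aligned list above calls for.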
Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. This column provides readers information regarding regulatory requirements for the validation and qualification of computerized systems.
Your questions, comments, and suggestions are required to fulfill the objective for this column. Case studies submitted by readers are welcome. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to coordinating editor Susan Haigney at shaigney@advanstar.com

ABOUT THE AUTHOR
Melvin F. (Frank) Houston is a senior validation consultant with EduQuest, Inc. of Washington, DC. He is a recognized authority on ISO 9000 Quality Standards and Quality System Regulation. Sharon Strause, the column coordinator, is a senior consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. For more author information, go to gxpandjvt.com/bios.

INTRODUCTION
What validation problems are you likely to see over and over? When tackling complex validation challenges, you'll save time, money, and headaches when you know the most common problems and where to find them.
The following analysis is based on validation work performed for a large US Food and Drug Administration-regulated company. The goal was to bring the company's software validation evidence up to the level of FDA's current expectations as well as those of the client's own independent auditor.
Our efforts yielded 1,720 observations. As part of a lessons-learned review, these observations were grouped into 22 different categories. The documents that most frequently contained the observations were identified. The results, in the author's experience, are typical of the problems most companies face.

APPLYING PARETO ANALYSIS TO COMMON VALIDATION PROBLEMS
Through Pareto analysis of the categories of problems, it was discovered that about 80% of the observations were clustered around nine types of deficiencies, as plotted in Figure 1. This case was an exception to the 80/20 rule, in that the top nine problem areas represented about 41% of the categories.

[Figure 1. Pareto chart of the most frequent observation categories, with counts and cumulative percentage.]

The following were the most frequent deficiencies found:
Missing information. Documents or records omitted fundamental information or content that should have been included.
Inconsistency. Documents contained statements inconsistent with other statements about the same topic in the same document or in the same validation package. What's more, no explanation or reason was given for the difference. Jargon, varying terminology, and contradictions in logic frequently caused these kinds of inconsistencies.
Lack of needed detail. This deficiency applied mostly to requirements documents. The requirements in the validation package did not adequately describe the characteristics of data, user interactions with business processes, or key processes internal to the software.
Traceability. We found three frequent traceability problems:
The traceability matrix did not account for a traceable specification or an observation step in a test script
The trace was broken. Either a requirement was barren (lacked descendants or a test) or one of the detailed requirements or test results was an orphan (lacked a parent somewhere in the requirement tree).
The traceability matrix was incomplete. Requirement details were not explicitly numbered and traced to associated test steps. Requirements were not traced at a detailed level, so the reviewer needed to infer the detailed links between specifications and steps in a test script.
Vague wording. Documents used generalities such as "in accordance to an approved procedure," "applicable regulatory requirements," or "all associated GXP and business processes." In addition, documents used vague words such as "may," "possibly," "more or less," and "approximately."
Unverifiable test results. Expected results were not described sufficiently so that an independent reviewer could compare and verify actual results. The IEEE Standard for Software Test Documentation, Std. 829-1998, Clause 6.2.4 (1) states, "...provide the exact value (with tolerances where appropriate) for each required output or feature." For executed scripts, actual results were not recorded or captured in a way that allowed an independent reviewer to compare them to expected results. For example, "OK" was noted in the actual-result column with no reference to a screen shot.
Good documentation practice (GDP). The following were three frequent good documentation practice problems:
Hand-recorded data and testing evidence, such as test results, were presented in a way that could cause doubts about their authenticity (e.g., cross-outs without initials, date, and reason)
Data that confirmed a specific requirement was hard to find in the evidence provided (e.g., a busy screen shot crammed with data)
Handwritten corrections were made that changed the sense of a requirement or an expected test result, but no discrepancy report or change request was filed (e.g., changing an expected result from indicator "Off" to "On"). In GDP, hand corrections are allowed without additional documentation only for obvious typographical errors, such as dropped or transposed letters (e.g., correcting "th" or "teh" to "the").
Incomplete testing. Test scripts did not fully or adequately test the associated requirement.
Ambiguity. Text could be interpreted more than one way, so it did not establish a single, unique requirement. The words "either" and "or" in a requirement are strong clues the text is ambiguous.

ADDITIONAL OBSERVATION CATEGORIES
Beyond these top nine categories, 13 other categories of observations were identified. These category definitions may seem to be somewhat subjective, but for this sort of analysis the objectivity of the definitions was less important than consistency in classifying the observations. For this reason, all the classifications were reviewed several times before locking in the data for the lessons-learned pivot tables. Even so, it was noted that between the "Ambiguous" and "Vague wording" classifications, many observations could have fit in either one.
The following additional categories of deficiencies (i.e., ones that did not rise to the level of our most common findings but were still worth noting) were identified:
Compound requirement. Requirements that were not unique; that is, the requirement statement actually stipulated two or more system characteristics. (When the predicate of a requirement sentence contains "and" or a series of commas, or when the requirement is presented as a compound sentence or series of bullets, it's probably a compound requirement. This deficiency was often coupled with traceability problems.)
For your information. Here, comments on the potential to improve a document or process were included. The issue that generated the comment may or may not have had an impact on a determination of substantial compliance. Remarks on particularly good examples of documentation or development practice were also included.
Incomplete requirements. Findings in this category fell into the following four subcategories:
The requirement in question implied another requirement, possibly complementary, that needed to be explicit to ensure verification
Regulatory impact analysis and risk assessment indicated a need for requirements that were missing from the user requirement specification (URS)
Requirements in a software requirements specification (SRS), a software design specification (SDS), or a configuration specification (CS) were not sufficient to address the associated URS item. This deficiency was often associated with a broken trace
System and business process analyses indicated the software had functionality that was used but had not been described in the URS
Rationale. Statements or assertions were made without supporting rationale or justification. Or, the rationale or justification for a particular statement or assertion was not persuasive.
Lack of acceptance criteria. Test and validation plans did not establish objective criteria based on the outcomes of various tasks in the validation process, such as vendor audit, testing, and problem resolution. The plans did not include criteria for assessing the seriousness of deviations as a basis for the overall evaluation and acceptance or rejection of the test and validation results.
Lack of process for resolving deviations. A plan, protocol, or script lacked a process for resolving deviations (e.g., failure to meet expected test results, discovery of unanticipated behavior, or deviations from GDPs).
Questionable statement. A statement appeared to be inaccurate or incorrect.
Redundant requirement. The same requirement appeared more than once in a specification document.
Topical inconsistency. The text within a topic pertained to a different topic.
Typo. Typographical errors were observed.
Unsupported deviation. The summary document omitted reporting on differences between planned activities and those that were actually carried out.
Not testable requirement. The requirement was not presented in objective, observable, or measurable terms. In other words, the requirement did not describe a system response or characteristic that a reasonable person could sense or measure.
Violation. The text set up or highlighted a violation of procedures or regulations.

These categories should be considered nothing more than suggestions or starting points to create a list of observations. As experience is gained, the list may need to be revised to cull out some categories and/or identify new ones.

Identifying the Most Vulnerable Documents and Records
Taking the next step to document the lessons learned from this project, the documents and records where the most frequent deficiencies were found were categorized. It was discovered that about 85% of findings were concentrated in six key documentation areas, as shown in Figure 2.

[Figure 2. Pareto chart of the documents and records containing the most frequent deficiencies, with cumulative percentage.]

The following were the top types of flawed documentation:
Specifications (including user requirements)
Test scripts
Validation plans
Test plans
Trace matrix
Test results.

Although the exact order of problem areas may differ in any individual organization, it's likely these same six documentation areas will float to the top. From the author's experience, specification documents are usually the biggest pitfall for most companies.

FEWER VALIDATION PROBLEMS AND INSPECTION SUCCESS GO HAND-IN-HAND
After auditing many companies, large and small, and participating in countless remediation projects, it was found that the results described in this article are typical of companies worldwide.
More importantly, the author has seen first-hand that companies that reduce the frequency of these problems with focused remediation efforts are much more likely to weather future FDA inspections. It can be reasonably assumed the same would be true if the frequency of such problems were low in the first place.
It is recommended that companies use these results and definitions to assess their own validation projects, or devise their own categories and charts to pinpoint the company's most common problems. Either way, you'll have a major head start in better allocating validation resources and making needed improvements quickly.

REFERENCES
1. IEEE, IEEE Standard for Software Test Documentation, Std 829-1998, 16 Dec 1998.

ARTICLE ACRONYM LISTING
CS Configuration Specification
FDA US Food and Drug Administration
GDP Good Documentation Practice
SDS Software Design Specification
URS User Requirement Specification
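The "barren requirement" and "orphan" findings described above are mechanical enough to detect automatically. The sketch below is a minimal illustration, assuming requirements and test steps are identified by simple string IDs (the ID scheme is hypothetical, not from the article):

```python
def trace_gaps(requirement_ids, test_ids, links):
    """Scan a traceability matrix for broken traces.

    links is an iterable of (requirement_id, test_id) pairs. Returns
    (barren, orphans): requirements with no linked test (barren, in
    the article's terms) and tests with no parent requirement (orphans).
    """
    reqs, tests = set(requirement_ids), set(test_ids)
    covered = {r for r, t in links if t in tests}   # reqs with a test
    parented = {t for r, t in links if r in reqs}   # tests with a parent
    return sorted(reqs - covered), sorted(tests - parented)
```

For example, with requirements R1 and R2, tests T1 and T2, and the single link (R1, T1), R2 is barren and T2 is an orphan; both would be flagged for the reviewer instead of left for inference.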
Accurately Identifying Your Requirements: Will Any Computer System Be Right for You?
Janis V. Olson
Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. This column presents information regarding regulatory requirements for the validation and qualification of computerized systems.
Your questions, comments, and suggestions are required to fulfill the objective for this column. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com

KEY POINTS
The following key points are discussed in this article:
A clear statement of requirements is fundamental to determining what you want and what you need
Write your requirements so they are unambiguous, complete, consistent, and testable
The quality of your computerized system will be a direct result of getting quality requirements written
All system users should have input into defining the requirements
Map the current process or processes the computerized system is designed to replace. Incorporate any regulatory, statutory, and/or standards requirements.
Optimize the process or processes you want to use
Write your intended uses and requirements for the system in terms of how you will be able to test that the requirements are satisfied
Write requirements for how the system should not work
Review all requirements with all levels of users.

INTRODUCTION
Requirements are the foundation for determining what you want and what you need. People, in general, do not write down their needs, wants, and intended uses of the things they buy. Some do extensive research by going shopping, reading information, or searching the Internet. Others buy the first thing that appears to meet their needs. Others buy what everyone else seems to have bought, thinking that if it meets other people's needs, it will satisfy them. Often, different people have different requirements and understanding of what is really needed. The only way to resolve the conflict when purchasing computer systems for regulated industries is through written requirements.
Writing requirements can be very difficult. Vague statements of goals and needs are often expressed. Statements like "user friendly," "easy to use," and "intuitive to the user" are often seen but rarely defined. Requirements must be written so they are unambiguous, complete, consistent, and testable.

DETERMINING THE REQUIREMENTS
The quality of your computerized system will be a direct result of getting quality requirements written. I have not used "user requirements" because those are only one part of all the requirements you need to document. Requirements should specify what the user and business need, not the abilities of the various products available.

ABOUT THE AUTHOR
Janis V. Olson (Halvorsen) is Senior Validation Consultant at EduQuest, Inc., a global team of FDA compliance experts. Sharon Strause, the column coordinator, is a Senior Consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. For more author information, go to gxpandjvt.com/bios.

...erized system. Scenarios are often easier for users to review to assure all of their needs are being met by the system. They will help you identify standard operating procedures that will need to be rewritten or written prior to performance qualification.

Requirements for How the System Should Not Work
Write requirements for how the system should not work. Ask the "What if?" question as many times as needed. Conduct a risk analysis for the system and identify mitigations for those risks. Mitigations for the risks identified become requirements of the system. The goal is to assure that the system will fail in a safe manner. Define a safe manner. "Safe" could mean that the data are not corrupted; that the data are checked for consistency prior to being accepted; the user receives a warning message and instructions on what to do next; the system flags the fields that have not been completed and are mandatory; etc. Again, develop scenarios for how the system will not behave and assure the scenarios are testable.

Review All the Requirements
The reviews should take place on multiple levels. The requirements must be reviewed to assure they are unambiguous, complete, consistent, and testable.
Unambiguous requirements are interpreted the same way by each person that reviews them. One company had requirements that appeared, on first reading, to be well written and unambiguous. However, the following were misinterpreted by the system developer:
Users will have user names and passwords to operate the system
Users will be operators, supervisors, or quality personnel.
The resulting system was designed so there were only three user names and passwords the system would accept, one for each type of user, not one for each user. Unfortunately, this was discovered during operational qualification and did not meet the intended needs of the company because it was planning on using electronic records. The company had to continue to use its manual batch history records.
Complete requirements cover all aspects of what the system will and will not do. The design of the system will determine what is done by hardware, software, and people following procedures. All the users should review the requirements to assure that all of them have been covered in the requirements document.
Consistent requirements do not conflict with one another. For example, one requirement stated that the user will enter the date when the complainant reported an issue. A second requirement stated that the computer will pre-populate the report date of the complaint with the date the complaint was entered in the system. The two requirements are inconsistent with one another. Neither in itself is wrong, but taken together, the two requirements cannot be fulfilled at the same time, and one must be changed.
Testable requirements can be tested singularly and together to determine if they are met. For example, stating that the user will enter complainant information into the system without defining the type of information is not testable. As long as any information is entered, no matter what it is, the test would pass, even if there is not enough information to respond to the complainant. Generally, ambiguous requirements are not testable.

SUMMARY
Because the quality of a company's computer system can directly depend upon the quality of the established user requirements, it is important to be as specific as possible when creating a list of written requirements. Requirements should include all user needs and regulatory and standards requirements. Written requirements should be clear, complete, consistent, and testable. Establishing these requirements before a system is purchased can save a company money in the long run.

REFERENCE
FDA, Title 21 Food and Drugs, Chapter I, Food and Drug Administration, Department of Health and Human Services, Subchapter A, General, Part 11 Electronic Records; Electronic Signatures, April 1, 2009. JVT
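The difference between a vague and a testable requirement can be made concrete in code. In the sketch below (the field names are hypothetical, not taken from the article), "the user will enter complainant information" becomes testable once the mandatory fields are spelled out, giving an objective pass/fail criterion any reviewer can repeat:

```python
# Hypothetical mandatory fields; the article's point is that the
# required information must be defined for the requirement to be testable.
MANDATORY_FIELDS = ("complainant_name", "contact", "date_reported")

def complaint_record_complete(record: dict) -> bool:
    """Pass only if every mandatory field is present and non-empty,
    an objective, repeatable check rather than "any information"."""
    return all(record.get(field) for field in MANDATORY_FIELDS)
```

Against this check, a record containing only a name fails, whereas under the vague wording it would have passed with any input at all.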
operations are automated. Computer systems have rapidly evolved, and industry and
regulatory guidance regarding their use has evolved as well.
This column addresses computer systems quality and compliance with real life
scenarios and challenges in mind. It is our intent to present these topics clearly and in
a meaningful way so that our readers will have a basic understanding of principles, and
then be able to apply these principles in their daily work applications.
Reader comments and suggestions are needed to help us fulfill our objective for
this column. Suggestions for future discussion topics or questions to be addressed are
requested. Case studies illustrating computer systems quality and compliance issues
by readers are also most welcome. We need your help to make Computer Systems
Quality and Compliance a useful resource. Please send your comments and sugges-
tions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com or journal
coordinating editor Susan Haigney at shaigney@advanstar.com.
SUMMARY
The following are key points that should be considered in computer systems
quality and compliance:
An evolution has occurred regarding thinking and terminology from soft-
ware validation to computer systems quality and compliance
Computer systems include software, hardware, operating system, technical
infrastructure, use and maintenance processes, and the people who use
the systems
Computer system quality and compliance includes all the activities
associated with acquiring or developing and deploying a system and then
maintaining it until eventual retirement
A true quality system builds quality in because it is the right thing to do,
not because we are obligated to do so, as obligation typically doesn't
foster the same level of commitment
Computer quality and compliance best practice is to apply quality prin-
ciples and practices with respect to all the elements of the computing
environment across all phases of the system life cycle
When systems or technology services are purchased from outside vendors,
the client company must gain assurance that the supplier has built quality
into the product they are selling
Building quality into the system results in systems that are reliable and
compliant.
any requirements to fulfill regulatory expectations or necessary quality controls and checkpoints should be included. Requirements should also be testable.
Design and build. In the design and build phases, any required standards should be followed, and system configuration and/or code should be adequately documented for traceability and ease of maintenance.
Test/validation. The test or validation phase is typically the phase associated with quality. However, this is just a confirming event meant to demonstrate the quality built in, and to assure sustainable quality operation of the system.
Maintenance. In the maintenance phase, practices such as change control and configuration management, problem reporting and resolution, and ongoing controlled operation of the system are all ways we sustain quality and the validated state of the system over time.
System retirement. Finally, at system retirement, planning and execution of decommissioning activities must also have quality built in, to ensure proper disposition and accessibility of data and records, controlled transitions to other systems when applicable, and a compliant decoupling of the retired system from the infrastructure and any interfacing systems.

Building quality into the system across all the components of the computing environment, and throughout all the phases of the system life cycle, results in systems that are reliable and are also compliant with today's regulatory expectations.

ABOUT THE AUTHOR
Ms. Nollau is a Director of Quality Services at Abbott Vascular, responsible for validations, reliability engineering, supplier quality, microbiology, and document management. Ms. Nollau has 25 years of experience and increasing responsibility in the pharmaceutical and medical device industry, spanning the areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached via e-mail at barbara.nollau@av.abbott.com.
Computer Systems
Change Control
Farhad Forozesh
Computer Systems Quality and Compliance discusses the quality and com-
pliance aspects of computer systems, and aims to be useful to practitioners in
these areas. We intend this column to be a useful resource for daily
work applications.
Reader comments, questions, and suggestions are needed to help us fulfill
our objective for this column. Please send your comments and suggestions to
column coordinator Barbara Nollau at barbara.nollau@av.abbott.com or journal
coordinating editor Susan Haigney at shaigney@advanstar.com.
KEY POINTS
In this issue of the column, the following key points are discussed:
Change control as good business practice
The importance of having a change control process in place
Regulatory compliance drivers for change control
Developing a change control procedure and process for computerized systems
Determining the level of re-testing required
Different types of change control and value in consistency.
INTRODUCTION
Change control is a common term describing the process of managing
how changes are introduced into a controlled system. Experts agree
that most problems of software and computer systems are introduced
when changes are made either during development or during use of the
systems. Change control is required to ensure that validated systems
remain under control even as they undergo changes.
Changes to the system are likely to disqualify the original valida-
tion if not performed and tracked carefully. Lack of documentation for
changes and testing after changes is one of the most frequently cited
deviations during internal or external audits. A robust change control
process must be in place to prevent unfavorable or non-compliant out-
comes as a result of change to systems.
ABOUT THE AUTHORS
Frank Houston is a senior validation consultant for EduQuest, Inc. His career includes digital design, clinical engineering, and biomedical engineering. Mr. Houston has done software quality auditing and consulting for clients of all sizes in both the medical device and pharmaceutical industries. Mark Weinglass is a senior validation consultant for EduQuest. He has over 25 years of professional experience in the design, development, and validation of computerized process instrumentation, control systems, medical devices, and related project management activities in the FDA-regulated industries. For more author information, go to gxpandjvt.com/bios.
Performs extensive and complicated data input Figure: Risk vs. criticality vs. complexity.
checking or control
P rocesses numerous types of transactions
Requires extensive support to maintain the system
Involves large numbers of users
Includes significant customization of a standard
software package through configuration or addi-
tion and modification of the source code.
R
i
CRITICALITY, COMPLEXITY, AND RISK s
As the Figure shows, criticality and complexity com- k
ity
bine somewhat like severity and probability do in
lex
risk assessment. In fact, this analysis is a good start-
mp
ing point for a systematic risk assessment. Take your
Criticalit
Co
number of Yes answers for criticality and multiply
it times the number for complexity, and the result
y
gives you a rough initial risk factor estimate to use
for planning your validation deliverables. sheet, risk factor 0 to 2, one should be able to do nearly
all the documentation needed within the spreadsheet
PLANNING THE VALIDATION itself with maybe one or two other documents or files
DELIVERABLES to cover change control and decommissioning.
It is important to remember that standard operating Validation records must cover the following:
procedure (SOP) documents are never optional, and Development or acquisition planning
plans should not be used as substitutes for SOPs. The Supplier assessment (up to and including supplier
following tasks must be addressed in SOP documents: audit)
Software acquisition, development, and User requirements
implementation
Risk assessment
Validation
Supplier assessment (including audits)
Change control
Design
Code review
Testing.

Ongoing risk assessment
Functional requirements
Design documentation
Design verification (including reviews)
Qualification of software implementation, including the following tests as needed: Installation, Operation, Performance
Traceability
Change control and maintenance of validation status.

To begin validation planning, consider the following questions.
In your system are there criticality issues with:
Patient safety?
Product quality?
Production operations (usability or efficiency, for example)?
In your system are there complexity issues with:
The work process?
The computer programs or the equipment to be used?
Staff familiarity with the programs or equipment?
Infrastructure changes needed?

Count up the number of Yes answers in each category and calculate a rough risk factor by multiplying them together. The risk factor calculation should result in a number between 0 and 12. The lower the number, the more documents you can combine. Use the risk factor number to set an initial goal for the number of documents or files to produce as evidence of validation. Remember, SOPs do not count as validation records.
Documents that combine easily include the following:
System development plan and validation plan
Requirements documents
Test plans, protocols, and associated reports
Test report and validation report
System implementation report and validation report
Installation qualification and operational qualification.

For the simplest, least critical systems (a simple spreadsheet, for example), a generalized validation procedure with a validation report form could be developed. These are rarely found in practice, but they should be used more often. Many validations are fairly routine activities, and they do not require extensive plans and reams of documentation.

CONCLUSION
With some careful homework and a few rules-of-thumb, one can cut validation effort down to size. Validate for Intended Use becomes easier with good planning and use of the criticality, complexity, and risk processes. JVT
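The criticality-and-complexity arithmetic described above can be sketched in a few lines of code. This is an illustrative sketch, not part of the original article; the question lists and the 0-to-12 scale come from the text, while the function and variable names are invented for the example.

```python
# Rough validation risk factor, as described in the article: count the
# Yes answers to the three criticality questions and the four
# complexity questions, then multiply the two counts together.

CRITICALITY_QUESTIONS = [
    "Patient safety?",
    "Product quality?",
    "Production operations (usability or efficiency, for example)?",
]

COMPLEXITY_QUESTIONS = [
    "The work process?",
    "The computer programs or the equipment to be used?",
    "Staff familiarity with the programs or equipment?",
    "Infrastructure changes needed?",
]

def risk_factor(criticality_yes: int, complexity_yes: int) -> int:
    """Multiply the Yes counts; the result falls between 0 and 12."""
    if not 0 <= criticality_yes <= len(CRITICALITY_QUESTIONS):
        raise ValueError("criticality count out of range")
    if not 0 <= complexity_yes <= len(COMPLEXITY_QUESTIONS):
        raise ValueError("complexity count out of range")
    return criticality_yes * complexity_yes

# The lower the number, the more validation documents can be combined.
print(risk_factor(1, 2))   # a simple system
print(risk_factor(3, 4))   # worst case: 12
```

A team could use the resulting number directly as the initial target for how many separate validation documents to produce, combining documents aggressively at the low end of the scale.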
Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. It brings information regarding regulatory requirements for the validation and qualification of computerized systems.
Reader questions, comments, and suggestions are needed to fulfill the objective for this column. Case studies illustrating principles submitted by readers are welcome. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com

ABOUT THE AUTHOR
Ms. Jae Burnett is a senior manager at Deloitte & Touche, LLP with 10 years' experience in the pharmaceutical, biotech, and medical device industries and extensive knowledge of required system controls and processes to comply with FDA regulations 21 CFR Parts 210, 211, 820, and Part 11. Sharon Strause, the column coordinator, is a senior consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. For more author information, go to gxpandjvt.com/bios

KEY POINTS
The following key points are discussed in this article:
This discussion addresses the use of enabling technology in computer system validation (CSV) projects to most efficiently achieve the validated state in a pragmatic, cost-effective manner
Requirements definition management (RDM) and automated testing software are used regularly for the validation and verification of embedded software in the design and development process for medical devices
GAMP 5 (March 2008) states that automated CSV testing tools can be used to improve test execution efficiency and effectiveness
Automated CSV tools provide the most benefit for larger enterprise applications such as enterprise resource planning, document management systems, laboratory information management systems, corrective action and preventive action, and so on
Organizations should consider a formalized validation plan for each tool or set of tools to describe the risk, use, and validation or qualification requirements to maximize benefits
The organization's information technology (IT) strategic vision is one way to define how to identify, select, prioritize, plan, and implement automated tools for computer system validation
These IT initiatives can realize significant value by their adoption and integration with the computer system compliance process.

INTRODUCTION
For those of us working in US Food and Drug Administration-regulated industries, computer system validation (CSV) has been the long-standing practice of establishing documented evidence that a specific process will produce, with a high degree of assurance, a product meeting its predetermined specifications and quality attributes. The FDA definition of validation rolls effortlessly off our tongues when those not familiar with the discipline ask. And, as we continue into a more detailed explanation of the validation lifecycle, the eyes of those who ask the question begin to glaze over as we cite regulatory references and enthusiastically dive deeper into the details of how validation is accomplished. Invariably, those discussions include terms such as controlled processes, risk assessment, documented requirements, and documented testing results that typically are met by the manual methods of CSV. The outcome of a validated computer system is for the benefit of the organization's use of an enabling technology in a regulated process. Typically, the organization or business unit is using technology to transform or improve manual or inefficient business processes. Yet, the process of CSV has historically been mostly manual and paper driven. However, the use of enabling technology in CSV projects can serve industry well as a way to achieve the validated state, while reducing the overall duration of validation with gains in efficiency. Careful consideration and purposeful application of the appropriate technology tools for your organization can help you gain greater control and streamline processes in a pragmatic, cost-effective manner.

When an automated tool is used on a GXP-regulated system, it becomes subject to specification and verification based on risk. While GAMP 5 does not focus on all types of automated tools, these principles should be applied when considering automated tools for GXP systems.
Taking advantage of the integrated functionality to support the compliance activities can provide big benefits to a life science company. For example, leveraging Solution Manager as a repository for application configuration provides the traceability to specifications. The change request component of Solution Manager, also known as ChaRM, links the change request, approval, test, and migration of the change through the SAP system landscape with its integration with the transport management system. SAP also offers adapters with third-party applications (3, 4), such as HP Quality Center, adding additional testing functionality. Full traceability between the change request and the production system build is realized. Life science companies recognize the business and compliance value of this tool and make plans early in the lifecycle of their SAP implementations to validate or qualify Solution Manager.

STRATEGY FOR USE OF AUTOMATED CSV TOOLS
Before an automated tool can be used in the CSV process or other compliance activities of a GXP-regulated system, planning and assessment of any tool should be considered. The functionality of the tool and its intended use will determine the extent of validation or qualification requirements. Organizations should consider a formalized validation plan for each tool or set of tools to describe the risk, use, and validation or qualification requirements. Operating procedures should also be in place to detail system administration, configuration management, and any other control processes.
As automated tools become more widely used, life science companies can take advantage of the benefits and leverage the technology for CSV and compliance. A pragmatic approach to integrating automated tools in the validation process should be taken. The effort to qualify a tool for validation of smaller-scale systems could be greater than the effort to validate manually. Identification and selection of enabling tools should be carefully considered. Companies need to have a clear vision of how these tools are used to sustain system compliance and provide benefit to the organization. A computer system compliance roadmap that complements the organization's IT strategic vision is one way to define how to identify, select, prioritize, plan, and implement automated tools for computer system validation. The compliance roadmap aligns with the IT strategic plan and offers a method to lay out the compliance activities and tools for computer system validation and operational controls for GXP systems. Organizations that involve members from the IT, business, and compliance groups will benefit from early assessment and planning for enabling tools for GXP systems. Many organizations already have a level of competency for automated tools used for supporting non-GXP systems.

CONCLUSION
Automated tools can have a real impact on computer system compliance and serve as a way to gain greater control and efficiencies. Life science companies should consider the various tools supporting the management of requirements, configuration, change control, documentation, and automated testing as real options. These IT initiatives can realize significant value by their adoption and integration within the computer system compliance process.

REFERENCES
1. ISPE, GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems, ISPE, page 207, 2008.
2. "Components and Tools of SAP NetWeaver: SAP Solution Manager," http://www.sap.com/usa/platform/netweaver/components/solutionmanager/index.epx.
3. "SAP Solution Manager Adapters," http://www.asug.com/Search/SearchResults/tabid/211/Scope/All/Default.aspx?Search=solution+manager+adapters&ResultTypes=38,102,2.
4. Pharmaceutical Online, "Genilogix Announces Availability of Validation Accelerator with e-Signature for the Latest Version of HP Quality Center," Pharmaceuticalonline.com, December 15, 2008. http://www.pharmaceuticalonline.com/article.mvc/Availability-Of-Validation-Accelerator-0001. JVT

NOTE: This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor. Deloitte, its affiliates, and related entities shall not be responsible for any loss sustained by any person who relies on this publication.

ARTICLE ACRONYM LISTING
CAPA  Corrective Action and Preventive Action
CSV   Computer System Validation
DMS   Document Management Systems
ERP   Enterprise Resource Planning
FDA   US Food and Drug Administration
IT    Information Technology
LIMS  Laboratory Information Management Systems
RDM   Requirements Definition Management

Originally published in the Autumn 2009 issue of Journal of Validation Technology
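One of the compliance benefits the article above attributes to automated tools is traceability between requirements and test evidence. As a minimal illustration of what such a check involves, the sketch below flags requirements with no passing test. The data model and function names are invented for this example; they are not taken from Solution Manager, ChaRM, or HP Quality Center.

```python
# Minimal requirements-to-test traceability check: every requirement
# must be covered by at least one passing test before a validation
# report can claim full traceability.

from collections import defaultdict

def untraced_requirements(requirements, test_results):
    """Return requirement IDs with no passing test linked to them.

    requirements -- iterable of requirement IDs, e.g. ["URS-1", "URS-2"]
    test_results -- iterable of (test_id, requirement_id, passed) tuples
    """
    passing = defaultdict(bool)
    for _test_id, req_id, passed in test_results:
        # A requirement is covered as soon as any linked test passes.
        passing[req_id] = passing[req_id] or passed
    return [req for req in requirements if not passing[req]]

results = [
    ("TC-01", "URS-1", True),
    ("TC-02", "URS-2", False),  # failed run: URS-2 is not yet covered
]
print(untraced_requirements(["URS-1", "URS-2", "URS-3"], results))
# -> ['URS-2', 'URS-3']
```

A real RDM or test-management tool maintains this linkage continuously; the point of the sketch is only that the traceability question is mechanical, which is why automated tools answer it so much more cheaply than a paper-driven process.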
ABOUT THE AUTHOR
Sharon Strause is a senior consultant at EduQuest, Inc. working in quality assurance compliance and computer system validation. She may be reached by e-mail at SharonStrause@EduQuest.net. For more author information, go to gxpandjvt.com/bios
Using an outside vendor can provide the following benefits:
Internal resource availability
Technical matter experts' availability
Experience with multiple implementation approaches
Expertise and knowledge.

Drawbacks of Software Development by a Vendor
The following are some drawbacks to using a vendor for software development:
Not accountable to company management (just what is in the contract)
Delays due to communication, lack of knowledge of company policies and procedures, conflicts with their own policies and procedures, or lack of knowledge of company operations
Budget and resources may be fixed (dependent on contract terms)
Team approach may not be evident (i.e., we versus them).

Company Concerns with Vendor
Companies do have concerns with vendors that need to be addressed as a part of any contract, but these concerns also play a role in determining whether an outside vendor will be utilized. Concerns include the following:
Determining that the vendor has the personnel with the expertise required
How will the project be communicated so that all parties understand their roles?
Can the vendor work independently or will they require constant communication?
Can the vendor deliver a functioning system within the time and budget and meet all the internal quality assurance (QA) standards required for the project?
What about the accountability level?

Vendor Concerns with the Company
A vendor may have its own concerns, as follows:
Can the project be completed on time and meet the terms of the contract?
Who will coordinate the plan and keep work in the pipeline, assuring that procedures, guidelines, and regulatory requirements are met?
A vendor has multiple clients and must be able to service all of them, which means the vendor needs to be flexible to meet different standards and requirements as well as regulatory expectations.

OBJECTIVES FOR UTILIZING A VENDOR FOR SOFTWARE DEVELOPMENT
Now that we've seen both sides of the argument, let's determine what actions would be necessary to utilize an outside vendor for software development, rather than doing that work in-house.
The following are four main objectives of vendor management for software:
Selecting the right vendor
Working with the chosen vendor
Keeping control of a software development project (who does what?)
Developing a vendor partnership.

Selecting the Right Vendor
There are a few ways to approach finding a qualified vendor. First, check within your own company to see what other vendors have been utilized and the lessons learned from those contracts. Second, check with affiliate organizations, like the American Society for Quality or the Parenteral Drug Association. Third, you can use industry networking resources or industry publications and journals. Fourth, you can ask other vendors for recommendations.
Once you've chosen a vendor, it's time to audit that vendor. The following are three types of audits that you should use with a vendor:
Pre-selection audit. This audit determines who the vendor will be, based on a set of criteria.
In-process audit. This audit determines how the contact, communication, and coding are proceeding.
Post-development audit. This audit determines maintenance requirements.

Working with the Chosen Vendor
Once a vendor has been chosen, a contract should be developed between the vendor and the company. The contract should include the following:
The contract should be formal and signed before the work starts. Usually this is a normal function of the vendor management process and purchasing control.
The contract should have terms and conditions (i.e., type of service, identification of deliverables and associated timelines, requirements for personnel, requirements for documentation, quality and regulatory requirements, etc.).
The contract should have a section on distribution of work (i.e., company and vendor and associated personnel at each).
The contract should have quality checkpoints. These could be the in-process audits or documentation deliverables.
The contract should have a cost and payment schedule.

Key vendor deliverables established as a part of the contract include the following:
Design and development documentation. If
the company is going to do the maintenance of the system, this will be critical. If the vendor were doing the maintenance, this documentation would be part of the in-process and post-development audit review.
Test plans and results documentation. The vendor would retain this, and it would be reviewed by the company during the in-process audit reviews and any post-development audit reviews.
System and user manuals with release notes and quality program documentation.
Training plan and materials. This would be developed with the company.
Knowledge-transfer process (if maintenance is going to be the responsibility of the company).

Keeping Control of a Software Development Project
It is important for the company to keep control at all times during the development project. If a good contract has been completed, this should be easy. If not, you will have missing or incomplete documentation; varying quality standards on the code itself as well as the deliverables; more in-house work will be required; and there will be some hostility between the vendor and the company because of missed deadlines, missing functionality, and a system over budget. Good project management is key to keeping control in a software development project.

Developing a Vendor Partnership
Developing a vendor partnership provides leveraging opportunities for the company. There can be shared work between vendor personnel and company personnel. The company doesn't have to do extra work when the project is delivered by the vendor. The vendor does what they do best and the company does the same. Both will develop a common language for terminology and deliverables. There will be an inter-dependent work relationship, a reduction in the time needed for a project, and a reduction in the cost required for a project.
Preferred providers give support and development resources on a continuing basis. The vendor will learn the company's environment in order to better understand the company's business requirements. All lessons learned can be applied to future projects and especially to on-going support of the projects. A preferred provider means that the company gets first priority for vendor resources and a more consistent look and feel to the information management systems being utilized by the company, which is always helpful in the regulatory environment that everyone operates in today. There should be better compatibility of the systems so that the enterprise works well and efficiently.

AUDIT REQUIREMENTS AND PREPARATION
The following are three types of audits that you should use with a vendor:
A pre-selection audit determines who the vendor will be, based on a set of criteria
An in-process audit monitors the contact, communication, and coding of the project
A post-development audit determines maintenance requirements.

Preparation for All Audits
Preparation for audits is a key component to formulating a plan and maintaining control of an audit. Preparation for the three types of audits will be similar. It begins with a schedule, establishing an agenda, a date and time for the audit, the personnel involved in the audit, the requirements for review, and the audit results following completion of the audit. The requirements for review will change with each type of audit. Audits should show that the vendor is operating in a quality manner and that project deliverables are complete and accurate.

Pre-Selection Audit
This audit is a critical one because it should reveal the most important areas for the company to understand regarding the usage of this vendor. It should include questions on the following:
Vendor stability, both financial and the number of years working in the industry
An organization chart should be requested to see where quality fits into the vendor's management structure. Is quality a separate department or a function of one of the managers?
Procedures covering the software development lifecycle, quality manual, quality policy, disaster recovery, document management, etc. (reviewed and assessed against the company's procedures and regulatory requirements).
Development methodology. How does development occur? What safeguards are in place to ensure sections of code are secure? What types of testing are completed?
Measurement systems should be reviewed (i.e., customer issues, bug fixes, etc.).
Resource availability and technical expertise. Does the vendor have enough personnel to do your project as well as others in the timelines that you require? Can you review resumes of the personnel to see what types of education and years
of experience the developers of the vendor have?
Training of the personnel. Is any regulatory training included?
Industry knowledge or your company's specific knowledge (you must understand what you might need to train).
Will they fit with your company? Can you, in your discussions, determine whether open communication will be possible and factual information will pass between the company and the vendor?

In-Process Audits
These audits are performed during the process of development. Depending on the criticality of the development, more than one audit may occur. These audits have a QA person to lead and conduct the audit, a technical specialist from the information technology department to review the technical issues, and a business person for the user needs of the code being developed. The following should be considered during an in-process audit:
Review the deliverables for the project and any corrections. Are you staying on the established schedule or must negotiation take place? Is the documentation in place? Does the code demonstrate the user requirements?

Post-Development Audits
These audits are usually completed after the software code project has been delivered and is in place in the company. They are usually a result of either enhancements or changes that need to be completed for the code, or of the vendor providing the support of the system.
If the vendor becomes a valued partner, the audit would take place as a part of the company's audit schedule for vendor management.

CONCLUSION
Developing a partnership with a vendor begins by selecting a qualified vendor that is determined by a pre-selection audit to ensure that the vendor is capable of providing the services you require. A partnership establishes expectations for both parties; examines methodologies for differences; identifies specific deliverables required in the contract; identifies the roles and responsibility for the work and key milestones; reviews checkpoints with key contacts for communication; and allows time for implementation, validation, and review.
Most important, however, is that the company and vendor treat each other as a valued partner working toward a common goal: a quality, regulatory-secure software development project.

REFERENCES
FDA, 21 CFR 11, Electronic Records; Electronic Signatures, 62 Federal Register 13464, March 20, 1997.
FDA, 21 CFR 210, Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs: General, 43 Federal Register 45076, September 29, 1978.
FDA, 21 CFR 211, Current Good Manufacturing Practice for Finished Pharmaceuticals, 43 Federal Register 45077, September 29, 1978.
FDA, 21 CFR 820, Quality System Regulation, 61 Federal Register 52654, October 7, 1996. JVT
INTRODUCTION
Businesses function in an electronic world where potentially sensitive information and data are stored on computers and networks. These same networks may be vulnerable to attacks that could result in corruption of data or loss of property. Information security should be an important part of any business practice.
This article describes a hypothetical breach of computer security. It describes how easily a corporate computer system may be accessed, both by unauthorized internal personnel and by an outside hacker. The results of such a breach may be disastrous. And it may be surprising how easily these breaches can be accomplished. Suggestions for preventing these types of problems are provided.

A BUSINESS NIGHTMARE
NewGen43 is a (hypothetical) pharmaceutical biotech company located in the US. Sally is NewGen43's complaint-handling lead. NewGen43's management has become concerned about late medical device reports (MDRs). Sally has been with NewGen43 for 15 years and she likes it there. Sally is not sure why new management was brought in for her department. As far as she can tell, she was doing fine. She thinks her new boss is giving her, and her team, an incredibly hard time over late MDRs; she had even been given a warning! Sally is troubled.
SoftBio Systems is a software development company located in Estonia. The company was founded and is led by Gunter. Their primary customer is TopBioPharma, a competitor to NewGen43. Gunter and his coworkers in Estonia are making more money than they have ever imagined. The deal with TopBioPharma has been incredibly lucrative for them. When they set up shop, they never imagined five years of development work setting up the software for clinical trials would follow. By Estonian standards they were rich. So Gunter and his team were distraught when their contact at TopBioPharma called to say they were terminating their contract with SoftBio Systems. It seemed that NewGen43, a rival to TopBioPharma, was expected to complete its final trial and would have compelling outcomes that would give TopBioPharma poor prospects at best. Gunter asked when the submission was planned; about four months, he was told.
A Well-Intentioned Insider
Bob, Sally's husband, could not believe that after 15 years Sally's job, or at least the raise they were counting on for that new RV, was at stake. He asked her to explain the problem she was having at work. She said, "It's the software, it's very hard to use. My team wants to do a good job, but the system is old and has all these rules we have to follow. That's why we keep filing late." Bob was a pretty good software developer and had some experience with software like the complaint system Sally used. They decided after dinner Sally would log in remotely (VPN) and give Bob a demo. She had to fix a complaint anyway. Sally showed Bob the system as she worked on the complaint. Then she said, "Look, now I have to log out and back in to change roles." Bob laughed and pointed at the URL; it read:

http://complaints.newgen43.com/process/ref=complaint45689?role=supervisor

Bob said, "Log back out and back in like you did before." Sally did. Now the URL read:

http://complaints.newgen43.com/process/ref=complaint45689?role=handler

Bob pointed out the role in the URL. "See," Bob said, "the system is granting you permissions based on what it sees in the URL." Bob asked Sally what the other roles were. Sally said, "I'd love to be admin, they can fix almost anything." "Let's try editing the URL," suggested Bob. Sally edited it to read:

http://complaints.newgen43.com/process/ref=complaint45689?role=admin

The administration screen appeared. Sally could not believe her eyes. She threw her arms around Bob and said, "I can fix anything now! I can easily hit the MDR on-time targets now." Bob said, "Don't get carried away!" But Sally could see that her team was on its way.
At first Sally felt a little guilty changing the filing dates using the administrator access. She started looking for a new job, just in case. She was not sure she could keep changing the dates. But slowly it got easier. Her boss praised her and her team. She said Sally and her team were role models, that other teams could do the same. She even got a small bonus. Sally and Bob bought that new RV. There was no way she could stop now.

Trouble from the Outside: Attack
Meanwhile in Estonia, Gunter and crew were plotting. They decided that they had nothing to lose. The global economy was bad and the chance of finding another lucrative contract soon was nil. So they decided to try to sabotage NewGen43's trial so they could keep their contract with TopBioPharma. Using social business networking sites, they developed a list of people they could target to get them into NewGen43, and in particular they targeted a number of people that were looking for jobs. The next step of their attack was to get some of those people to compromise their work computers. It was easy to cull a list of work e-mail addresses. They sent each person a carefully worded recruiting e-mail.
One of these e-mails found its way to Sally. She was already feeling pretty good about her new status at NewGen43. But what was the harm in looking at the "Top Tier Pharma Company seeks Senior Manager, top salary + bonus + signing bonus" job posting? She clicked on the link and provided all the information they asked for. After all, if her profile was selected she would get a free iPod Touch. She was a little irked by the security warning that kept popping up, but she wanted to complete that profile. She was relieved when she was done; maybe she would get the iPod!
Gunter exclaimed, "We got one!" His team went to work. They installed software that would allow them to control Sally's computer on the iPod and on a CD they sent as well. They had it packaged and on its way that day.
A little over a week later Sally got a nice letter congratulating her on her accomplishments and thanking her for submitting her profile. She immediately plugged in the new iPod and installed the software.
Later that day Gunter was scanning NewGen43's network. He found Sally's system secure, but was able to install a hacking tool to infect other systems. He compromised a few systems, but in general NewGen43's IT team had done a good job. Then Gunter noticed something he could not believe. Sally was suddenly connected to the complaint management system as admin. He used that connection to connect to the complaint database. He quickly learned the complaint system had a programmatic link to the clinical system. He pulled that code back to his system. Then he used the database's command shell to infect the database server with his own remote access. He set it to call him over a standard web port every few hours.
Castle keys. Gunter reverse-engineered the code he pulled back and found NewGen43's clinical system login ID and password. It took him a few hours, but he wrote some interesting programs. The first was to change the code in the complaint system that talked to the clinical system to insert small random errors as well as insert bogus complaints, tricking the clinical system into thinking that there were additional failures. These changes were subtle. His goal, after all, was to derail approval by corrupting the trial data.
Subtle manipulation. Next, he added a program to the clinical database that made small but insidious changes. His intent here was to do a small amount of damage over the next several weeks to months. His program would
change certain key data randomly, but viably, so as not to be immediately detected. He knew what he was doing and what results would disrupt the trial. So slowly the trial population's blood pressure dropped, pulse rates went up, blood iron levels rose, and so did HDL.
Two months passed and TopBioPharma called Gunter. "We are going to keep the program going. NewGen43 just pulled out of a conference where they were going to present their trial results, so our R&D team decided to start the next trial. We'll send you a purchase order for the next phase of the project." Gunter was happy, but not greedy.
Erasing his tracks. Gunter quickly connected back in to NewGen43. He deleted his programs and cleaned up as best he could, but knew a few traces would be left behind. He then inserted a common virus that had a payload that would encrypt the disk, including the database. This would also secure any local evidence of his programs' tampering. He knew the company would restore from a backup, but that didn't matter, as long as he could erase his tracks.
Next, he used the other systems he compromised to launch a widespread attack inside the NewGen43 network, installing a common botnet (a way for external hackers to control computers that are not theirs). He did this so that any investigation would point to a run-of-the-mill compromise of the system and not trigger any alarms.
Finally, he backed out of the complaint systems and infected Sally's computer with a destructive virus, knowing the IT staff would baseline the system (erase the disk and install all new software), thus covering his last probable track.
NewGen43's IT staff responded quickly to the virus outbreaks, cleaning the infected systems. They saw iTunes on Sally's computer, and she told them she won the iPod in a contest. They found the infection on it, but saw it just installed a remote control program that looked like the others they had been dealing with. They cleaned it for her and gave it back with a warning not to install unapproved software in the future.

Trying to Recover
Over the next weeks, NewGen43's clinical and regulatory teams realized something had gone very wrong. They kept restoring older and older versions of the data, but could not piece enough data together to confidently proceed. They had an electronic system and scraps of paper that could be used to see that some data was wrong. But other data was right!
NewGen43's stock dropped 22% upon the news that they would restart their trial. It dropped another 10% when word spread that the US Food and Drug Admin-

Every element of the story presented in this article is completely plausible using off-the-internet hacking tools. The imaginary Gunter is not a top computer scientist. In fact, the skill to perform this attack would be considered moderate to advanced intermediate. So what happened?
We have all learned from television and big-screen crime dramas that we need a motive. In this case there are two key motives. First, Sally's motive: she just wanted to keep her job. She loves her company and her job. She just had a clash with a new manager over a few percentage points on late MDRs. She never intended to hurt the company; she was just scared.
Gunter was a reputable software consultant who had no idea how, in this bad economy, he would replace the kind of lucrative contract TopBioPharma represented. He knew that TopBioPharma made good products, and he was sure people would be just as well off with TopBioPharma's drug vs. the product made by NewGen43. In his mind, patients weren't hurt, he kept his contract, and TopBioPharma stayed a lucrative customer.
Could this scenario happen at your company? Do you have an employee that values their job? Could they, for what they think are innocent reasons, take advantage of a vulnerability in a system to help them keep that job, to get a raise, or get a bonus? Is there a Sally in your organization?
Do you have a supplier or contractor similar to SoftBio Systems? Does one of your competitors? Is there someone who depends on a revenue stream that is large enough to induce them to attack you? Keep in mind that governments are compromised for what amounts to trinkets and pocket change.

Taking Advantage of a Security Weakness
Bob, Sally's husband, had some skill and he knew enough to exploit a weakness in NewGen43's complaint system. The method Bob used (URL tampering and altering unsecured security information) is not esoteric. This type of attack is on the top 10 vulnerability lists of two security organizations. The attack Bob used is really two issues in one. By setting the role in the URL, the application did not sufficiently protect credentials. Also, the URL alteration is a type of web parameter tampering. How does this happen? For most organizations, developing software that works at all is hard, and developing secure software is even more difficult. There is evidence of this everywhere. That is how Sally became an admin. The administrator connection is what gave Gunter the access to compromise the complaint system and from there, the clinical system. Keep in mind Gunter did not care at all about the complaint system. It was only a way to gain access to the clinical system. It was the complaint system
istration was auditing them for inconsistencies in their that gave him the key to his damaging attack.
MDR filing practices.
Social EngineeringThe Attack Method
SO WHAT HAPPENED? The iPod trick is one example of social engineering. We
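The two issues Bob exploited (a role carried in the URL and web parameter tampering) are easy to picture in code. The following is a minimal illustrative sketch, not the article's actual systems; the URL, user names, and function names are hypothetical:

```python
# Hypothetical sketch of the flaw described above: the application trusts a
# "role" parameter carried in the URL, so anyone who edits the query string
# can promote themselves to admin (web parameter tampering).
from urllib.parse import urlparse, parse_qs

SERVER_SIDE_ROLES = {"sally": "user", "it_admin": "admin"}  # authoritative store

def role_from_url_insecure(url: str) -> str:
    """INSECURE: trusts whatever role the client put in the URL."""
    params = parse_qs(urlparse(url).query)
    return params.get("role", ["user"])[0]

def role_from_session_secure(username: str) -> str:
    """Safer: the role is looked up server side from the authenticated user."""
    return SERVER_SIDE_ROLES.get(username, "user")

# Bob simply edits the query string:
tampered = "https://complaints.example.com/view?id=77&role=admin"
print(role_from_url_insecure(tampered))   # admin -- the tampering worked
print(role_from_session_secure("sally"))  # user -- tampering has no effect
```

The insecure version believes whatever the client sends; the safer version treats the URL as untrusted input and derives the role from server-side state established at login.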
Social Engineering: The Attack Method
The iPod trick is one example of social engineering. We read about these attacks all the time. Social engineering attacks range from a thief getting a kind person to hold the door open while they carry out an armful of laptops, to a person in a uniform standing in front of an ATM and taking deposits because the ATM is "down"; many people just hand over the envelope.
Hacker turned security researcher Kevin Mitnick is famous for his social engineering skills. In his book, The Art of Deception (1), Mitnick states, "Social engineering uses influence and persuasion to deceive people by convincing them that the social engineer is someone he isn't, or by manipulation. As a result, the social engineer is able to take advantage of people to obtain information with or without the use of technology."
Vulnerabilities
In the scenario presented in this article, the complaint system software held the login ID and password for the clinical system. Far-fetched? No. The hard-coded credentials problem is also on the list of top-10 vulnerabilities. This happens all the time. It is easy to just stuff credentials in-line with the code; this is called hard coding, and it takes zero extra lines of code to do. Making credentials secure and configurable is a lot more work, maybe 100 times more by the time all the scenarios are tested. If credentials are hard coded, this weakness gets worse and worse over time as more people and more systems gain access to those hard-coded, never-changing credentials.
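The difference between hard-coded and configurable credentials can be sketched in a few lines. This is an illustrative example only; the variable names and the environment-variable scheme are assumptions, and a real deployment would typically use a dedicated secrets manager:

```python
import os

# INSECURE (hard coded): zero extra lines of work, but the password ships
# with the code, never changes, and is visible to everyone who can read it.
CLINICAL_DB_USER = "clinical_app"
CLINICAL_DB_PASSWORD = "S3cret!"   # hard-coded credential -- the weakness

def connect_insecure():
    """Returns credentials baked into the source itself."""
    return (CLINICAL_DB_USER, CLINICAL_DB_PASSWORD)

# Safer (configurable): credentials come from the environment (or a secrets
# manager), so they can be rotated without touching or redeploying the code.
def connect_secure():
    user = os.environ["CLINICAL_DB_USER"]
    password = os.environ["CLINICAL_DB_PASSWORD"]
    return (user, password)

# In practice these would be set by operations, never by the application:
os.environ["CLINICAL_DB_USER"] = "clinical_app"
os.environ["CLINICAL_DB_PASSWORD"] = "rotated-2024"
print(connect_secure()[0])  # clinical_app
```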
The next point in our fable relies on another top-10 vulnerability: elevated privileges. Developers like to run at the highest privilege level. It's sort of like having the keys to the castle: no worries, we can go wherever we want. But good security requires the opposite: least privilege. Least privilege means granting only the absolute minimum needed to do the one thing the program needs to do at that moment. It's hard to develop and hard to test. It takes time and costs money, hence its persistent presence on the top-10 list. But this is precisely how Gunter got unrestricted access to the complaint system.
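Least privilege can be illustrated with a small sketch: hand a component an object that exposes only the one operation it needs, rather than the whole datastore. The class and method names here are hypothetical, not from any real system:

```python
# A minimal sketch of least privilege: the reporting job receives an object
# that can only read complaints; it is never handed the full datastore, so
# even a compromised report generator cannot modify or delete records.
class ComplaintStore:
    def __init__(self):
        self._records = {1: "leaking seal"}

    def read(self, rec_id):
        return self._records.get(rec_id)

    def write(self, rec_id, text):      # privileged operation
        self._records[rec_id] = text

class ReadOnlyView:
    """Exposes only the one operation the report job needs at this moment."""
    def __init__(self, store):
        self._read = store.read         # capture read; write stays hidden

    def read(self, rec_id):
        return self._read(rec_id)

store = ComplaintStore()
report_view = ReadOnlyView(store)       # least privilege: read-only handle
print(report_view.read(1))              # leaking seal
print(hasattr(report_view, "write"))    # False -- no path to modify data
```

The same idea applies at every layer: database accounts granted only SELECT, services run under unprivileged users, and so on.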
Botnets and command and control may sound like something that is cutting edge and difficult, but they are really easy to use. There are lots of websites that offer the software, which any competent administrator or programmer can use right off the shelf. But it gets better: there are hackers who will build you whatever you like for $50 to $250! In May 2009, Wired.com (2) reported that there are bots active on 12 million IP addresses. (An IP address, or Internet Protocol address, is like the phone number for your computer system.) By trailing his attack with common botnet and virus droppings, our fictitious Gunter covered his tracks. The IT staff erased all the evidence for him.
By the time the clinical team realized their data was bad due to Gunter's slow, careful corruption, they had no way to prepare a trial submission.
The Wired.com posting (2) also reported that the University of California at Santa Barbara observed one botnet, Torpig, for 10 days and observed 70 gigabytes of data being stolen from computers remotely controlled by the botnet, including financial data. The harvested data included 1.2 million Windows passwords and over 1 million e-mail items, such as e-mail addresses and login credentials.
Wired.com quotes the university report (2) as stating, "In ten days, Torpig obtained the credentials of 8,310 accounts at 410 different [financial] institutions." The researchers continued, "The top targeted institutions were PayPal (1,770 accounts), Poste Italiane (765), Capital One (314), E*Trade (304), and Chase (217)."
The lesson here is to not underestimate the ease of these attacks, or how easily an IT team could mistake a targeted attack (what Gunter did) for a run-of-the-mill botnet attack. Gunter was clever: he used social engineering on the IT team, tricking them into thinking they were fighting a botnet and into using a common erase-and-replace strategy, thus covering his tracks.
Now some astute readers might point out that there are products and techniques to thwart these attacks. They are right, but in this author's experience those are rarely deployed and staffed by sufficiently trained personnel to be consistently effective. A proof point is that governments, banks, and financial institutions that do have highly competent technical staffs and great tools still have determined attackers get through their defenses.
It is important that readers understand business risk and the value of information security. It is easy to break in. It is easy to compromise systems. It is really easy to social engineer people. Do you understand these risks, or do your advisors? Have you mitigated those risks? Does Sally work somewhere in your organization? Does Gunter work for a competitor? Are you sure?
The good guys have to protect all possible points of attack. The bad guys (even well-intentioned ones) need only find one unprotected or inadequately protected point to get in. Once in, for most organizations, it's game over.

RECOMMENDATIONS
SANS (SysAdmin, Audit, Network, Security) is a globally trusted source for information security training, certification, and research; it recommends protecting your organization with approaches called defensive walls (3). The following is a brief explanation of each wall that will help create awareness of what a comprehensive program looks like.

Defensive Wall 1: Proactive Software Assurance
This level of defense relates to the following:
How software is developed
How software is tested
information security landscape. http://www.geekonomicsbook.com/. GXP

GLOSSARY
Command Shell. A shell is a piece of software that provides an interface for users. Shells generally fall into one of two categories: command-line and graphical. Command-line shells provide a command-line interface (CLI) to the system. Users type key words and symbols to get the command shell to perform tasks.
Medical Device Report (MDR). An FDA-required report. These reports are always important, but at times can be critical to meeting regulations and agency goals of protecting the public.
Port (web port). A number, like an extension to a main phone number, used for two devices to connect. When you connect to a website you generally do so on Port 80. There are potentially thousands of ports on each system. Smart attackers usually use well-known and popular ports; they are busy and thus hide suspicious activity well.
Uniform Resource Locator (URL). Also known as a web address. www.google.com is an example.
Virtual Private Network (VPN). A network inside a network that is created for private use. A VPN rides on some existing infrastructure (like wires) but has been secured so it is private.

ABOUT THE AUTHOR
Robert Smith is an application technical lead responsible for quality systems software development at Abbott Vascular. Robert has 25 years of software development experience, including VC start-ups funded by The Mayfield Fund, Granite Capital, and Wasatch Venture Fund, and holds CISSP and PMP credentials. Robert can be reached by e-mail at robert.smithii@av.abbott.com.

Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau can be reached at Barbara.nollau@av.abbott.com.
KEY POINTS
The following key points are discussed in this article:
In today's environment of technology and automation, it is important to understand disaster recovery (DR), business continuity (BC), and contingency plans (CP) and how they all work together to ensure continuity and integrity of systems and availability of data and records
System owners and technology professionals should understand how these plans should be developed and when/how to exercise them
Having a DR plan in place is important to the compliance of computer system validation and Part 11 for regulated systems
The DR team and the associated roles and responsibilities should be clearly defined and understood
Disaster identification, notification and coordination processes, communication plans, alternate computing facilities management, return to normal operations, plan testing, and maintenance procedures are all required elements of a robust DR program
Minimally, a company should have a functional plan that addresses all of the processes required to restore technology, an individual responsible for that plan, and a disaster response team at the ready.
INTRODUCTION
I attended a disaster recovery conference a while back, and one of the speakers said, "If you want to see how real experts plan disaster recovery, go to Puerto Rico. Why? Look at the number of hurricanes they deal with on an annual basis. They'd better know what they are doing from a disaster recovery standpoint!" I never forgot that statement, and I've been interested in best practice relative to disaster recovery ever since.
In this issue of the column, we will examine the terms disaster recovery, business continuity, and contingency planning. Understanding these terms and implementing these measures are important for the integrity and compliance of the systems we use. We will further explore the disaster recovery (DR) element to gain a deeper understanding of what is required.
Figure 1: Elements and hierarchy of a DR/BC/CP program. (figure not reproduced)
misfortune. A disaster is an event that is catastrophic to the business, meaning people can't work, or even worse. An example of a disaster in this context is an earthquake that destroys an entire facility. A smaller event may also be considered a disaster in some cases, for example a fire in a data center that brings down all computing capability in the company. A disaster can be defined as any unplanned event that prevents an entire organization from functioning as intended or causes damage to people or facilities (e.g., fire, explosion, or extensive building damage).
A disaster can have a significant, direct impact on a firm's ability to continue business processing. There may be an inability to develop submissions or collect clinical trial data, delayed or limited ability to get information to the field or process sales data, or the inability to manufacture, pack, ship, or track product, samples, and promotional material. The ability to sustain time-sensitive processes such as payroll may also be hindered, affecting financial relationships. The enterprise may be unable to communicate internally or with customers, and there could be residual outcomes such as non-compliance with regulations and lack of alignment with a parent company and partners. Some of the effects of these outcomes are financial in nature (lost revenue from inability to ship product, loss of sales from delayed submissions, loss of worker productivity, or damaged credit rating from inability to pay bills). The company's reputation with customers, employees, partners, or other stakeholders may be damaged.
There is a difference between a disaster and an outage or fault, which is the temporary loss of some or all services (e.g., hard drive failure, power outage, loss of network connection). Localized system outages and brief periods of system downtime (i.e., a document control system down for a day or e-mail unavailable for several hours) are not considered disasters and are, therefore, treated differently, usually with simple contingency plans. What constitutes a true disaster for a company should be defined up front, including determining criteria. This must be understood ahead of time, so it is clear what conditions will lead to invocation of the DR plan. Depending on the magnitude of a disaster, invocation of the broader business continuity (BC) plan may or may not be warranted (DR and one or several functional-area BCs may suffice). Disaster recovery is designed to recover from a true disaster, not an outage or fault.

ELEMENTS OF THE DR/BC/CP PROGRAM
Now that we have reviewed what constitutes a disaster and how that differs from an outage, we need to gain an understanding of the elements of a DR/BC/contingency plan (CP) program, how they work together, and for what conditions each element is used. The elements and hierarchy of the program are shown as follows (see Figure 1):
Enterprise business continuity (EBC). A broad program that covers all aspects of the business (e.g., process, technical, physical, human, etc.). Focuses on keeping the business viable in the event of a disaster.
Disaster recovery (DR). A program focused on technology recovery in the event of a disaster, an element of EBC.
Functional business continuity (BC) plan. A functional area- or business area-specific plan
focused on keeping business processes moving in the event of a disaster, an element of EBC.
Contingency plans (CP) for system downtime. A functional area- or business area-specific process used as a workaround during non-disaster system outages, usually contained in an operating procedure.

The broadest level of BC (enterprise level) covers facilities, human resources, safety, equipment and furniture, communications (internal and external), and invocation of lower-level plans. Disaster recovery is focused on the technology only and covers the recovery facility (on-site, hot site, or cold site), computer hardware, operating systems, networking and other infrastructure, application software, databases, and records. Functional BC plans are lower-level plans specific to a functional area or given business process. They are usually put in place for critical business processes and cover the manual workarounds to be used until technology is recovered. These workarounds may involve the use of log books, cell phones, hard copy documents, etc. in place of the technology that is unavailable. Finally, contingency plans for system downtime are similar to functional-area business continuity; however, they cover localized outages only (e.g., one department, one system, etc.). They are usually feasible for short durations only, assume some sort of infrastructure being in place, and typically involve paper-based manual workarounds.
Developing and maintaining a tested DR/BC program is important to the computer validation process and to compliance with 21 CFR Part 11, Electronic Records; Electronic Signatures. A commonly accepted definition for validation is "establishing documented evidence which provides a high degree of assurance that a specific process [system] will consistently produce a product meeting its predetermined specifications and quality attributes." In order to address the "consistently" portion of the definition, the following should be verified as in place and tested as part of system validation:
Disaster recovery plan
Backup plan
Business continuity plan
Contingency plan for system downtime.

Maintaining Compliance
Another validation-related consideration is the maintenance of the validated state of regulated systems and infrastructure. In the case of a major disruption to service that requires restoration in a completely different environment and/or replacement of major components, measures must be taken to ensure the validated state of the system is maintained. A disaster, and the subsequent DR, interrupts the qualified state of the IT infrastructure. Once the environment is restored, some level of re-qualification must be performed. The level of re-qualification should be based on risk (risk level of the affected system(s), level of change, and planned sustainability of change). The re-qualification criteria should be pre-determined and documented in the DR plan.
Requirements listed in 21 CFR Part 11 (1) that are related to, amongst other controls, DR are the ability to generate accurate and complete copies for review and inspection, and that records must be retrievable throughout the required retention time. In the case of a disaster, without a DR plan, we cannot say that we are able to produce accurate and complete copies or that records will be retrievable during that time.

DISASTER RECOVERY PLANS
A disaster recovery program is more than just how to restore systems and data. The plan must include disaster identification, notification and coordination processes, communication plans, alternate computing facilities management, processes to return to normal operations, and DR plan testing and maintenance procedures. It must be a functional plan that addresses all of the processes required to restore technology, and it must have a defined owner responsible for maintenance of the plan on an ongoing basis. A disaster response team must be identified and at the ready.
When developing the plan, it is important to determine the priority order of restoration across infrastructure and systems. One of the inputs to determining this is prerequisite technology (e.g., the network must be restored before applications that rely on networked communications are restored). A second input is the required level of uptime for each system. Systems requiring 24/7/365 uptime will need to be restored before those that don't have such stringent uptime requirements. Another factor to consider is the sustainability of the defined workarounds (i.e., how long the manual workaround can realistically suffice without causing bigger problems such as unmanageable backlogs). The person developing the DR plan must collect this information about all technology elements, perform a triage activity, resolve any conflicts among systems with dependencies, identical uptime requirements, or conflicting priorities, and then determine the overall order of restoration required and document it in the plan. This information should be communicated back to the business area system owners so everyone is aligned with the planned order of restoration in the case of a disaster. This is important because recovery time expectations must be managed. Business area system owners whose systems are lower in recovery order must understand this fact and the drivers for that order.
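The triage described above (prerequisites first, then uptime criticality) is essentially a dependency-aware sort. A minimal sketch with a hypothetical system inventory (the names and tiers are invented for illustration; Python 3.9+ for `graphlib`):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical inventory: prerequisite systems and required uptime tier
# (tier 1 = 24/7/365, higher numbers = less stringent).
systems = {
    "network":     {"needs": [],                   "tier": 1},
    "email":       {"needs": ["network"],          "tier": 2},
    "clinical_db": {"needs": ["network"],          "tier": 1},
    "doc_control": {"needs": ["network", "email"], "tier": 3},
}

def restoration_order(systems):
    """Prerequisites come first; among systems that become restorable at the
    same time, the most uptime-critical (lowest tier) is restored first."""
    ts = TopologicalSorter({name: set(info["needs"]) for name, info in systems.items()})
    ts.prepare()
    order = []
    while ts.is_active():
        # get_ready() yields systems whose prerequisites are all restored
        ready = sorted(ts.get_ready(), key=lambda n: (systems[n]["tier"], n))
        for name in ready:
            order.append(name)
            ts.done(name)
    return order

print(restoration_order(systems))
# ['network', 'clinical_db', 'email', 'doc_control']
```

In a real plan this triage also has to weigh workaround sustainability and resource limits, which is why the article stresses that a person must resolve the conflicts; the sort only mechanizes the easy part.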
DISASTER RESPONSE TEAM
Figure 2: Example DR team organization. (figure not reproduced)
The disaster response team must be identified ahead of time.
Roles, responsibilities, and backups must be defined, documented, and understood. Figure 2 shows an example DR team organization. Typical roles and responsibilities for personnel involved in DR are as follows.

DR Team Lead
The team leader's role and responsibilities include the following:
Facilitates the disaster recovery process
Ensures the workability of the plan by working through assigned teams
Maintains and distributes the final copy of the plan
Conducts impact studies
Develops recovery strategies and response procedures
Coordinates testing
Monitors team response in actual disaster situations.

IT Management Lead
The IT management leader's role and responsibilities include the following:
Assembles team leaders at the command center
Places the hot site on ALERT and makes the formal disaster declaration
Monitors the initial assessment activities
Makes the decision, based on initial assessment, to activate the DRP and subsequent recovery teams
Monitors the hot site recovery and the home site restoration efforts
Establishes and ensures the receipt of updates from the hot site coordination team lead on a regular basis
Keeps senior management informed of the progress of the recovery effort
Facilitates planning for return to a new or repaired facility.

Hot Site Coordination Lead
The hot site coordination leader's role and responsibilities include the following:
Assembles hot site coordination team members at the command center
Briefs, organizes, schedules, and mobilizes all subordinate recovery teams
Oversees the preparation and restoration activities of all hot site environments
Coordinates the identification, retrieval, and distribution of all off-site disaster recovery backup tapes and vital records
Updates the IT management lead on restoration progress on a regular basis
Receives and responds to restoration progress reports from all associated recovery teams
Assists with planning for return to a new or repaired facility.
Help Desk Lead
The help desk lead's role and responsibilities include the following:
Provides centralized coordination for all help desk requests
Provides end-user problem resolution and assistance throughout the recovery period
Maintains communications with end users
Communicates the prepared disaster statement
Coordinates the setup and staffing of required operations at the hot site.

RECOVERY FACILITIES
The type of facility required for the DR operation must also be determined based on business requirements. A hot site is needed if fast recovery of data and connectivity is required and taking the time to actually rebuild the technology platform prior to recovery is not feasible. In the case of a hot site, hardware will already be on hand, and mobile computing resources and desk space for critical staff are available. The network is designed to be able to quickly connect all unaffected systems to the hot site, and telecommunications carriers are prepared to switch those capabilities to the hot site. The hot site is typically provided by a third-party service provider contracted by IT, which provides these services on a subscription basis, governed by a contract. The subscription also typically covers periodic drilling of the DR plan using the hot site. Some corporations choose to designate one of their own locations as a hot site for the others; however, these locations must also be tested and drilled.
A cold site is used for the build and recovery of data and connectivity in a situation where time is not as critical. Many DR plans use a hot site for immediate recovery of business-critical systems and then move to a cold site to rebuild lower-priority platforms. A cold site is much less expensive than a hot site, because it is really only providing a facility. This space must be outfitted at the time of need by the subscribing company, and the arrangement should include quick-ship agreements with vendors because there is no equipment on hand. This option is certainly less costly but, if used solely, significantly slows recovery time.
Whichever type of recovery facility is selected, choose a location that will likely not be affected by the same disaster, but that is still within a reasonable travel distance and time. The storage location for backups must be accessible within a reasonable time and effort, and/or an arrangement must be in place for quick-ship to the recovery site. With respect to storage of the DR plan, keep a copy of the plan in several locations (e.g., company facility, recovery site, in possession of the DR lead).

MAINTAINING THE DISASTER RECOVERY PLAN
Once developed, the DR capability must be tested initially and then drilled periodically. Drills typically identify snags, which should result in updates to the DR plan. A drill doesn't always have to be a full-blown simulation of the actual process; there can be segmented drills (for selected portions of the technology/selected systems) at the DR location, and in some cases a "conference room drill" (one in which the process is walked through procedurally) can suffice. It is not recommended to ONLY perform these abbreviated options, however. Hot site contracts typically include several drills per year, of which the company should take advantage.
Some common (and easily avoidable) mistakes with respect to DR execution are such things as missing or forgotten software product keys, outdated contact information for key personnel or service providers/vendors, not assigning backups for DR team roles, and blank or corrupt backup tapes. One of the most frustrating mishaps is discovering that the DR plan was maintained in electronic form only and is, therefore, not available when needed.
One person should be assigned the overall responsibility for maintenance of the DR plan (normally the DR lead). The plan should be updated when drill results dictate a change, when there are system implementations or retirements, and when significant changes are made to systems that would affect their recovery method. The DR lead must maintain the plan master copy and ensure that all copies of the plan are the most recent version and that old versions are destroyed. Additionally, the DR lead must maintain any sensitive combinations, passwords, etc. that will be required during DR but cannot be put into the plan.

DEVELOPING A DISASTER RECOVERY PLAN
If you do not have a DR plan in your company, it is advisable to develop one. Steps to do so are as follows:
Stakeholder support. Identify management stakeholders and gain support and funding by creating a business case for why a plan is needed. This can sometimes be a tough sell because DR is similar to insurance, and it is sometimes difficult to imagine needing such a thing. Be persistent.
Project requirements. After approval and support to proceed, gather uptime and recovery time requirements and technical requirements and constraints from the business and from IT subject matter experts.
Project team. Form a team to define a plan to balance the recovery time requirements with relative priority and available resources, and use a risk-based approach to determine the overall recovery order.
Gap analysis and remediation. Identify any gaps and remediate them.
Disaster recovery plan. Draft the plan, review with stakeholders, finalize the plan, and conduct a drill. Revise the plan as required.
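Some of the avoidable mistakes noted above, such as blank or corrupt backup tapes, lend themselves to routine automated checks. The following is a minimal sketch with hypothetical file names; a real program would also verify that backups actually restore, not just that copies match:

```python
import hashlib
import os
import tempfile

# A sketch of one easily automated DR check: confirm a backup copy is
# neither blank nor corrupt by comparing its checksum against the source.
def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source, backup):
    if os.path.getsize(backup) == 0:
        return "FAIL: backup is blank"
    if sha256_of(source) != sha256_of(backup):
        return "FAIL: backup is corrupt"
    return "OK"

# Demonstration with hypothetical files in a temporary directory:
with tempfile.TemporaryDirectory() as d:
    src, good, blank = (os.path.join(d, n) for n in ("db.dump", "tape1", "tape2"))
    data = b"complaint records"
    open(src, "wb").write(data)
    open(good, "wb").write(data)
    open(blank, "wb").close()          # simulates a blank tape
    print(verify_backup(src, good))    # OK
    print(verify_backup(src, blank))   # FAIL: backup is blank
```

Running such checks on a schedule, and alerting on any FAIL, turns a disaster-day surprise into an ordinary maintenance ticket.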
CONCLUSION
This article discusses disaster recovery, business continuity, and contingency planning, and how understanding and implementing these measures are important for the integrity and compliance of systems in today's environment of technology and automation.
System owners and technology professionals should understand how these plans should be developed and when and how to exercise them. System owners should have a DR plan in place, and all team roles and responsibilities should be clearly defined. A company should have a functional plan that addresses all of the processes required to restore technology, an individual responsible for that plan, and a disaster response team at the ready.

REFERENCE
1. FDA, HHS, Code of Federal Regulations, Title 21, Food and Drugs, Chapter I, Food and Drug Administration, Department of Health and Human Services, Subchapter A, General, Part 11, Electronic Records; Electronic Signatures. GXP

ARTICLE ACRONYM LISTING
BC Business Continuity
CP Contingency Plan
DR Disaster Recovery
EBC Enterprise Business Continuity

ABOUT THE AUTHOR
Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau has 25 years of experience and increasing responsibility in the pharmaceutical and medical device industries, spanning areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached at barbara.nollau@av.abbott.com.
System Definition: Defining the Intended Use for a System
By Robert W. Stotz, Ph.D.
Figure 1: Upper part of PhRMA's system development life cycle. (figure not reproduced)
Figure 2: Upper part of PDA's life cycle. (figure not reproduced; shows validation policies/SOPs feeding a validation project plan, and computer-related system requirements driving functional and design requirements)
Figure 3: GAMP 4 basic framework for specification and qualification. (figure not reproduced; functional specification is verified by operational qualification, design specification by installation qualification, with system build at the base)
- 21 CFR Part 11 became effective in August 1997; policy guide 7153.17 was issued in July 1999, followed by five Part 11 guidance documents in 2001/2002. The policy guide and five guidance documents were subsequently withdrawn in February 2003 and replaced in September 2003 with Docket No. 2003D-0060, Guidance for Industry, Part 11, Electronic Records; Electronic Signatures - Scope and Application.
- FDA published its systems-based inspectional program (Compliance Program Guidance Manual Program 7356.002) in February 2002, and in September 2004 a draft guidance, subsequently replaced by the final guidance in September 2006, both entitled Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations, which defines the role of quality systems in the pharmaceutical current good manufacturing practice regulations. Both the draft and the final guidance were developed by the quality systems working group (now the Council on Pharmaceutical Quality) formed as part of the Pharmaceutical CGMPs for the 21st Century: A Risk-Based Approach initiative.
- FDA issued its new GMP initiative in August 2002, which described an increased focus on those aspects of manufacturing that pose the greatest potential risk and the agency's intent to integrate quality systems and risk management approaches into its existing programs, with the goal of encouraging industry to adopt modern and innovative manufacturing technologies. The final report on the new initiative was published in September 2004.
- Several guides/guidances relevant to computer systems were published, such as Design Control Guidance for Medical Device Manufacturers in March 1997, Off-The-Shelf Software Use in Medical Devices in September 1999, and General Principles of Software Validation; Final Guidance for Industry and FDA Staff in January 2002.

Section 820.3(z) of the medical device CGMP defines validation as "confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use can be consistently fulfilled," and 820.30(c), covering design input, states in part:

    Each manufacturer shall establish and maintain procedures to ensure that the design requirements relating to a device are appropriate and address the intended use of the device, including the needs of the user and patient. The design input requirements shall be documented and shall be reviewed and approved by a designated individual(s). The approval, including the date and signature of the individual(s) approving the requirements, shall be documented.

A common error found in many system definition documents is a description of a system's capabilities, often extracted from vendor-provided information, rather than a definition of intended use. The impact of this type of error is particularly acute relative to Part 11 requirements when a system has extensive capabilities for generating or maintaining electronic records and/or utilizing electronic signatures and only a portion of these capabilities is intended to be used. The end result is time and resources wasted in extensively testing a system's capabilities rather than the portion of those capabilities that is intended to be used.

The Facilities and Equipment section of the Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations guidance states:

    Under a quality system, the technical experts (e.g., engineers, development scientists), who have an understanding of pharmaceutical science, risk factors, and manufacturing processes related to the product, are responsible for defining specific facility and equipment requirements.
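The scoping problem described above (testing everything a vendor shipped instead of the declared intended use) can be made explicit before test planning begins. The sketch below is illustrative only; the capability names are hypothetical and not drawn from any particular product.

```python
# Sketch: derive a validation test scope from declared intended use,
# rather than from the vendor's full capability list.
# All capability names below are hypothetical examples.

FULL_CAPABILITIES = {
    "batch_record_entry",      # generates electronic records
    "electronic_signature",    # Part 11 signature capability
    "audit_trail",
    "report_export",
    "email_notification",      # not part of the intended use
    "remote_api",              # not part of the intended use
}

INTENDED_USE = {
    "batch_record_entry", "electronic_signature",
    "audit_trail", "report_export",
}

def validation_scope(capabilities, intended):
    """Split a system's capabilities into the portion to validate
    (the intended use) and the portion to document as not used
    (and, where possible, disable)."""
    unknown = intended - capabilities
    if unknown:
        raise ValueError(f"intended use names unknown capabilities: {sorted(unknown)}")
    in_scope = capabilities & intended
    out_of_scope = capabilities - intended
    return sorted(in_scope), sorted(out_of_scope)
```

The out-of-scope list is itself useful evidence: it records that the unused Part 11-relevant capabilities were identified and deliberately excluded, rather than overlooked.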
…characteristics which are mandated by external systems and outside the control of the developers. One interface which is important in every case is the user and/or patient interface.

The FDA guidance on Off-The-Shelf Software Use in Medical Devices provides a series of six questions, with additional questions following each of the primary six, to help define the basic documentation requirements for OTS software. The following is an adaptation of those questions that can be used as an aid in defining the intended use of OTS software.

1. What is it?
For each component of OTS software used, specify the following:
- Title and manufacturer of the software
- Version level, release date, patch number, and upgrade designation as appropriate
- Any software documentation that will be provided to the end user
- Why this OTS software is appropriate for its intended use

2. What are the computer system specifications for the OTS software?
For what configuration will the OTS software be validated? Specify the following:
- Hardware specifications: processor (manufacturer, speed, and features), RAM (memory size), hard disk size, other storage, communications, display, etc.
- Software specifications: operating system, drivers, utilities, etc. The software requirements specification (SRS) listing for each item should contain the name (e.g., Windows 95, Excel, Sun OS, etc.), specific version levels (e.g., 4.1, 5.0, etc.), and a complete list of any patches that have been provided by the OTS software manufacturer.

3. How will you assure appropriate actions are taken by the end user?
- What aspects of the OTS software and system can (and/or must) be installed/configured?
- What steps are permitted (or must be taken) to install and/or configure the product?
- How often will the configuration need to be changed?
- What education and training are suggested or required for the user of the OTS software?
- What measures have been designed into the computer system to prevent the operation of any non-specified OTS software, e.g., word processors, games?

4. What does the OTS software do?
What function does the OTS software provide in the computer system? Specify the following:
- What is the OTS software intended to do? The design documentation should specify exactly which OTS components will be included in the design of the computer system. Specify to what extent OTS software is involved in error control and messaging in the computer system.
- What are the links with other software, including software outside the computer system (not reviewed as part of this or another application)? The design documentation should include a complete description of the linkage between the computer system software and any outside software (e.g., networks).

5. How will you know the OTS software works?
Describe testing, verification, and validation of the OTS software. Software test, verification, and validation plans should identify the exact OTS software (title and version) that is to be used in the computer system. When the OTS software is tested, it should be integrated and tested using the specific OTS software that will be delivered to the end user.
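The SRS listing called for under question 2, and the startup verification asked about under question 6, can be combined into one machine-checkable manifest. A minimal sketch; the component names, versions, and patch identifiers below are hypothetical.

```python
# Sketch: a machine-checkable form of the SRS listing (question 2),
# reused at startup for the verification described under question 6.
# Component names, versions, and patches are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class OTSComponent:
    title: str
    manufacturer: str
    version: str
    patches: tuple = ()   # complete list of vendor patches applied

# The approved configuration: exactly what the system was validated with.
APPROVED = {
    "AcmeHistorian": OTSComponent("AcmeHistorian", "Acme Corp", "5.2", ("P01", "P03")),
    "ReportGen":     OTSComponent("ReportGen", "Initech", "2.0"),
}

def verify_startup(installed):
    """Compare the installed inventory (title -> OTSComponent) against
    the approved manifest and return a list of discrepancies. A
    non-empty list means: warn the operator and shut down to a safe
    state rather than run an unapproved configuration."""
    problems = []
    for title, approved in APPROVED.items():
        found = installed.get(title)
        if found is None:
            problems.append(f"{title}: missing")
        elif found != approved:
            problems.append(f"{title}: installed {found.version}{found.patches} "
                            f"!= approved {approved.version}{approved.patches}")
    for title in installed:
        if title not in APPROVED:
            problems.append(f"{title}: not in approved configuration")
    return problems
```

In a real system the manifest would live in a controlled document, and the same record would be cited in the test plans so that testing is demonstrably performed against the exact titles and versions that will be delivered.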
Additional questions include:
- Is there a current list of OTS software problems (bugs) and access to updates?

6. How will you keep track of (control) the OTS software?
An appropriate plan should answer the following questions:
- What measures have been designed into the computer system to prevent the introduction of incorrect versions? On startup, ideally, the computer system should check to verify that all software is the correct title, version level, and configuration. If the correct software is not loaded, the computer system should warn the operator and shut down to a safe state.
- How will you maintain the OTS software configuration?
- Where and how will you store the OTS software?
- How will you ensure proper installation of the OTS software?
- How will you ensure proper maintenance and lifecycle support for the OTS software?

The FDA guidance on General Principles of Software Validation describes how certain provisions of the medical device Quality System regulation, which became effective in June 1997, apply to software, and the agency's current approach to evaluating a software validation system. Validation of software is a requirement of the medical device Quality System regulation, i.e., Title 21 Code of Federal Regulations (CFR) Part 820, and applies to software used as components in medical devices, to software that is itself a medical device, and to software used in production of the device or in implementation of the device manufacturer's quality system. Although the guidance is directed at the medical device industry, it is based on generally recognized software validation principles and can, therefore, be applied to any software.

Section 2.4 of this guidance (Regulatory Requirements for Software Validation) states in part: "All production and/or quality system software, even if purchased off-the-shelf, should have documented requirements that fully define its intended use, and information against which testing results and other evidence can be compared, to show that the software is validated for its intended use." The guidance defines a requirement as "any need or expectation for a system or for its software," and goes on to state: "Requirements reflect the stated or implied needs of the customer, and may be market-based, contractual, or statutory, as well as an organization's internal requirements. There can be many different kinds of requirements (e.g., design, functional, implementation, interface, performance, or physical requirements). Software requirements are typically derived from the system requirements for those aspects of system functionality that have been allocated to software. Software requirements are typically stated in functional terms and are defined, refined, and updated as a development project progresses. Success in accurately and completely documenting software requirements is a crucial factor in successful validation of the resulting software."

DEFINING REQUIREMENTS

It should be clear at this point that the first and most vital step in defining an automated system is the definition of its requirements, i.e., its intended use. The requirements are the foundation for the system specifications and all subsequent design documents. One cannot prove that a system does what it is intended to do if just what it is intended to do has not been clearly defined. The requirements define what the system is to do rather than how it will perform a given task.

Definition of a system's requirements frequently begins with a preliminary concept of the required (and desired) functions of the new system. Through an iterative process with input from the system's users and others involved with the design and implementation of the system, the requirements are further refined in terms of required functions (needs or "musts"), desired functions (wants), data to be processed, design constraints, performance and documentation requirements,
and validation criteria. The desired functions or "wants" should be prioritized. The ability to understand both the activities being automated and the needs of the individuals or operators who will be using the system is necessary in defining the requirements. In many cases, these needs may not be known at the beginning of the project, but they must nevertheless be anticipated to the greatest degree possible.

A rigorous review and verification process is required in defining the requirements of a system, one that not only considers the needs of the end user(s) but also includes a clear understanding of the operating environment that is to surround the proposed system. Configurations that might satisfy the requirements should be considered in terms of cost; availability of required technology, facilities, equipment, and effectively trained personnel; interface with current systems (e.g., enterprise resource planning, ERP); legal liabilities; etc. Prospective vendors can also be contacted for additional information.

Requirements can be developed using a top-down process. General requirements for the automated system are established first, and then more detailed requirements are developed. In large projects, defining the requirements of each logical entity may be required. A typical requirements document could contain the following: an overview of the project and its objectives; expected benefits; and financial, time, and manpower constraints. The requirements document should describe the required and desired control functions; sources and characteristics of the input data; data manipulation and output requirements; technical, electrical, and mechanical requirements; human interfaces; desired timetable for completion of important milestones in the project; and the basis for system evaluation and validation (i.e., a summary of the general approach to validation of the automated system). Each device and/or piece of equipment included in, or controlled by, the automated system should be described in the requirements document. Block diagrams or sketches that show the physical location of the components of the system are also helpful and should be included. The requirements document should describe the sequence, timing, and scheduling of operations. The document should also include security requirements; safety considerations; specific hardware and software implementation requirements; and the level of education, training, and experience of each person who will interact with the system. Personnel (i.e., in-house experts, consultants, etc.) required or available for each part of the project, and a description of environmental factors, should be included as well. Graphical information, such as system flow charts and diagrams that show the impact of the new system on existing manufacturing functions and corporate databases, is useful in communicating requirements. Definition of the requirements (intended use) for an automated system should not be taken lightly. The quality and ease of maintenance of the system depend on the care taken at this point in the planning phase of the project.

A typical requirements document10 contains the following:
- Overview of the project and its objectives, expected benefits, as well as constraints caused by finances, time, and human resources
- Required and desired control functions
- Sources and characteristics of input data
- Data manipulation and output requirements
- Technical, electrical, and mechanical requirements
- Spare capacity
- Human/machine interfaces (HMIs)
- Schedule for desired completion of important milestones in the project
- Basis for system evaluation (in terms of performance requirements) and validation (a summary of the general approach to be used for validation of the system)
- Devices, equipment, and/or databases included in, or controlled by, the system
- Block diagrams or sketches showing the physical location of the components of the system

Because the requirements document describes the sequence, timing, and scheduling of operations, it should also include the following:
- Security requirements
Figure 4: Requirements/Specifications and PhRMA's Lifecycle Model. [Flowchart relating User Requirements, Functional Description, Contact Prospective Vendors, System Definition, Functional Requirements, Define System Requirements, Design Requirements, Requests for Proposal, Vendor Selection, and System Specification.]
In system evaluation and acceptance, the formal mechanism for judging the performance of the new system and the minimum requirements for acceptance should be identified.

Requirements versus Specifications

Each of the above lifecycle models (Figures 1-3) shows two separate and distinct steps in defining the attributes of an automated system. The first step defines the system's requirements and the second its specifications. Although the level of detail can vary, the requirements must establish the criteria for system design and testing, while also allowing for flexibility in the selection of specific hardware, software, and vendors. On the other hand, specifications provide highly detailed definitions of specific hardware components and their functions, software considerations, and the system's interaction with its operating environment; i.e., specifications define in detail how the system will meet the requirements described in the requirements document.

Figure 4 shows the lifecycle relationships and separation between requirements and specifications using PhRMA's SDLC model. The process of system definition starts with a high-level description (User Requirements) of what the new system must do to be acceptable for its intended use. Depending on the complexity of the new system, a narrative description of its intended use (Functional Description) may be extracted from the User Requirements and used to solicit information from prospective vendors on systems, technologies, and/or system components (hardware and software) that could be utilized in the development and construction of the new system. Subsequently, this information can be formulated into the Functional Requirements (i.e., prioritized required and desired functions) and Design Requirements (i.e., the new system's architecture, its operating environment, design and/or software development standards to be followed, etc.) sections of the System Requirements document. The System Requirements document, in conjunction with the selected vendors, is then used to generate a separate System Specifications document.

Despite the above discussion, experience has shown that the definitions of requirements and specifications are often incorrectly combined into a single document. Done sometimes by design and other times through the evolution of the requirements document, this practice often results in oversights of user needs and the mixing of requirements with specifications. If all or a major part of the automated system is supplied by outside vendors, more the rule than the exception with today's more complex systems, a separate requirements document is required to convey user requirements. Providing a detailed specifications document to potential vendors may lead them to rule out some viable solutions, or to attempt to satisfy the specification with an expensive, customized system.

CONCLUSION

Defining an automated system in terms of its requirements, i.e., what a system is intended to do, is the first, and most important, step in building a quality system. A clear definition of requirements, and specifications based on these requirements, results in systems that are more straightforward to construct, easier to operate, better documented, and more reliable. These systems are subsequently simpler and less costly to maintain, and vendors are better able to determine and meet user needs.

The development of a system's requirements (intended use), as well as its specifications, is an iterative process that requires effective communication among diverse disciplines. Too often the system user either is neglected or fails to participate adequately in these phases of the project. Invariably, the result is an inferior system which is difficult to learn, confusing to use, and expensive to maintain.

Although defining system requirements and defining system specifications are closely related, they should occur at two distinct points in the lifecycle. This two-step process may seem lengthy, tedious, and simply not worth the extra effort; however, taking these additional steps consistently proves to be time well spent, making validation a value-added process rather than an …
…been actively involved with validation issues for more than twenty-seven years and was a member of the Pharmaceutical Research and Manufacturers of America's (PhRMA's, formerly PMA's) Computer Systems Validation Committee for several years. He was also a member of the PDA's Computer Validation Committee that published PDA Technical Report No. 18 on Validation of Computer-Related Systems, and has presented and published several papers on the subject of validation. Dr. Stotz holds a doctoral degree from the University of Florida and B.S. and M.S. degrees from the University of Toledo. He can be reached at (610) 594-2182.
Lessons Learned in a Non-Regulated Software Validation Project
By Brian Shoemaker, Ph.D.
Figure 1: List of Applications in the UnderOver Project
Figure 2: The Standard V Model
be unnecessary and impractical, since the applications were already in use. The decision was to document the "as built" design of these applications, to serve as a baseline for subsequent change control.

Because thirteen of the custom applications were either Microsoft Access or Lotus Notes databases, documenting their design required more than an annotated code listing. Fortunately, several tools will generate complete reports of all tables, forms, queries, reports, modules (program code), and macros (if any) in an Access application; a similar utility exists within the Lotus Notes development environment. These outputs could be automatically generated, printed to PDF, and archived to capture the complete design of the database applications.

It also proved necessary to adapt the concept of installation qualification (IQ) to provide useful information for this project. The applications were already in place, so a detailed procedure to confirm that they were being installed correctly would have no meaning. However, documenting the specifics of how and where the applications were installed would have considerable value for future software maintenance. Instead of performing IQ, a so-called Configuration Specification was created for each application, to document anything a programmer or information technology (IT) specialist would need to know in order to reinstall, maintain, or decommission the application. Figure 3 lists examples of the types of information to include in these Configuration Specifications.

In both the "as built" design documentation and the Configuration Specifications in place of installation qualification, this project bent the classical validation products - and in so doing, fulfilled the project purpose.

Be Ready to Delve into Technical Specifics, Even if These Should Be the Province of Developers and Architects

Consider the shop-floor workflow application (dubbed "ManuFlow" for this discussion). This system consisted of an off-the-shelf container
Figure 3: Information to Include in Configuration Specifications
- Version of any underlying system (Access, Excel, Lotus Notes)
- Configuration options (where applicable, such items as default file-save directories, user security settings, or compatibility switches)
- Resources required on the user's station (e.g., client-side program, browser plug-in, mapped drives, or shortcuts)
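The Figure 3 items lend themselves to a structured template that can be copied and filled in per application. A small sketch follows; every field value in it is a hypothetical example, not taken from the actual project documents.

```python
# Sketch: the Figure 3 information captured as a per-application record.
# All values below are hypothetical examples.

CONFIG_SPEC_TEMPLATE = {
    "application": "PP_Sched",                      # one entry per application
    "underlying_system": "Microsoft Access 2000",   # version of Access/Excel/Notes
    "configuration_options": {
        "default_file_save_directory": r"\\server\validated\pp_sched",
        "user_security": "read-only for production floor accounts",
        "compatibility_switches": "none",
    },
    "user_station_resources": [
        "Access runtime client",
        "mapped drive S: to \\\\server\\validated",
        "desktop shortcut to the application front end",
    ],
}

def missing_fields(spec, template=CONFIG_SPEC_TEMPLATE):
    """List template keys a draft Configuration Specification leaves out,
    so incomplete drafts are caught at review time."""
    return sorted(k for k in template if k not in spec)
```

A template like this makes the review question mechanical: a draft is complete when `missing_fields` returns an empty list.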
with drivers for barcode scanners and a desktop interface, but within which any given company had to build its own suite of workflow scripts. The container was a commercial off-the-shelf application, but all the functionality of the system resided in the custom-developed scripts. Complete User Requirements for these scripts were virtually impossible to build from user interviews, since (a) the manufacturing-floor users were too close to the functioning system, and had difficulty expressing what the ManuFlow application should do; and (b) the scripts, which had been inherited from the parent company, underwent extensive revision (in part to clean out unused scripts and code) in the course of the project.

Determining both the user requirements and the overall design of the workflow script suite became an exercise in reverse engineering. The IT Director provided automatically generated flowchart diagrams for all of the scripts. From these diagrams, the connectivity of the scripts could be determined (which ones comprised the main menu, which ones were called by the main selections, and so on down the hierarchy - see Figure 4), and the general actions occurring in each script could be puzzled out. The developer provided brief synopses of the scripts, but deducing the important logic tests and user inputs required studying the flowcharts in detail. This kind of down-in-the-code study is not typically expected of a validation consultant, who may or may not be familiar with the program language, but for this project it was vital.

Help Solve Specific Technical Issues Where Necessary

Several times in the course of the UnderOver project, it was necessary to help the project team see that a certain output was manageable, and not some insurmountable obstacle. Creating design documentation was a prime example. The UnderOver team leader at first quailed at the task of documenting the design of the Access databases. After researching Access documentation tools, it was possible to recommend several possibilities, and to list the essential information such a tool would need to provide. With these suggestions, what seemed unattainable became a fairly straightforward task. Once the Access examples had been generated, the Lotus Notes developer could see the type of information that would be needed, and employed built-in developer tools to create equivalent outputs for the Notes applications.

Learn the Client Systems, and Adapt to the …
Figure 4: Connectivity of the ManuFlow Scripts
Figure 5: Interdependence of ERP / Access Databases (ManuFlow, Plan_Sampl, Com_ERP, ProdSched, PP_Sched)
Figure 6: Interdependence of Lotus Notes Databases (CO_Sys, RD_Track; new product change order, new product setup sheet)

…sential throughout the project. How far the work had come and how far it had yet to go, when the next site visit was scheduled, what issues needed to be addressed: all were featured in a brief weekly update. After completing several User Requirements documents, a table was added to the weekly report to list the applications being validated and the status of each document (requirements, configuration specification, design, test procedures). A glance at the table quickly told team members both the work which had been completed and the tasks still ahead.

Eventually, the UnderOver team leader asked that the company president be copied on these weekly reports. This visibility to top management as well as to the project team helped immensely in planning activities, keeping focus on the issues, and allowing team members to see and celebrate how far they had come.
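A status table of the kind described above is easy to generate mechanically once the per-application statuses are recorded. A minimal sketch; the application names echo the figures, and the statuses shown are hypothetical.

```python
# Sketch: the weekly status table, one row per application and one
# column per deliverable. Statuses below are hypothetical examples.

DELIVERABLES = ["Requirements", "Config Spec", "Design", "Test Procedures"]

status = {
    "ProdSched": {"Requirements": "approved", "Config Spec": "draft",
                  "Design": "draft", "Test Procedures": "not started"},
    "RD_Track":  {"Requirements": "draft", "Config Spec": "not started",
                  "Design": "not started", "Test Procedures": "not started"},
}

def status_table(status, deliverables=DELIVERABLES):
    """Render the application/deliverable grid as plain text suitable
    for pasting into the weekly report."""
    width = max(len(app) for app in status)
    header = " " * width + " | " + " | ".join(f"{d:<14}" for d in deliverables)
    rows = [header, "-" * len(header)]
    for app in sorted(status):
        cells = " | ".join(f"{status[app].get(d, '?'):<14}" for d in deliverables)
        rows.append(f"{app:<{width}} | {cells}")
    return "\n".join(rows)
```

Even a plain-text grid like this gives the team the at-a-glance view the article describes: completed work on one side, remaining tasks on the other.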
Figure 7: Overview: Phases of UnderOver's Manufacturing (Preparation, Manufacturing, Post-processing; Incoming, Intermediate, Rough Product, Finished Product)
In a number of cases, the tester believed that actual results disagreed with expected results and had to be marked as "fail"; some of these were true failures, some resulted from performing the test incorrectly, and some showed that the corresponding requirement was erroneous or had not been implemented. Figure 9 lists several cases where an initial "fail" result was changed to "pass" (or "not applicable") on review. Of twenty-five cases marked "fail", only eight remained after review (of course, all were explained in the project final report).

Happily, the UnderOver tests did not include any cases where the tester counted a result as a "pass" when in fact the result should have been marked "fail". These have arisen in other projects, and are often very difficult to communicate to the tester or developer (to the point of causing disagreements within a project team).

INTERPERSONAL LESSONS: DIFFICULT TO DEFINE, BUT VITAL TO SUCCESS

Computer software is working machinery created from pure "thought stuff" - both ethereal and practical, but completely objective. Human beings need to use that machinery, however - and working with a team of human beings on the task of showing that the thought stuff is correctly built taught several powerful lessons in relating to human beings.

Learn How to Listen

This project team taught the validation consultant some crucial listening skills. During the very first site visit, as key users described the Prod2Spec application (see Figure 1), important statements for the User Requirements emerged either as logical conclusions or as implicit assumptions. Restating these apparent requirements ("So what you're saying is that the program needs to keep track of XYZ - is that correct?") allowed refining the points that would be documented. In only one case could the person interviewed express an application's user requirements without assistance - and that person had developed the application.

Simply listening also contributed to the testing phase. More than one team member asked for help while executing a test procedure; these requests identified a number of script errors or true application failures. In one case, the frustrated
Figure 8: The UnderOver Project Plan, in Flowchart Format (Review/Revise Applications List; Risk Analysis; Document: User Reqmts, Risk, Configs; Conduct CSV Training; Develop SOPs; Assemble SOPs; Project Closure Report)
tester working on the ProdSched test pointed out a confusing section, talked through a series of steps, opened the dialogs described in the test, and answered her own question without the test coordinator ever saying a word! Her comment: "I guess all I really needed was to have you listen to me."

Figure 9: List of Initial Test Failures and Reasons for Changing to "Pass"

Each Individual on a Small Team Has a Different Communication Style

Some team members could write a requirements document with very little input. Others could describe their application verbally and provide nearly enough information to deduce the requirements. Still others had to show the user interface, live or as screen captures, in order to explain what they expected the program to do for them.

Similarly, some could read the numbered User Requirements and judge whether these were correct; others needed to talk through the document with reference to the program itself, to provide feedback. Each application, and each team member, required a different type of communication to make sure the documents were correct - and in the end, those few errors which remained either in the User Requirements or in the test documents were the result of still less-than-perfect communication.

Show Patience as Timeframe and Priorities Change

Throughout the project, the UnderOver team was clearly stretched. The ten participants were nearly the entire salaried staff at the company; all of them needed to address not only their everyday work but other priorities in addition to software validation. Customers visited the plant, quality audits needed to be performed, production reviews were held on a regular schedule, and other areas of training had to be addressed. One team member's husband fought a losing battle with cancer through most of the project, and died just as the testing was to begin.

Against this backdrop, the initial project timeline proved unworkable. No amount of chastising via email would help team members provide timely feedback on requirements documents or test procedures, and site visits were only practical every few months. Indeed, roughly six months after project kickoff, all work on the project ceased for six weeks.

Friendly prodding sometimes yielded results, but when the validation project stalled, the only reasonable response was patience. The team's silence did not mean that the project had been abandoned - rather, that other issues had taken the foreground for a time. Patience and confidence paid off; when the project resumed, the team's focus was even sharper than before.

Lead the Team Where Necessary, but Let the Client Team Leader Address Internal Issues

An internal manager led the UnderOver team, providing resources where needed and keeping top management apprised of the progress. On technical and organizational issues - how to organize the requirements documents, whether a test procedure would be workable, outlining the needed procedural documents, updating the project schedule, and keeping all informed of progress - the validation consultant worked directly with UnderOver team members, and in time with the contract software developers. In nearly all cases, this direct interaction worked exceedingly well.

No project is free from snags, however. One application proved even more difficult to characterize than the rest, perhaps because there had been no opportunity to meet the keeper of that program in person at the outset of the project. Early information was helpful - a flowchart, a number of screenshots, some amount of explanation - but filling in the holes became problematic. When obtaining information became difficult and communication strained, work on this application was set aside for several months. The validation consultant could only use collegial influence, and had no way to coerce this individual or the programmer, so the matter was referred to UnderOver's internal team leader.

Getting cooperation took time, but referring the issue to internal management was precisely the right choice. Beginning with a surprise telephone call one Friday afternoon, the floodgates opened: screen captures arrived, questions were answered, and this corner of the validation project was back in motion.

THE ULTIMATE SUCCESS: LESSONS LEARNED ON BOTH SIDES

From start to finish, the UnderOver project took roughly nineteen calendar months (bearing in mind the hiatus mentioned above). During that time, several new employees came on board at UnderOver, several applications underwent major changes, and specific programs were added to and removed from the project.

Validating these twenty-one programs (Figure 1) helped the UnderOver team members see software not as a magic genie, always doing the master's bidding flawlessly, but as an engineered product, designed to serve a purpose but limited by the developer's fallible understanding. Getting down to basics - writing down what a program should do, then testing to be sure that the program works as intended - has encouraged these team members to watch for possible future errors. From there, the software problem reporting work instruction gives them a mechanism for reporting those errors.

For the validation consultant, this project was at least as much of a learning experience. How to listen, how to persuade, how to determine different communication styles, and how to keep an entire team informed as a large project moved forward: all were skills the UnderOver interaction helped sharpen.

NOTE: Access, Excel, and Visio are trademarked products of Microsoft Corp. Lotus Notes (also called Notes) is a trademarked product of Lotus Development Corporation.
ABSTRACT
Few would argue that the principles and processes of Validation (Big V) have undergone some transformative alterations over the last few decades, and more recently as we entered the 21st Century. In fact, it was in the very name of the 21st Century that the US Food and Drug Administration provided the nudge (1). In this article, I review what could be seen as the three great forces that today form the discourse (language, acronyms, assumptions) on Validation; three discernible influences or factors that are shaping the universe of Validation. For one, we can witness a move toward a more probabilistic as opposed to deterministic paradigm, loosely articulated under the terms of a risk-based approach. Secondly, at least in the world of computer validation, the collision of Validation & Verification (V&V) with the x-Qs has opened a new space of dialogue between disciplines which in the past did not have much occasion to talk. Finally, our third influence is the ever-changing landscape of quality system discourses: from total quality management (TQM) to the capability maturity model (CMM). FDA's Quality System Initiative has provided a new framework from which to view Validation. As a consequence of these three forces, the field of computer validation has become a repository of poorly-articulated acronyms (FMEA, QbD, UAT), some hybrid expressions (Lean CMM), as well as a curious new-speak ("valudation" and "lean validation").

The concepts (ideology) and practices (rites/rituals) of validation have historically elicited fear and awe among the uninitiated. For those who have not experienced the rites of passage (redlining a P&ID), validation is perceived as an obscure (perhaps dark) science/séance. Today, we increasingly run the risk of promoting such misunderstandings, when in fact validation boils down to something quite simple. As the antidote to these speculative discourses, I propose, in this article, a return to the basics of Validation; basics that are primordial if we are to effectively navigate the new-speak of Validation: Validation in the new-clear age.

INTRODUCTION
In 2001, I wrote a Hitchhiker's Guide to the Universe of Validation. It was a tongue-in-cheek introduction to the culture of validation: an ethnography of quality engineering. It was written for the uninitiated, that is to say, for those who had not yet had the pleasure of stringing thermocouples. The document highlighted what seemed, at the time, the crucial concerns of our industry: Part 11, not surprisingly, at the center of the mix.

But much has transpired in the field of validation since then, especially within the field of computer systems validation. For that reason I feel compelled to write or, more accurately, compile varying thoughts and opinions on the topic of computer validation.

The premise behind this collection of thoughts is that validation, today, is substantially different from what it was in the formative 1970s, or even more recently as practiced in the name of Electronic Records; Electronic Signatures (ERES, a.k.a. Part 11) compliance (2). Without anticipating too much of what follows, I can safely say that today we find ourselves at a crossroads whose historical outcome has yet to be written (3).

In fact, one can discern at least three influences or factors that are shaping our universe. The first such force is a move toward a more probabilistic as opposed to deterministic paradigm, loosely articulated under the terms of a risk-based approach. Ever since FDA opened the door to a risk-informed approach to validation in 2003 and tied it (at least thematically, if not effectively) to GMPs for the 21st Century, there has been a shift from a structured, determined, causal worldview (design qualification [DQ] leads to installation qualification [IQ], which begets operation qualification [OQ], which ...) to a genealogy of many worlds and parallel universes (4). Risk-based validation has come to embrace the quantum mechanical insight that you can't have both position and velocity without some level of uncertainty: a new world where stochastic modeling (probability of failure) is a better gauge than causal, mechanical determinacy. The second moment occurred when FDA sanctioned the V&V model and subsequently undermined the sanctity of the x-Qs (IQ, OQ, PQ, etc.) (5). This opened a new space of dialogue between disciplines and discourse (i.e., IT and Quality) that did not have much occasion to interact. As such, translation devices were created to build bridges between the names for things: for example, regression (in the IT sense) and qualification (not in the IT sense). As a consequence of this expanding universe, the lifecycle model has come to permeate the discourse of validation. The canonical terms of validation have slowly given way to the new-speak of user acceptance testing (UAT) and regression, where even the word "performance" in performance qualification (PQ) has suffered a shift. The third influence is the ever-changing landscape of quality system discourses: from TQM to CMM. FDA's Quality System Initiative, to which we can add Six Sigma and quality by design (QbD), has provided a new framework from which to view validation. In these terms, validation is conceived as a quality system as opposed to a qualification activity. This approach has expanded the scope of validation to include upstream development activities and downstream maintenance controls. To paraphrase a brainteaser from a 2002 FDA guidance document (5): the demonstration that a system is validated extends beyond validation in the strict sense of the term. Validation (Big V) exceeds validation (IQ/OQ/PQ activities, testing). Validation has been propelled beyond the strict sense of the term and has obtained connotative nuances. The little v of testing (x-Q) has become the big V of quality controls, process, and procedure.

In summary, today we find ourselves at the crossroads of three great discourses: a theory of probability, a lifecycle model, and a systems theory approach. The conjuncture of these three idioms, I will argue, has not yet been completely and thoughtfully fleshed out. As a consequence of this incompleteness, the field of computer validation has become a repository of poorly-articulated acronyms (failure mode and effects analysis [FMEA], critical control points [CCP], UAT, QbD), some hybrid expressions (Lean CMM), as well as a curious new-speak ("valudation," "lean validation," and "risk-based validation") (6). The concepts and practices of validation have historically elicited fear and awe among the uninitiated. For those who have not experienced the rites of passage (redlining a process and instrument diagram [P&ID]), validation is perceived as an obscure science. Today, at this juncture, we increasingly run the risk of promoting such misunderstandings (7), when in fact validation boils down to something quite simple.

WHERE'S THE BEEF? FAT AND LEAN VALIDATION
In the 1990s the buzzword was "streamlining validation," as if somehow the principles of aerodynamics and the coefficient of drag (Cd) could be applied to improve the performance of validation. Turnover packages, construction qualification, and factory acceptance testing (FAT) were designed such that repetition and redundancies (in verification and testing) could be avoided, and overlaps could be leveraged. The prophecies were grand, the idea was simple: follow good engineering practices (GEP) and the area of validation will subsequently diminish (perhaps be totally eliminated). Was the implication that we had been following poor engineering practices (PEP) prior to this point in time? Expectations were high: 90% of validation would be performed by the vendor under the banner of FAT (8). The impulse to leverage development testing or FAT is deeply rooted. Arguments vary from the economic (high costs of validation) to the ubiquitous timeline imperatives. Some arguments are compelling in their simplicity: "Look, the equipment or system already works upon implementation, so why do we need additional testing?"

But project management is founded on the holy trinity of Cost-Time-Quality: three factors caught in a universal balance of power. When one expands, the other two must adapt accordingly in order for the triad to maintain its integrity and for the universal balance to be equilibrated. In the past, the discourse on streamlining validation was often at the expense of the quality role, through benign neglect and silence. Today, perhaps ironically and prophetically, it is in the very name of quality that the discourse is re-surfacing. Since the principles of time and money have never swayed the regulatory agencies, it is reasonable that the idiom or principles of quality should intervene. In fact, it is the very regulatory body (FDA) that has opened the door with its call to integrate quality systems and risk management into current manufacturing processes as the model for GMPs in the 21st Century. And the publication of ICH Q9 (Risk Management) (9) and Q10 (Quality Systems) (10) has reinvigorated the old cry to streamline validation.

But the method is no longer predominantly GEPs. Sound Scientific Principles (SSPs) are now the call to arms. Perhaps as amorphous and all-encompassing as GEPs, the SSPs are never defined. By SSP do we mean parsimony, Occam's Razor, falsificationism (Vienna Circle), and gedanken experiments? For who would argue with science (besides creationists) as the basis for demonstration and confirmation (a.k.a. validation)? We need to understand the process and the critical control points, we are told. We should monitor and control the parameters that impact quality. Define the design space. The new-speak of QbD-driven lean validation would have us believe that in the dark ages of the validation séance, we were testing in a vacuum. Is the implication here that if you test against a design specification, you have elevated your project to that of a scientific enterprise? If this new-speak of validation is to be more than a sound bite in the language game of obfuscation, it will need to be re-grounded in the foundational principles of validation.

The discourse on the science of quality and risk, specifically in the area of computer systems, has been supplemented by a third term: the lifecycle concept. Since the publication of the FDA General Principles of Software Validation (5, Section 3.1.2), it has been generally recognized that "a conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle." As such, the final conclusion that software is validated is grounded in a determination that is somewhat broader than the scope of validation in the strict sense of that term (5, Section 2).

In the strict sense of the term, validation has historically been understood as the three (or more) Qs: installation, operation, and performance qualification (IOPQ). IOPQ engineers have traditionally not ventured much into the realm of design. Although a few forays by validation into the domain of design and development have occurred, leading to such aberrations as construction qualification or design qualification packages, for the most part validation has been content to operate within its holy trinity of acronyms. But the quality systems approach to device software development, with its design review requirements, has slowly come to influence the rest of the FDA-regulated software development arena. The software development life cycle concepts, around for some time now in the software engineering disciplines (Institute of Electrical and Electronics Engineers [IEEE], Software Engineering Institute [SEI]), have come to frame much of the validation being performed today. In the biopharma industry, the good automated manufacturing practice (GAMP) model (11) has certainly influenced this direction with its V-model concept. And yet, despite the history, despite the guidance, despite the principles, there continues to be an unhappy marriage between the V&V and the IQ/OQ/PQ approaches. While it is true that the scope of validation can no longer be confined to testing (the x-Qs) - it has expanded to cover upstream activities (design reviews) and downstream processes (maintenance) - this should not imply that, since validation (object little v) is everyone's responsibility, it will be absorbed in design. The conflation of FAT (or development testing), QC, or verification with validation is bound to fail for one simple reason: testing is Janus-faced (12). FAT (or development testing) has a well-defined purpose: find problems before the product goes out the door. A successful exercise will find an abundance of issues that will be punch-listed and resolved. This testing faces inwards, towards itself, so to speak. For the exercise of validation (a demonstration that the system performs reliably), the goal is not to have issues surface at all. In fact, problems during validation are not "bugs"; they are called "deviations." This is not simply a semantic sleight of hand intended to justify the accompanying paperwork. A problem during validation testing must be assessed regarding impact on any previous testing, the criticality of the problem regarding the business process (intended use) must be evaluated, and the root cause of the problem (for it might be the tip of an iceberg) must be investigated. Imposing this overhead during the development phase of a project, or conversely taking FAT (at face value) as Validation, would transform our Janus face into a schizophrenic. In fact, this illustrates the classic definition of the double bind: find as many problems as you can while demonstrating that the system is reliable, robust, and problem-free.

This, I fear, is the risk of underestimating validation in the broad sense of the term, and of conflating testing with validation. When validation becomes sublimated in design and development, it risks becoming a parody of itself. As a consequence, one could easily imagine the emergence of validation tropes or styles. One example would be metonymic validation, where the part (partial regression) is taken for the whole (the validated state). Metonymic validation could be applied to application upgrades, by selecting functionality that is intended to represent the system as a whole. Perhaps another variation on the theme would be metaphoric validation. Here the terms (language and conditions) of validation are adopted to provide the illusion of a state of control. FAT, site acceptance testing (SAT), and the turnover package (TOP) can be infused with the essence of this state with some minor rituals such as pre-approvals or quality reviews. These rituals bring with them a whole language game which transforms the mundane into the sacred. The key to a successful metaphoric validation is to maintain the vivid imagery (the validation effect) throughout the implementation lifecycle and to represent change and flux (breakdowns as breakthroughs) as the underlying substratum of a stable foundation.

In fact, it will not be long before the parody of validation is confused with the act of validation proper: a simulacrum, more real than reality itself. We will know when validation has become truly post-modern when the demonstration that a system satisfies its intended use is achieved by simply pointing to the absence of evidence to the contrary; or when the existence or presence of an installed application is merely confirmed through the existentialist cry "I am here" (a.k.a. the splash screen/scream). That will be the day when distinctions between retro- or pro-spective give way to the post-spective (or
Validation is not measured by the binder, the page, or the kilo; although that has certainly been used as a strategy when the terms were misunderstood. When clarity is lacking, the best strategy is to obfuscate, thus raising the bar and upping the ante. Many validation packages are, in this sense, a bluff and a confidence game.

DEFINING THE VALIDATION SCOPE
The first question to be asked of any validation project (once we have understood the terms of validation) is what philosophers call the ontological question, and takes the form of "What is?" This question is particularly important for a computer system/application implementation. Manufacturing equipment (e.g., a lyophilizer) might not provide the best illustration of the challenges in properly coming to terms with this ontological question. After all, the boundaries of a piece of equipment are usually demarcated by the utility connections at the skid (and by the skid itself as physical frame). Or again, the P&ID clearly defines the system boundaries, often conveniently on a single drawing. What the thing is can be effectively walked down, empirically validated. Not so for a computer system, where defining the system and its boundary can be an art form; if not properly executed, things can get very ugly. In fact, the problem of validation today is defining practical boundaries or scope for the effort at hand (planning). The era of the stand-alone processor, where the extension of hardware could easily define the limits of The System, and consequently the boundaries of validation, is but a fleeting memory. With enterprise applications, storage area networks (SAN), virtual servers, CITRIX, and inter-NETed applications, the proper definition of the term system becomes crucial. In fact, the elements of "computerness" (18) will vary with respect to where this line is drawn. Even if the line is a decision, defining the system-ness of the system (i.e., the qualities of being a system) is the first step in the act of characterization. Complexity, control, and perhaps even elements of criticality will vary in response to how we have circumscribed the system, and the boundaries we have defined.

So, for example, does the extract transform load (ETL) integration between the enterprise resource planning (ERP) and manufacturing execution system (MES) become part of the ERP or MES boundary? Should network and infrastructure components (e.g., switches, routers, clusters, SAN) and support software (e.g., CITRIX, Perl) be incorporated in an infrastructure qualification (i.e., leveraged by individual systems) or included in the boundary of the system proper? How dissimilar in design (i.e., commercial-off-the-shelf [COTS] vs. custom) and sourcing (e.g., software vendor, in-house development) can applications be before the monolithic (The Manufacturing System = DCS+MES+ERP) approach exceeds its coefficient of elasticity and becomes unmanageable? All of these are scoping questions that have a lasting impact on the maintainability of the validation throughout the lifecycle of the application.

Certainly in this day of hyper-integration, a single system (e.g., laboratory information management system [LIMS]) is only six degrees of separation from all other GMP applications. This fact does not, however, legitimate the lumper's desire to conflate disparate components into one hegemonic classification. It is not uncommon to see systems designed as four functional components (a COTS automation component, a custom integration application, reporting tools, and a data warehouse), validated in two parts (automation and information), and maintained (change management) as three (with, of course, overlap to other systems not originally identified in the project plan). In the absence of an integrated approach to the scoping of an application (between design, use, and maintenance), the SDLC deliverables (quality records) become dissociated and no longer traceable.

Appropriately scoping a validation project requires input from at least three functions: the technical function to determine the parts (i.e., hardware and software) and their interactions (i.e., interfaces/integrations); the user community to define the intended use and operational environment (i.e., people and procedures); and the quality organization (i.e., validation, change control) to manage the paper trail (i.e., change control, revalidation, discrepancies) from baseline deployment. If system design (hardware and software) does not provide compelling reasons to establish boundaries around an application, then one should turn to intended use. Boundaries established around intended use make the exercise of validation more effective and defensible (insofar as validation is often equated with the demonstration of intended use) (19). Yet in the era of enterprise applications, which cut across multiple functions, business processes, and predicates, it is often difficult to demarcate clean boundaries based on intended use. This is where one turns to the quality systems. Change control (and discrepancy management) should also influence the choice of a boundary. Once baselined and deployed to production, a validated application will require perfective (upgrades) and corrective (bug fixes) maintenance. All of the associated records (e.g., design changes, code, testing, change control, corrective action and preventive action [CAPA]) will need to be maintained and immutably linked to the defined system.

SIX Cs VS. SIX SIGMA
Scoping is the first step in defining what a system is by establishing the boundaries of a system. But scoping does not necessarily provide an approach to validation. It delimits the territory but does not describe it. The next step is what could be called the characterization of the system. The purpose of characterization is to assist in the development of a detailed description of a computerized system - its intended use, integrations, and dependencies - in order to establish the basis for a risk-informed approach to validation. Its intent is four-fold: to a) locate a computerized system within its regulatory environment (applicable regulations), b) describe its intended use (business process), c) outline its architectural design, and d) map the system data flow (integrations and dependencies). This information can be gathered and analyzed for the purpose of documenting the technical, business, and regulatory risks associated with an electronic records management or process automation system.

Characterizing a computer system (for the purpose of validation) can be accomplished through the elaboration of three domains: intended use and regulatory context, system design, and context of operation. These three domains capture what is fundamental about a computer system, what I have come to call "computerness." The following briefly describes what is involved in the act of characterization:

Intended use and regulatory context. In this domain, information regarding the business process, predicate rules, as well as data criticality is described. The purpose of this domain is to clearly document how the system is used to support a business function and/or regulated activity and to determine the extent to which controls over e-records/e-signatures need to be implemented. The use of a computer system has multiple dimensions that can impact its successful implementation in a regulated environment. The first dimension, which we could call GXP impact, can be broadly assessed as the degree to which the application/system influences or affects the quality, safety, purity, or effectiveness of a product; or, by extension, how that application affects or impacts the statements/claims to safety, quality, purity, and effectiveness. These claims can be found on product labels, certificates of analysis (CofAs), and safety reports. Determining the impact of a system on product/information quality defines the Criticality of use.

The second dimension of use, the predicated use, defines a system in relationship to the records identified in the Code of Federal Regulations (CFRs) and company standard operating procedures (SOPs). How a system is used to create, modify, store, or transmit such records needs to be defined. In addition, in complex systems, the predicated use also outlines the functional boundaries of a system, which may have corresponding organizational structures (roles to features). Defining the predicated records and business processes that are satisfied or controlled by a system provides the Context of use.

System design. The purpose of this domain is to characterize the risks to application data derived from the technical design of the system. This entails a review of system/architecture diagrams, integrations, and dependencies, as well as flow of data (input/output [I/O]). The design of a system is a significant contributor to the ability of an application to satisfy its intended use. Risks associated with design may be related to performance, user interface, or platform stability/compatibility. The first dimension of system design is Complexity. System complexity comes in many forms, including technical and organizational. But complexity is not simply a function of the number of I/O or branches in an algorithm. System dependencies and integrations (with their corresponding information flows) contribute to complexity risks. Functions, roles, menus, features, screens (and their interdependence) also contribute to the complexity factor. The second dimension of design that requires characterization involves Control over the records a system manages. The control element of system design covers both the logical and physical security risks associated with the storing and transmitting of (network, internet) electronic records.

Context of operation. The context of operation includes a description of the context or environment of operation. In this domain, issues regarding system/data security and confidentiality are addressed. Business continuity and system support requirements are defined. The purpose of this domain is to define the procedural controls necessary to operate the system in compliance with its business criticality, regulatory impact, and technical complexity. The accuracy, integrity, attributability, and security of system data are not simply a function of its use and design. Systems are interactive and dynamic; they undergo use, abuse, and change. How these Conditions of operation are designed and implemented will directly affect the system's ability to perform its intended use. Operations such as data backup and recovery, procedural controls over use, change control, and the management of system issues are all contributing factors to the environment of a system. System characterization must address the conditions of operation as a contributing risk element. Finally, the sixth element of computer characterization that can affect risk of use is the Confidence factor. Confidence (in the human and statistical sense) in a computer system can be derived from a variety of sources, including the maturity of the product, information on the vendor (through audits, for example), and the product itself (i.e., level of documentation available). All of these are mitigating factors in the determination of risk. Validation effort will be (inversely) proportional to the level of confidence in the system. Although confidence can be subjective (and often misguided), it is important to document these factors in the overall definition of risks associated with the use of a computer system.

Once the six Cs (criticality, context, complexity, control, conditions, confidence) have been documented, the application of a risk methodology can be achieved (20). I don't claim this approach to be uniquely novel, or an untimely meditation. In fact, the documentation of intended use or criticality has been a central activity of most validation planning since time immemorial. More often than not, however (from my experience), this thought process, which is central to defining a validation strategy, rarely finds its way to paper. Rationale for testing is either undocumented (project decisions long since forgotten) or based on some shaky foundation (i.e., tautological). To evaluate risk before a system has been adequately scoped and characterized is tantamount to placing the cart (of risk) before the horse (of system). And yet this is not an uncommon scene, where we encounter fully-developed risk assessments without a clear definition of the system scope or characteristics.

RISK MAY CAUSE FAILURE, BUT SUCCESS CANNOT COME WITHOUT IT (21)
"Risk management is a complex subject because each stakeholder places a different value on the probability of harm occurring and on the detriment that might be suffered on exposure to a hazard." (22)

Operating under the influence of the three constellations defined above (risk, quality, and life-cycle) provides some interesting challenges. If the call to a risk-based approach is to be anything more than an empty signifier in a marching order, we must better understand how and where we can apply this approach. In my capacity as an auditor, I have reviewed many sophisticated (both from a process and mathematical perspective) risk assessment methodologies that come to the trivial and uninformative conclusion that, for example, an MES system is a critical, high-risk, GMP system, indeed. The ultimate irony of this exercise is that it simply leads, more often than not, to a classification without consequence. The system gets assigned a category (usually a 1), a check box is filled, a paragraph entered in a validation plan, and voilà, instant risk-based validation.

The purpose of characterization, however, is not simply to catalogue a system within a pantheon of applications (the naturalist impulse). Characterization must provide insight and justification for the control strategies (technical, management, operational) selected to ensure that system records are secure, accurate, have integrity, and are attributable. System characterization informs the validation strategy and the risk assessment. As such, the characterization documentation must be risk informed. A risk-based approach to system characterization must identify the particular potential vulnerability of the system under investigation. For each of the six domains defined previously - criticality, context, complexity, control, conditions, and confidence - the analysis will identify those elements that affect the risks posed by the system to product quality, data accuracy, security, etc. "Risk informed" means that each element (e.g., the number of functions and/or users) in the characterization of a domain (e.g., complexity) can be assessed regarding its relative risk factor (and consequently assigned a value on an ordinal scale or a rank on a relative scale).

Typical risk strategies begin with the application of functional requirements (set of functions and features) to plot the probability, consequence, and detectability factors associated with a function/requirement. This approach provides a clear trace from requirements, through risk assessment, to testing strategy. The problem, however, is that it focuses predominantly on intended use (criticality), to the exclusion of other risk factors (such as application design, data flow, and condition of operation) identified previously as key risk contributors. The general problem with these approaches is that they begin at the end, as if to reverse engineer a desired outcome or an apodictic truth (the self-evident). The process, however, must begin at the beginning. It begins with scoping, proceeds through characterization, and concludes in a strategy that is risk informed. I am not advocating here a particular methodology - Hazard Analysis and Critical Control Point (HACCP), HAZOP, FMEA, Fault Tree Analysis (FTA) - for documenting hazards, faults, or effects. In fact, I would warn against the fetishism of method. Too often the method overshadows the process and takes on a life of its own. Whether or not an FTA is preferable to an FMEA is less important than knowing what the object of investigation is (recall the ontological question). The method is only as good as the staging or prework. That question is, of course, What is Validation?

CONCLUSION
In this article, I have tried to return to the primal question "What is?", using the computer system as the object in question. I have taken up this topic because validation in the new-clear age finds itself at the confluence of three great forces: the revival of quality systems discourse, a probabilistic approach structured around an evaluation of risk, and an integrated perspective under the framework of the lifecycle. This constellation, one could argue, constitutes a new paradigm for the new-clear age of validation (although its history has yet to be written - comedy or tragedy).

Without a clear articulation of the fundamentals (e.g., definition, scoping, and characterization), the act of validation runs the risk of being lost in sound bites such as "streamlining validation," "Lean Validation," or "risk-based validation": expressions without consequence. Without a clear understanding of the basic tenets and first principles of validation, we will never reach the heights that these narratives promise. If we are truly to benefit from the great forces that today shape our universe, we must not forget our origins, even if they are only myths.

ENDNOTES
1. FDA, Pharmaceutical cGMPs for the 21st Century - A Risk-Based Approach, Final Report - Fall 2004, September 2004.
2. FDA, 21 CFR 11, Electronic Records, Electronic Signatures,
The final product can take many forms. My preferred Final Rule (20 March 1997).
approach is a document (stand alone or part of validation 3. Although I am no longer a card carrying member of the
plan) outlining the individual risk factors for each do- profession, having retired my thermocouples some years
main as a narrative description, with the corresponding ago, I would like to retain the form of the we throughout
risk mitigation strategy (controls, test strategy, etc.) that this paper.
will be implemented. I am not a proponent of quan- 4. By recording dates I run the risk of getting embroiled in
tification, and am more easily swayed by clear exposi- (false) historiographic debates over origins and first encoun-
tion or rationale. My personal bias, however, should not ters. I am only interested here in the confluence of forces
sway others from embarking on a model that would rank that are driving present terms and future directions. I am not
system risks along the domain defined previously. This interested, here and today, in cataloguing the first sighting of
relative ranking could trigger (pre-defined) strategies such a risk-informed validation that may have occurred in 1984.
as do nothing because risk is acceptable, to implement 5. FDA, General Principles of Software Validation; Final Guid-
procedural controls, monitor and report, or demonstrate ance for Industry and FDA Staff, 2002.
mitigation of risk through a formal test protocol. One 6. My argument, here, and the claims that follow, is not that
word of caution, however, is that the ranking should be FMEA or Hazard and Operability Analysis (HAZOP) can
tied to a strategy of control, otherwise it is without con- not be successfully leveraged to better understand system
sequence. A second word of caution is that, against the risks and points of failure (vulnerability), nor that a QbD
scientific precept (metaphysics), data do not speak for approach will not help us better focus our validation ef-
themselves. There is always an interpretive overlay that forts, but rather that a failure to attend to the first principles
makes sense of the facts (as we call them). If we are not of validation (outlined below) will ineluctably lead to the
to be seduced by our own ventriloquism, we must take diminution (or dilution) of their impact.
care not to misinterpret the risk score (quantifier) as the 7. An essay that best encapsulates this misunderstanding is
solution to the problem, as the end state in the analysis; (ironically) the 2005 ISPE white paper on Risk-Based Quali-
as if somehow the number (e.g., 42) could be a response fication for the 21st Century.
to a question that was never (and neednt be) asked. And 8. Many equipment vendors (and software vendors) today pro-
vide their own set of test scripts and validation protocols, is the fallacy of the empiricist who takes individual occur-
which for a small fee can be executed by the company for rences and events as general categories. Validation is not
instant validation (gratification). Pre-packaged protocols/ fundamentally an ontological exercise, it is exegetical. And
scripts sold with equipment or software can be useful as a yet it cannot escape this first act (and its trappings).
smoke test to confirm that the installation was successful, 18. Reference to Talking Heads - True Lives.
but rarely do they provide an adequate basis for validation. 19. FDA, Guidance for Industry Part 11, Electronic Records;
The reason is simple: These packages cannot provide ad- Electronic SignaturesScope and Application, 2003. In the
equate challenges without running the risk of significant Part 11 Scope and Application document (2003), FDA has
deviations. This is especially true for applications that are provided another compelling reason to link use (regulated
highly configurable, with multiple, complex, and divergent activity, predicate requirements) with validation, by suggest-
final states. The adaptive response is to provide a vanilla ing that validation might be optional (or reduced) if the
package, which has already been pre-tested at the factory to business process (and the corresponding record risks) can
guarantee success. be shown to be minimal.
9. ICH, ICH Q9: Quality Risk Management, November 2003. 20. I am not married to 6Cs, it could just as well be 4Rs or
10. ICH, ICH Q10: Pharmaceutical Quality System, September 3Ps. The point of this exegesis is that one cannot validate
2005. what one has not defined. And by extension, one cannot
11. ISPE, GAMP Guide for Validation of Automated Systems, validate well, what has not been well characterized. In my
Volume 4, 2001. ISPE, GAMP Volume 5, A Risk Based Ap- career, I have audited many validation protocols that failed
proach to Compliant GxP Computerized Systems, 2008. to describe the system in a manner that would enlighten
12. Janus, the Roman God of gates and doorways. that validation effort. Defining a LIMS as a Laboratory In-
13. One could easily discount my examples of validation tropes formation System is a truism; at best uninformative, at worst
as mere fantasy or exaggeration. Unfortunately, on more a platitude.
than one occasion, I have reviewed validation packages that 21. Actual fortune cookie wisdom.
can only be described in these terms. 22. ISO 14971, Medical DevicesApplication of Risk Manage-
14. The term empty signifier is used here to represent expres- ment to Medical Devices, 2000. JVT
sions (or acronyms) that are no longer grounded in the
history and traditions of a discipline, but circulate freely as ARTICLE ACRONYM LISTING
banners and call to arms. Because they are not grounded CAPA Corrective Action and Preventive Action
(weighed down) with the gravity of practice they can be CCP Critical Control Points
exchanged without consequence (FAT = OQ, UAT = PQ, Cd Coefficient of Drag
Design = Test). The terms are interchangeable, not as a func- CFRs Code of Federal Regulations
tion of an economy of signs (a formal exchange value), but CMM Capability Maturity Model
as a function of their propinquity or strange attraction. CofAs Certificates of Analysis
15. FDA, Guideline On General Principles of Process Valida- COTS Commercial-off-the-Shelf
tion, May 1987. DQ Design Qualification
16. There is some debate as to whether the relative pronoun ERES Electronic Records Electronic Signature
that was intended here, in which case the auxiliary clause ERP Enterprise Resource Planning
provides a high degree of assurance is intended to be ETL Extract Transform Load
restrictive of the evidence provided. Not all documented FAT Factory Acceptance Testing
evidence counts as validation, only that which provides an FDA US Food and Drug Administration
assurance. I have modeled the sentence structure accord- FMEA Failure Mode and Effects Analysis
ingly. It is the evidence that provides an assurance. How- FTA Fault Tree Analysis
ever, one could argue that by extension, applying the transi- GAMP Good Automated Manufacturing Practice
tive property of equality, validation does also provide such GEP Good Engineering Practices
assurance, in which case the choice of that or which is GMPs Good Manufacturing Practices
irrelevant. As a point of curiosity, it is not uncommon for HACCP Hazard Analysis and Critical Control Point
authors to misquote this passage in the literature on valida- HAZOP Hazard and Operability Analysis
tion. IEEE Institute of Electrical and Electronics Engineers
17. There are two well know philosophical traps associated with I/O Input/Output
the question What is? The first is known as the norma- IQ Installation Qualification
tive fallacy, which involves confusing What ought to be Lean
(principles, ideas, theories, Platonic Forms) with What is. CMM Lean Capability Maturity Model
This is the fallacy of the rationalist or idealist who takes first LIMS Laboratory Information Management System
principles (theory, what ought to be) as the truth of the real MES Manufacturing Execution System
(what I encounter). The second is known as the naturalist OQ Operation Qualification
fallacy, which takes What is for What ought to be. This PEP Poor Engineering Practices