
Computer & Software Validation
Computer System Validation: Definition and Requirements....................................................................... 1

Computer System Design.......................................................................................................................... 4

System Design and Control....................................................................................................................... 8

The Nine Most Common Computer Validation Problems: Identify Frequent Deficiencies to Accelerate
Your Validation Projects..........................................................................................................................13

Accurately Identifying Your Requirements: Will Any Computer System be Right for You?........................17

Computer Systems Quality and Compliance vs. Software Validation......................................................... 20

Computer Systems Change Control.......................................................................................................... 23

How to Right-Size Computer System Validation Based on Criticality and Complexity............................. 27

Practical Use of Automated Tools in Computer System Compliance...........................................................31

Selecting and Partnering with a Vendor for a Qualified Software Product................................................. 34

Information Security: A Critical Business Function................................................................................. 38

Disaster Recovery and Business Continuity............................................................................................. 44

System Definition: Defining the Intended Use for a System........................................................................51

Lessons Learned in a Non-Regulated Software Validation Project............................................................ 66

Computer Validation in the New-Clear Age.............................................................................................. 78


Sharon Strause

Computer System Validation


Definition and Requirements
Welcome to "Computer Validation Forum." This column discusses topics and issues associated with computer validation in order to provide a useful resource for daily work applications. It provides information regarding regulatory requirements for the validation and qualification of computerized systems. Computer systems are used widely in the daily work structure of all the life science industries. Technical considerations associated with computer systems and the validation and qualification required are broad and complex. Understanding the basic principles supporting computer systems is fundamental to daily operations. Control and compliance are the key integrators for all computer systems in the life science industries today.

Your questions, comments, and suggestions are required to fulfill the objective for this column. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com.

ABOUT THE AUTHOR
Sharon Strause is a senior consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. For more author information, go to gxpandjvt.com/bios.

KEY POINTS
The following key points are discussed in this article:
• The definition of computer system validation (CSV)
• Project management and the software development lifecycle (SDLC) are the starting points
• Requirements are the primary key to CSV
• Other points to consider include US Food and Drug Administration requirements, the overall quality process, validation, and documentation.

INTRODUCTION
This first installment of "Computer Validation Forum" introduces a series on the subject of computer system validation (CSV) by defining CSV, looking at the importance of project management, and specifying CSV requirements.

THE DEFINITION OF CSV
Computer system validation establishes documented evidence providing a high degree of assurance that a specific computerized process or operation will consistently produce a quality result meeting its predetermined specifications. Many will recognize this definition as an interpretation of the US Food and Drug Administration's original process validation definition.

Components Of A Computer System
The components of a computer system include hardware, software, operating procedures, processes, and personnel. The Figure illustrates the areas required for consideration in the validation and qualification of computer systems.

IMPORTANCE OF PROJECT MANAGEMENT
A CSV project that meets budget, is implemented in a timely fashion, and meets all the regulatory requirements for the system must start with a formal project planning process and a system development lifecycle (SDLC). These programs require both an experienced project manager and a qualified validation manager. FDA has stated many times, "Those who fail to plan, plan to fail." Planning is a critical factor for the entire CSV project. If your company does not have a project management tool, there are many on the market which can be utilized to keep track of multiple timelines, deadlines, personnel, critical meetings, and due dates.

There are many SDLC processes used in validation: the waterfall model, the V-model, the "Onion" model. It does not matter which SDLC process is used as long as it begins with the development of the project and ends with the ongoing maintenance of the system once implemented. It also includes the ultimate retirement of the system.

Once a project management team has been established, it can begin requirements gathering.

REQUIREMENTS
Requirements will determine the scope of the project. The validation and/or qualification requirements should be the first major deliverable for any computer system. Again referring to the Figure, requirements include the following:
• Software. How the software is to operate.
• Hardware. The hardware, including the server.
• Controlling system. The operating system on the server and the database used to collect the data from the software.



• Equipment. Equipment is other computer systems or pieces of manufacturing equipment with which the software may interact.
• Operating procedures and documentation. These all have requirements that include people who will be doing the work of validation, people who will be trained to build the system, and people who will be trained to utilize the system once it is in place.
• Controlled processes. Established controlled processes and change control need to be reviewed or addressed to ensure that control is maintained throughout the life of the project and for the ongoing stability of the system once validation and qualification is complete.
• Total computerized system. Networks may be local or wide area, may utilize the web, may be within a corporate intranet or utilize the facilities of the Internet.
• Operating environment. Security will be addressed as both a part of the operating environment, the software and operating systems on the hardware, and all interfaced equipment.

Figure: Areas for validation and qualification. [The figure shows the areas to be validated and the links between them: (1) software, (2) hardware, (3) controlling system (computer system), (4) equipment, (5) operating procedures and documentation, (6) controlled process, (7) total system (computerized system) and all the links between the boxes, and (8) the operational environment.]

Another way of determining the requirements is to ask the questions who, what, why, where, and when. Answering those questions will make the requirements gathering process easier and will help in determining the priorities of the system.

Once the system requirements have been gathered, the process of determining the regulatory requirements will begin (see the References section). For what will the data developed on the system be utilized? Regulations need to focus on the purpose, use, and reporting of the data. There may be regulations outside of FDA that will be impacted by the data. For example, in an enterprise resource planning system, data will be subject to financial regulations, possible Environmental Protection Agency (EPA) regulations, possible Occupational Safety and Health Administration (OSHA) regulations, etc. Again, requirements will help to determine the regulations required and ultimately the extent of the validation and qualification that will need to be done on the computer system.

Requirements gathering should take time, because it is the foundation of the overall project and the validation required. CSV can be as simple as an Excel spreadsheet or as complex as an enterprise resource system, thus the reason for the critical nature of realistic and testable requirements.

Once testable requirements have been established, the project can begin; validation can be established; risk evaluation can be started; and the goal of a validated and qualified system can be reached.

POINTS TO CONSIDER
Additional points should be considered in the validation and qualification of a CSV, including FDA requirements, quality process, validation checkpoints, and documentation.

FDA Requirements
FDA requirements regarding current good practices (CGXPs) are as follows:
• Hardware is considered to be equipment within the meaning of the CGXP regulations
• Software is regarded as records or standard operating procedures (SOPs) within the meaning of the CGXP regulations
• Software maintenance is considered revision or change control
• Record controls require programs to ensure accuracy and security of computer inputs, outputs, and data
• Record access requirements: records must be available for inspection and subject to reproduction.

Quality Process
The quality process needs to be in place and should include the following:



• SDLC methodology
• Project planning
• Personnel qualifications
• Documentation standards and procedures
• Methods for review and approval
• Design standards
• Programming standards
• Configuration management
• Testing standards and procedures
• Separation of development, test, and production environments (logical/physical)
• Move-to-production process
• Clearly defined responsibilities
• Involvement of customer/user, quality assurance professionals, and technology professionals
• Change management
• Change control
• Training process
• Process for continuous evaluation, incident monitoring, and error correction
• Processes and procedures for physical and logical security of system and data.

Validation
Validation checkpoints should be in place as part of the overall project management process. Consider the following:
• Evaluation, analysis, and rationale for the system and its validation
• Validation strategy
• Business, system, and function requirements
• Detailed system design specifications
• Validation protocol
• Test plan
• Development testing and verification (structural, unit, integration, and system)
• Vendor and supplier evaluations
• Hardware and software qualification (installation qualification, operation qualification, performance qualification)
• Procedures
• Utilization
• Administration
• Maintenance
• Monitoring
• Change management
• Change control
• Installation plan and records
• Training plan, procedures, and evidence of training
• SOPs
• User acceptance
• Validation report
• Retention of critical documentation.

Documentation
Documented evidence should include the following:
• Validation plan
• Business and system function requirements
• System design specifications
• Validation protocol
• Test plans, scripts, and results
• Documented development testing (i.e., unit, integration, system testing)
• Installation qualification
• Operation qualification
• Performance qualification
• Validation report
• Standard operating procedures
• Manuals (e.g., development, user, support)
• Change records
• Logs, operational records, audit results.

REFERENCES
21 CFR 11, Electronic Records; Electronic Signatures.
21 CFR 210, Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs: General.
21 CFR 211, Current Good Manufacturing Practice for Finished Pharmaceuticals.
21 CFR 820, Quality System Regulation. JVT

ARTICLE ACRONYM LISTING
CGXP: Current Good (ALL) Practices
CSV: Computer System Validation
EPA: US Environmental Protection Agency
FDA: US Food and Drug Administration
GXP: All Good Practices
IQ: Installation Qualification
OQ: Operational Qualification
OSHA: Occupational Safety and Health Administration
PQ: Performance Qualification
SDLC: System Development Lifecycle

Originally published in the Spring 2009 issue of Journal of Validation Technology




Computer System Design


Robert Smith

Computer Systems Quality and Compliance discusses the quality and compliance
aspects of computer systems and aims to be useful to practitioners in these areas.
We intend this column to be a useful resource for daily work applications.
Reader comments, questions, and suggestions are needed to help us fulfill our
objective for this column. Case studies illustrating computer systems quality and
compliance issues by readers are most welcome. Please send your comments and
suggestions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com

or journal coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• Systems design is the process or art of defining the architecture, components, modules, interfaces, and data for a system
• System design should consider the entire system lifecycle to properly manage costs and compliance
• System changes, maintenance, and future expansion or other organizational changes should be part of system design
• The role of quality is often compromised in system design in favor of project cost and timing
• Security issues, both external and internal, are an important consideration
• System designers must consider the needs of the quality area in system design and must actively solicit their input
• Quality unit personnel, in turn, must carefully consider their needs and clearly communicate these needs to system designers
• Do not underestimate the cost and time impact of even the smallest change.

INTRODUCTION
My six-year-old daughter is often fascinated by things that fascinate me. On the cover of a book that I had asked her to bring to me was a picture of a kettle with the spout and the handle on the same side. She studied the picture for a moment and then reported carefully, "That is not a very good design!" I was delighted in her discernment. It was easy for her to understand the intended use and know that this design would not work very well.
How often do we fail to have these insights when designing GXP computer systems? More often than we'd like to admit. Pressures mount to do more with less, hit timelines, show return on investment, and meet commitments. These are all admirable things, and senior managers should push system designers and project managers to contribute to the business by thoughtfully executing against those mandates. At the same time, those very same project teams need to keep stakeholders informed about the technical debt they are accumulating. If teams are making decisions to sacrifice quality or maintainability in order to meet those demands, technical debt is incurred. The payment on technical debt, like personal debt, has a cost that can be felt for a long time. The recurring costs of technical debt are far greater than the cost of addressing the issue now.
The more likely that changes to a system will occur, the more important it is to understand the long-term cost of those changes. Elements of a system that are subject to higher velocities of change are the best candidates for analysis. This column will explore some common tradeoffs that lead to technical debt.

FIX ONE, BREAK TWO
One small example that can lead to technical debt is hard coding a variable that, by its very name, we know will change over time, to save a few days' development time. This might be a password (a common security mistake) or some configuration setting like the name of a database server. It is easy to hard code such a thing to save time, but because the likelihood of change is high, the cost of this shortcut is high. This is true for two reasons. One is that a validation process must be re-executed, and the other is the risk that something else might get inadvertently changed or that there is some unintended consequence. This is commonly called the "fix one, break two" syndrome. In short, it is a change that leads to technical debt.

A password mistake is a perfect example. Good security requires frequent changing of passwords. If a password is hard coded, then a new version of the software (called a release) is required to update the password. For a validated system, this will result in an even larger cost the organization will pay over and over again. If the organization does not change the password to avoid this cost, it has traded good information security practice to pay the technical debt and has also accepted a 21 CFR Part 11 compliance risk. Assuming the system has a reasonable life of five years, the technical debt per year of not making the password easy to change is either poor security and a compliance risk, or the cost of two or more releases per year over five years. Besides the recurring costs of the releases, the organization will also assume the risks related to releasing and validating the application. Surely it would be more efficient to handle the password correctly in the first place. Pay now, or pay a lot more later.
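To make the point concrete, the minimal sketch below shows the alternative to hard coding. It is illustrative only: the column does not prescribe a tool or language, and the variable name APP_DB_SERVER is hypothetical.

```python
# Illustrative sketch: read a deployment-specific setting from the
# environment (or a controlled configuration file) instead of hard coding it.
import os

def get_db_server() -> str:
    """Return the database server name without baking it into the release."""
    # Changing this value is a configuration change managed under change
    # control; it does not require rebuilding, releasing, and revalidating
    # the application itself.
    return os.environ.get("APP_DB_SERVER", "localhost")

print(get_db_server())
```

The same idea applies to credentials, except that secrets belong in a controlled, access-restricted store rather than in source code or plain configuration files.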
MANAGEMENT OF CHANGE
Understanding the concept of change velocity is important for any system, but even more so for validated systems. Specific strategies need to be in place for dealing with varying rates of change. What is the best way to manage these varying rates of change? What are the costs associated with the changes, and how should an organization manage them? Vendor software may move at one speed. Internally developed customizations probably move at another rate until the system matures, but may accelerate if business processes change. Microsoft patches its operating system monthly, commercial application vendors might patch quarterly, and an analytical chemist might not change a calculation for years; it makes sense to separate these. Often, in looking at production or deployment phase plans, there is a one-size-fits-all approach. This often leads to something that is impractical or worse.

Upfront planning to develop specific strategies to handle different change velocities and understand the risks associated with these changes helps significantly to develop cost-effective plans that look at the system over time. Focusing on lifecycle cost planning will minimize the technical debt of the deployed application.

Changing A Password
For example, systems that have passwords that are used infrequently are going to result in passwords that expire or are forgotten by users. What is the strategy for managing this? Let the help desk do password resets manually by routing a ticket to the database administrator? That's the most expensive solution. Write a tool so that the help desk can do it for them? This is the better approach. Add a self-service feature in the application? This is the best approach. Knowing what to do requires some planning and time up front. Imagine a 1000-user system and assume 30% will need one password reset a year. This is an optimistic estimate. Suppose each help desk call costs $50 by the time the security administrator changes the password, and the system is in use for five years. The organization will spend at least $75,000 on tickets alone. This is more than it would cost to implement a self-service "I forgot my password" feature. This model doesn't even consider any impact to the business, such as inability to release a lot while an engineer is locked out, so the total technical debt could be much higher.
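The $75,000 figure is simply the column's stated assumptions multiplied out; a quick back-of-the-envelope restatement follows (illustrative only).

```python
# Back-of-the-envelope cost of manual password resets, using the
# assumptions stated above.
users = 1000
reset_rate_per_year = 0.30   # fraction of users needing one reset a year
cost_per_ticket = 50         # US dollars per help desk call
years_in_service = 5

tickets_per_year = users * reset_rate_per_year                    # 300
total_cost = tickets_per_year * cost_per_ticket * years_in_service

print(f"Help desk tickets per year: {tickets_per_year:.0f}")
print(f"Five-year cost of manual resets: ${total_cost:,.0f}")     # $75,000
```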
Changing A Storage System
Another example is the case of an electronic record storage system. Let's use some numbers to illustrate the point. To make the math easy, let's assume that a basic validated system costs $1 million and has a 10-year life. The team reports that they need an extra $100,000 to address an archiving feature, or the system will outgrow the storage system early in the system's expected life. The extra money is deemed too expensive. The project was already spending every dime, so the decision is to address it later. Over time, business needs change slightly as the business becomes paperless, and in five years the system is critically low on storage. A new project is proposed to add the archiving feature. Because this is a validated system and now contains five years of electronic records, it will take a full release and sufficient testing to show that the records are archived correctly. Let's say the team can do this for $500,000 and delivers it robustly on time.



But now the last five years of the system's depreciation cost twice as much. Would the $100,000 in initial project costs have been worth saving $400,000? This is the kind of technical debt that needs to be managed thoughtfully at the beginning.
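The "twice as much" claim is just straight-line arithmetic on the numbers in the example; the minimal restatement below is illustrative only and adds no figures beyond those already given.

```python
# Straight-line view of the storage example above.
initial_cost = 1_000_000      # validated system
system_life_years = 10
retrofit_cost = 500_000       # archiving feature added in year 5

annual_cost_as_planned = initial_cost / system_life_years          # $100,000/yr
annual_cost_last_five = (initial_cost / system_life_years
                         + retrofit_cost / 5)                       # $200,000/yr

print(annual_cost_as_planned, annual_cost_last_five)
# Designing archiving in up front was quoted at $100,000, versus $500,000
# later: a $400,000 difference before counting downtime or release risk.
```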
SYSTEMS DESIGN
Systems design is the process or art of defining the architecture, components, modules, interfaces, and data for a system to satisfy specified requirements. Today, more than ever, system design must be cost effective. Today's economic conditions require full lifecycle cost to be factored into decisions. It is not uncommon for the maintenance phase to prove more costly than the implementation phase. The maintenance phase is often not considered or analyzed, but it works against getting the cost out of the business. The proverb of the frog sitting in water with the temperature slowly going from cool to boiling is a good reminder. The frog doesn't notice the heat because the rate of rise is slow, but in the end he is cooked. From the preceding examples we can clearly see that understanding and managing technical debt can have a profound impact on GXP computer systems and allow us to jump out while the water is cool.

Anytime we are asking the organization to pay more or take more time in the implementation phase, we have to articulate the value proposition. That proposition will be the benefit of addressing a lifecycle cost now versus assuming the recurring cost and risk over time. Few teams are getting a blank check in today's environment. How does a team explain the value proposition? Some points are obvious, some are not so obvious. Most decision makers want to be rational and make wise decisions for their organizations. In order to support fact-based decision making, teams must tally the technical debt and make sure that decision makers understand what they are buying on credit: sort of a fair-disclosure doctrine of GXP system development costs. It must be expressed in business terms, identifying clearly what the costs and the benefits are. Numbers and specific examples that support business decision making are critical for influence. It cannot be expressed in technical geek-speak language.

THE STOOL HAS FOUR LEGS
Yet another type of technical debt is assuming that the quality of a system is simply something that exists at some constant level. Often this happens when quality is assumed by taking it off the table with statements like "we never compromise on quality." Traditional project management paradigms articulate that there are three legs (i.e., scope, resources, and time), but with a wink we all know there are really four: quality does not simply exist. Quality is often traded to make the other three. If teams and their sponsors agree right up front that quality is not a magic property that appears in a system, but is something that is designed in, then the stage is set for initial planning and subsequent discussions about trade-offs and tuning to ensure that all four variables have a place at the table. When quality is simply assumed, then bad things can happen, and they usually show up in the form of technical debt.

In this author's experience, most organizations have strong formal and informal mechanisms to ensure project costs do not exceed the plan. And for good reason, as the system development community has accumulated few headlines for on-time, on-budget, on-scope, and on-quality success. The technical teams need to do a better job of expressing the quality trade-offs in business terms and identifying risk factors that the business can understand. Telling a business leader "we need more time to fine tune the user interface or make usability changes" is hard to relate to a business impact. Stating that "there are data suggesting one in five users makes errors that could result in erroneous filings to a governing body, and here are the errors" is something that can be processed in the business risk management and review framework.

Thus, in order to have a fact-based dialog, decision makers need to be involved up front with competent system designers who understand both how to get things done and how to consider what the organization will pay over time. These "pay me now or pay me later" time bombs are not just measures of technical acumen. They are also indicators of business savvy. Business leaders need to have trusted technical leaders who can help get the cost out of the business by not just excelling at technical execution, but also by understanding how to speak to the business.

If a team understands its customers, it can implement in a cost-effective way. For example, enabling users to add reports using validated features can avoid more costly-to-deliver and harder-to-get-scheduled IT releases. In this author's experience, it is rare to see those trade-offs surface up front. Most senior business leaders would rather know up front that they will get all the reports they asked for in the validated system, and that anything else will be another costly release. Most would like the chance to ask if there is a way to avoid those costly releases.

When designing for maintainability, the concept of change velocity comes up again. In this author's experience, there are many tightly coupled or interfaced systems that should be loosely coupled. Tight coupling occurs when one module or system relies on another module or system so strongly that a small change in one will require an implementation change in the other. The following is an example of tight system coupling: System A needs to view System B's records. To make things fast, the B team sends the A team source code from their system. A implements B's code and the organization is happy. Any time a user of A needs a B record, they can get it.



Later B adds another record type, and users of A still need to see it. But now both A and B have to release anytime there is a change. Not good. Pay a lot later.
What is the correct solution? B could have implemented a service for A: "show me a record." With a little thought, something as simple as "show me the record with this ID" could be implemented. Then A and B are loosely coupled, so one system can be changed without the need to change the other. The cost-effective paradigm is to make tight coupling rare. It might cost a little more up front, but it will save a lot later.
This can pay back in more ways than one. Not only can an organization avoid extra release costs, it can also improve uptime, as now only one system needs to be taken offline to make an upgrade.

PLANNING FOR THE FUTURE
Understanding how the user community is expected to change, and the probable impacts on electronic data, can have a dramatic impact on lifecycle costs. Does the system need to support a business acquisition plan? If so, this could dramatically affect the user count and make one design appropriate or inappropriate by altering scalability needs. Will more than one geographic location be using the system? If so, will data consolidation be required? Knowing the answers to questions like these may not only affect system architecture, vendor selection, and technology selection, it may also require the addition of a data warehouse to meet reporting needs. Often, fixing things like these later becomes massively expensive when compared to enabling the system for scalability up front. Often, senior leaders will make different choices if they have the data and facts to allow good decision support. Skipping these steps frequently leads to unanticipated costs and can undermine the technical team's credibility.

Security is often addressed as an afterthought. Sometimes teams work hard to get the system to work, then say, "let's make it secure." At this point it is too late. Security, like quality, needs to be designed in, and requirements should be stated clearly up front. The requirements need to be clear and related to risks. Often GXP systems are closed systems on internal networks and not subject to skilled, determined attackers. But insider threats are real and the most prevalent. These threats run the gamut from disgruntled employee sabotage, to someone correcting their mistakes to avoid reprimand, to misappropriation of intellectual property.

Some systems in the life sciences sector may also contain protected health information and may be subject to government regulation, most notably the Health Insurance Portability and Accountability Act. Understanding the risks, vulnerabilities, and countermeasures is important in system design, and it is most cost effective as part of design as opposed to later. Often, failure to plan for this creates expensive and time-consuming redaction programs.

IMPLICATIONS FOR COMPLIANCE
Compliance personnel should always be part of computer systems design activities: the fourth leg of the stool. They can provide valuable input regarding quality requirements that will minimize future costs and system downtime. When the quality area is overlooked, future changes to the system will surely be needed, and these future changes equate to additional costs, downtime, and potential problems affecting other systems. The quality area must also be mindful of the importance of its input. The quality area must carefully consider its needs and must clearly communicate these needs to the systems designers; do not underestimate the cost and time impact of even the smallest change.

CONCLUSION
Good software design is complex. These are just a few examples of how shorting the initial planning and implementation can result in significant downstream costs. Business owners of systems and budget decision makers should set clear expectations that, while certain budget and schedule goals are in place, the expectation is that system designers provide solid information related to lifecycle costs. That information can be used to get to the best decisions related to managing technical debt and cost effectiveness.

ABOUT THE AUTHOR
Robert Smith is an application technical lead responsible for quality systems software development at Abbott Vascular. Prior to this, he was Sr. Director, Engineering at Symantec Corporation, where he was responsible for developing enterprise client, host, and server based corporate security products as well as the Symantec and Norton Live Update offering. Robert has 25 years of software development experience including VC start-ups funded by The Mayfield Fund, Granite Capital, and Wasatch Venture Fund, and holds CISSP and PMP credentials. Robert can be reached at robert.smithii@av.abbott.com.

Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau has 25 years of experience and increasing responsibility in the pharmaceutical and medical device industries, spanning areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached at barbara.nollau@av.abbott.com.

Originally published in the Spring 2009 issue of Journal of GXP Compliance




System Design and Control


Robert H. Smith

Computer Systems Quality and Compliance discusses the quality and compliance
aspects of computer systems, and aims to be useful to practitioners in these areas.
We intend this column to be a useful resource for daily work applications.
Reader comments, questions, and suggestions are needed to help us fulfill our
objective for this column. Case studies illustrating computer systems quality and
compliance issues by readers are most welcome. Please send your comments and
suggestions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com

or journal coordinating editor Susan Haigney at shaigney@advanstar.com.

SUMMARY
An illustrative incident at a pharmaceutical company that is representative of
actual events is discussed. This incident involves software control of a drug
dispensing system in pharmaceutical manufacturing. An error in the amount of
drug weighed occurred. The investigation identified several problem areas.
Lessons learned, areas of concern, questions to be asked, and actions to be
taken are discussed.

INTRODUCTION
The following discusses an illustrative incident at Pharma154, a fictitious
pharmaceutical company that makes the global commercial supply of
Pinkoswill, a potent drug product. Because this drug product contains a
potent active ingredient, weighing the correct amount of drug in the manu-
facturing process is critical.
Personnel involved in the incident include the following:
• Alex, vice president of regulatory affairs
• Bob, vice president of information technology
• Annie, software development manager
• Alicia, software contractor
• Sam, systems test lead
• Salli, system administrator
• Manufacturing engineers and operators.
While the incident, company, drug product, and personnel involved are
contrived, the following is representative of actual events for which the US
Food and Drug Administration has issued warning letters.

THE INCIDENT
"I need you here. Now!" exclaimed Alex, the VP of regulatory affairs at Pharma154.
"Alex, are you crazy? It's Sunday. It's 5:00 AM," slurred Bob, Pharma154's vice president of IT.
"Bob, listen, there are three reported hospitalizations tied to Pinkoswill. They are all in critical condition. Surveillance is coming in now; we think there may be others. We expect the FDA to be here Monday morning. This is serious," Alex explained coolly.
Bob started to wake up. "What does this have to do with IT anyway?"
Alex said, "We are not sure. Something has gone wrong. The labs say the dosage in the suspected lots is almost four times spec. We have got to figure this out."
"Look, Alex, this is clearly some manufacturing problem. I have a life. If something points to IT, then call me. Otherwise, I have things to do. OK?" said Bob.


"I thought you would want to be in on this. It's important. But, I have to admit, we do not have anything that points to IT. I'll call you if something changes," Alex managed to squeeze out before Bob hung up.
Within hours, CNN reported: "Massive Pinkoswill recall, FDA investigates. All patients should stop taking this medication immediately and see your physician."

THE IT GROUP GETS TOGETHER
On Monday morning, an emergency senior staff meeting was called in the Pharma154 boardroom.
Alex addressed the room, "We have ordered a worldwide recall of Pinkoswill, not that we had much choice. The FDA would have had an order in our hands later today anyway, so we made the call to be proactive. The analytical labs have analyzed samples from the last three lots. About 15% of those lots have an overdosing of about 400%. We do not know why. We have chemists and engineers on the lines now and at our suppliers. We reviewed our sample data and the stored samples; they all check out. So we have some variation that we do not understand yet."
Bob, after listening to Alex's explanation of the weekend's events, was glad he did not waste his Sunday waiting around for manufacturing to figure out its problem. When he got back to his office he saw some serious faces. Sitting at his conference table were Annie, his star software development manager; Alicia, a software contractor; and Sam, his systems test lead.
Bob asked, "Why all the serious faces? This whole Pinkoswill thing is just some manufacturing problem. They have it sorted out, no one died, at least not yet. It is going to hurt for a quarter or two. Come on, we have lots to do."
"Well! Uhm. You might want to ask Alicia what her idea is," said Annie.
Bob was in no mood for this. "Let's let manufacturing figure out their problems. We have our own problems to worry about. Last time I checked you had a couple of projects that should be keeping you pretty busy," grumbled Bob.
"I really think you should listen to her, Bob," said Sam.
"OK, let's have it and be fast, I have a meeting in 10 minutes," snapped Bob. Then he said, "I'm sorry. It has been a rough couple of mornings."

The Problem May Be In The Software
Alicia reported that if the scales and controlling software failed in some way, it is possible that the active ingredients in the recipe could get over-speced. The filler is added to make the weight. "This is a design flaw that I pointed out, but we postponed correcting it."
Annie pointed out, "We postponed it because it can't happen. There are two weight checks and software controls. That's why there is nothing wrong with the design."
Alicia said timidly, "Can I add something?" When no one said anything she went on to explain. She was more than a little embarrassed. "When I first got here I couldn't get the software to interface with the scales correctly. The manufacturing engineers were very frustrated with the personnel change and could not believe that they had to get another software engineer up to speed. They told me to figure it out. They were not very helpful."
Salli, a system administrator at Pharma154, had told Alicia that the last person who had the scale interface job got it working somehow. Salli said she had made a backup of his hard disk before he left. She would restore the files for her and maybe something would help.
"I poked around at all the stuff from the backup. It took me a while but I found some stuff that seemed to work. It passed all the basic tests. So I copied that into our test environment," Alicia recalled. "I was really concerned because we do not have any real version control. I even wrote a bug report on that. The manufacturing engineers closed it and were thrilled that I finally figured it out. But I didn't! All I knew was that when I put that DLL in the directory, the tests passed. They signed off and I think that is what went into production," Alicia concluded.
Annie said, "We had better get Salli in here."
When asked what her role in the situation was, Salli offered in defense, "Look, I was just trying to help. All I did was give her the files, she put them into test and the manufacturing team signed off."
Sam asked, "Why don't my team and I go out on the line and do some testing?"
"Fine," barked Bob, "but I want an answer tonight."

VISIT TO THE LINE TO TEST THE SYSTEM
Sam gathered his team and headed to the line. No one was happy out there. "We need something to put in the drug hoppers to test the scales. That stuff weighs nothing." They all looked at each other for a while. Sam saw a five-gallon water bottle by the cooler in the break room, which he could see outside the manufacturing area through the observation window. Sam asked one of the manufacturing engineers if he could put that bottle in a pre- and post-mixing process hopper.
The engineer laughed at him, "That must weigh 100 times more than the compounds we mix."
"That's the whole idea," said Sam.
"Go ahead. It won't break anything and we have to sanitize the whole line anyway," stated the engineer.
Sam came back carrying the water. "This has to weigh 40 or 45 pounds," he grunted as he strained to set it into the hopper.
They all stood back. The scale read 46.75 pounds. "Good guess!" they cheered. Sam went to the software; it said the weight in the hopper was 41.25 pounds.



They all wondered how that could be. Annie said, "I remember some problem a long time ago about boot order and the USB interface to the scale." They decided to reboot everything. They turned off the computer system and the USB hubs. Someone said, "Let's turn off everything." They did that too. Sam wondered aloud if there was some protocol for restarting.
One of the manufacturing engineers on the other lines offered to help. He told them the order in which to turn everything back on. They did, and now the software read 46.75 pounds, just like the scale. Sam said, "This is not good."
"Why?" asked Annie, "Everything is working fine now."
Sam said, "Let's just try a few things. What is this other USB cable for?"
The manufacturing engineer informed them that it controlled the hopper shape knife gate valve. They all laughed. "The what?" sang the software team almost in unison. The engineer explained, "It controls how much of each ingredient goes into the mixer. It opens until the right weight is in the mixer and then closes."
Alicia spoke up, "I wrote the code for that. The valve is closed. I send a command to open it, then when the weight rises above the spec, I send the close command."
"What happens if it stays open?" asked Sam. The manufacturing engineer explained that would ruin the batch and the incorrect mix would be caught at the post-mixing weight station.
Sam pulled the bottle out of the pre-mix hopper and put it in the post-mix hopper. It weighed 46.75 pounds on the scale and the software. They all agreed that made sense.
Sam asked the engineer if he could unplug and re-plug the cables. "Sure," he told them, "the techs do that sometimes if the valves need maintenance." So Sam unplugged the USB-controlled hopper shape knife gate valve and plugged it back in. The room was very, very quiet.
The software displayed a strange error message. Salli commented, "That's odd. It says 'Unit test parameters exceeded, using default test values. Click OK to continue.' That's not any error message I have ever seen before. The wording makes it seem like some default or testing mode."
The engineer said, "We've seen that a few times after valve maintenance, but we usually reboot everything."
Sam clicked the OK button. The scale went blank, and then the software and the scale both reported 41.25 pounds. You could hear a pin drop.
Annie asked, "What goes in the mixer first?"
The engineer replied, "The active ingredient. We don't want to add anything else unless that weight is accurate. It cuts down on scrap. That stuff costs like a thousand times more than everything else that goes in. We got a process validated to reclaim it a few years back."
"So if the scale was doing what we see now, the valve would let in a lot of the drug?" asked Annie.
"Yes," the engineer replied, "That's why we weigh it a second time. Only the exact recipe will produce the correct post-mix weight. We have that down to a science."
Alicia was the first to see it. "The scale error is constant. Both scales were off by exactly the same amount."
And though they all thought it, Annie was the first to say it, "We have a serious problem. A real serious problem! We have got to tell Bob."
The team informed Bob of the situation, and he then contacted the VP of regulatory affairs.
"Alex, this is Bob. We have a problem. My team found a situation. It appears that if there is some maintenance performed on the line, a real problem can occur. I am no chemist, but I think something like five pounds of extra drug might give some people a real bad day."
As would be expected, FDA investigated the Pharma154 situation. The FDA-483 the company received from FDA was not kind. A warning letter was expected to follow. The possible fines assessed could be astronomical. The lawsuits the company may incur will probably be worse.

INVESTIGATION
During the corrective action and preventive action (CAPA) investigation, the following items were documented by outside investigators:
• Software developers were not practicing version control. Software and associated source code files were not kept in a repository. This is in stark conflict with the International Society for Pharmaceutical Engineering's (ISPE) GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. This lack of appropriate software version control was a direct contributor to the event.
• The company lacked a formal procedure for deploying baselines from a controlled repository. This allowed personnel to retrieve software from a backup that was not controlled or cataloged and then allowed the use of that software in a production system.
• The lack of a software version control tool and corresponding processes allowed a unit test Dynamic Link Library (DLL), which is a way to deploy software so it can be used by other software, to be used in production. The unit test scale interface DLL was written in such a way that it provided its expected values if the scale encountered an error.
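The last finding describes a classic test-stub anti-pattern. The short sketch below is hypothetical (the Pharma154 system itself is fictitious) and only illustrates the shape of the problem: a stub that substitutes an expected value when the instrument misbehaves is harmless inside a unit test harness, but dangerous if it ever reaches production.

```python
# Illustrative anti-pattern (hypothetical): a unit test stub that hides
# instrument errors by returning the value the test expects.

EXPECTED_TEST_WEIGHT_LB = 41.25   # canned value used by the test harness

def read_weight(scale) -> float:
    """Test stub for the scale interface -- never meant for production."""
    try:
        return scale.read()
    except Exception:
        # "Unit test parameters exceeded, using default test values."
        # In a test harness this keeps tests running; in production it
        # silently reports a weight that has nothing to do with the hopper.
        return EXPECTED_TEST_WEIGHT_LB
```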
The investigators interviewed the former software developer. He reported that the manufacturing engineers and he were in dispute regarding the reliability of the scale firmware (firmware is software that has been committed to a chip in hardware). He believed the scale firmware was not in control. He reported his concerns and was told to work around the problem.



He created code that simply ignored a malfunctioning scale and supplied the parent program with historical successful values. This allowed the system development to proceed without dependency on the scale. Evidence was found in various bug reports that this software engineer reported these problems. It appears, in part, that his release from the project was due to his reporting of poor controls.
The scale firmware was also not version-controlled. This allowed scales on the new line to have old firmware put into production. This firmware had a defect that, in certain conditions, like the ones triggered by hopper shape knife gate valve maintenance, caused the scale to recalibrate. The original developer attached the new firmware to the bug report, but that report was closed after his departure. Due to the lack of version control and formal procedures to control the validation and deployment process, incorrect and unsuitable versions were deployed.
The result of the inadequate software version control, deployment practices, and hardware/firmware version control allowed approximately five pounds of the Pinkoswill active ingredient to be added to the three affected lots. Company chemists and lab personnel acknowledge that this is, at a minimum, a serious overdose risk to patients.
The three lots were able to escape into the supply chain due to the lot-sampling plan being incorrectly constructed because of a side effect of the test software. When in testing mode, the problem DLL did not send lot information to be included in the lot-sampling plan. Although the lot-sampling plan was a validated approach and relied on a risk-based analysis, that analysis did not identify any configuration management risks or failure of the scale system to properly function. The failure to identify and manage risks associated with configuration management fails to comply with the regulations.
The investigators noted that, per the regulations, the company had an obligation to prevent mix-ups. The lack of management controls and adherence to basic controls around software and firmware versioning fell below minimum standards for the industry.

LESSONS LEARNED AND AREAS OF CONCERN
In most life-sciences organizations, management comes from scientific, sales, finance, or other non-software or system development backgrounds. As a result, these organizations often do not have adequate system development controls in place. There are also many times when organizations do not see themselves as needing to practice software and system development at anything more than "it seems to work." Where does your organization fall?
Software and systems have become pervasive in organizations, from controlling quality systems to production lines to devices instrumental in patient care. Failures in systems and associated controls can and do lead to patient risks. Does your company have adequate tools, controls, and management review?
Systems today are very complex. Much of the software and systems are assembled by contractors that often leave when the project ends. Is there a change control record? Is there a version history with accounting of all the changes? This is extremely important. It is important to know when changes are made and why. In the story presented in this article, Alicia was given a piece of software. She did not know where it came from, who wrote it, why, or when. It was test software, but only the departed contractor knew that. Alicia had no knowledge of the bug in the scale firmware, and due to pressure, the working system was released with a test software component that simply reported to the parent program a weight it was programmed to return if the scale firmware had an error.

Software Version Control
If the Pharma154 company had software version control and was using it properly, this scenario would have been prevented. Software version control provides key benefits that comply with good automated manufacturing practices (GAMP). These include the following:
• Frequent check-in and checkout (daily) of work. This provides clear visibility and accounting around who made changes and when
• Good process ties check-ins to a stimulus (i.e., requirement, work instruction, bug, or task)
• Labeling (i.e., production version, test version, development version)
• A central and controlled repository where all software or firmware is stored.
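One way to picture what such controls buy is a deployment gate that refuses any file that is not part of an approved, labeled baseline. The sketch below is illustrative only; the column names no specific tool, the file name and checksum are hypothetical, and in practice this capability typically comes from the version control and release management system itself.

```python
# Illustrative deployment gate (hypothetical): only files whose checksums
# match an approved, labeled baseline may be installed in production.
import hashlib
from pathlib import Path

APPROVED_BASELINE = {
    # file name -> SHA-256 recorded when the baseline was approved
    "scale_interface.dll": "9f2c...e41a",   # truncated example value
}

def file_sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_baseline(release_dir: Path, baseline: dict) -> list:
    """Return the files that do not match the approved baseline."""
    mismatches = []
    for name, expected in baseline.items():
        candidate = release_dir / name
        if not candidate.exists() or file_sha256(candidate) != expected:
            mismatches.append(name)
    return mismatches

# A file pulled from an uncontrolled backup, like the test DLL in the
# story, would fail this check and never reach the production line.
```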
Computer Systems Do Not Always Work
Companies today need to recognize that computer systems range from your smartphone to lab equipment to manufacturing control systems. As these devices have become pervasive, there is a tendency to just assume they work and work together. In many cases, they do not. Bugs exist, incompatibilities exist, and often the formal structure of good version control and software/system best practices is not in place on internal projects. Some organizations confuse software development lifecycles (SDLCs) with software development best practices. However, most SDLCs are focused on an artifact trail to satisfy regulation rather than on ensuring best or essential practices are in place. Organizations need both a sound SDLC that ensures key steps and artifacts are executed appropriately and methods and procedures that ensure essential practices are in place and practiced.
This can be particularly true when non-software and system development professionals are running projects. Today, there are many tool kits from leading vendors that allow users with no formal training in system development to create powerful and complex systems; others in the organization then usually delight because "it works."



However, there are real risks in life sciences if those systems get used for quality or manufacturing purposes, as bugs, version problems, or validation leakages (i.e., intended or actual use cases that do not traverse the full validation cycle but end up in use) may affect the safety or efficacy of processes, devices, or drugs.

CONCERNS AND ACTIONS
Companies should take a good look at the software and firmware systems they have in place and what the associated regulations are with regard to those systems.

You Should Be Concerned If
If your team does not have a software or firmware configuration management (SCM) system that it uses every day, you should be concerned. If your team does not have a defect management system that it uses every day, you should be concerned. If your team does not have a formal way to label the version set that represents specific and frequent points in time, you should be concerned. Often, it is the complex interaction of many pieces that results in an issue.
If you have doubts, get an outside assessment of your firm's level of practice. Make sure the level of practice, the related risks, and the impact on functional areas are analyzed and understood.

System Design And Control Is Not Optional
Although the specific incident described herein is hypothetical, it is representative of real life. There have been FDA warning letters issued for the lack of these very controls and processes. These are essential and foundational processes that every organization needs to make sure are in place and functioning to stay out of the headlines and away from 483s, warning letters, and recalls. GXP

ARTICLE ACRONYM LISTING
DLL: Dynamic Link Library
FDA: US Food and Drug Administration
GAMP: Good Automated Manufacturing Practice
IT: Information Technology
SDLC: Software Development Lifecycle

ABOUT THE AUTHOR
Robert H. Smith is an application technical lead responsible for quality systems software development at Abbott Vascular. Prior to this, he was Sr. Director, Engineering at Symantec Corporation, where he was responsible for developing enterprise client, host, and server-based corporate security products as well as the Symantec and Norton Live Update offering. Robert has 25 years of software development experience including VC start-ups funded by The Mayfield Fund, Granite Capital and Wasatch Venture Fund, and holds CISSP and PMP credentials. Robert can be reached at robert.smithii@av.abbott.com.

ABOUT THE COLUMN COORDINATOR
Barbara Nollau is a director of supplier and alliances quality at Abbott Vascular. She has 26 years of experience and increasing responsibility in the pharmaceutical and medical device industries, spanning the areas of manufacturing, quality assurance and compliance, validation, and information technology. Ms. Nollau can be reached by e-mail at barbara.nollau@av.abbott.com.

Originally published in the Summer 2010 issue of Journal of GXP Compliance




The Nine Most Common Computer Validation Problems
Identify Frequent Deficiencies to Accelerate Your Validation Projects
Frank Houston

Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. This column provides readers information regarding regulatory requirements for the validation and qualification of computerized systems.
Your questions, comments, and suggestions are required to fulfill the objective for this column. Case studies submitted by readers are welcome. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to coordinating editor Susan Haigney at shaigney@advanstar.com.

INTRODUCTION
What validation problems are you likely to see over and over? When tackling complex validation challenges, you'll save time, money, and headaches when you know the most common problems and where to find them.
The following analysis is based on validation work performed for a large US Food and Drug Administration-regulated company. The goal was to bring the company's software validation evidence up to the level of FDA's current expectations as well as those of the client's own independent auditor.
Our efforts yielded 1,720 observations. As part of a lessons-learned review, these observations were grouped into 22 different categories. The documents that most frequently contained the observations were identified. The results, in the author's experience, are typical of the problems most companies face.

APPLYING PARETO ANALYSIS TO COMMON VALIDATION PROBLEMS
Through Pareto analysis of the categories of problems, it was discovered that about 80% of the observations were clustered around nine types of deficiencies, as plotted on Figure 1. This case was an exception to the 80/20 rule, in that the top nine problem areas represented about 41% of the categories.
The following were the most frequent deficiencies found:
• Missing information. Documents or records omitted fundamental information or content that should have been included.
• Inconsistency. Documents contained statements inconsistent with other statements about the same topic in the same document or in the same validation package. What's more, no explanation or reason was given for the difference. Jargon, varying terminology, and contradictions in logic frequently caused these kinds of inconsistencies.
• Lack of needed detail. This deficiency applied mostly to requirements documents. The requirements in the validation package did not adequately describe the characteristics of data, user interactions with business processes, or key processes internal to the software.
• Traceability. We found three frequent traceability problems (a short illustrative sketch follows this list):
  - The traceability matrix did not account for a traceable specification or an observation step in a test script.
  - The trace was broken. Either a requirement was barren (lacked descendants or a test) or one of the detailed requirements or test results was an orphan (lacked a parent somewhere in the requirement tree).
  - The traceability matrix was incomplete. Requirement details were not explicitly numbered and traced to associated test steps. Requirements were not traced at a detailed level, so the reviewer needed to infer the detailed links between specifications and steps in a test script.

ABOUT THE AUTHOR
Melvin F. (Frank) Houston is a senior validation consultant with EduQuest, Inc. of Washington, DC. He is a recognized authority on ISO 9000 Quality Standards and Quality System Regulation. Sharon Strause, the column coordinator, is a senior consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. (For more author information, go to gxpandjvt.com/bios.)

Figure 1: Top finding categories. [Pareto chart: bars show the percent of observations in each finding category; a line shows the cumulative percent.]
• Vague wording. Documents used generalities such as "in accordance to an approved procedure," or "applicable regulatory requirements," or "all associated GXP and business processes." In addition, documents used vague words such as "may," "possibly," "more or less," and "approximately."
• Unverifiable test results. Expected results were not described sufficiently so that an independent reviewer could compare and verify actual results. The IEEE Standard for Software Test Documentation, Std 829-1998, Clause 6.2.4 (1) states, "...provide the exact value (with tolerances where appropriate) for each required output or feature." For executed scripts, actual results were not recorded or captured in a way that allowed an independent reviewer to compare them to expected results. For example, "OK" was noted in the actual-result column with no reference to a screen shot.
• Good documentation practice (GDP). The following were three frequent good documentation practice problems:
  - Hand-recorded data and testing evidence, such as test results, were presented in a way that could cause doubts about their authenticity (e.g., cross-outs without initials, date, and reason)
  - Data that confirmed a specific requirement was hard to find in the evidence provided (e.g., a busy screen shot crammed with data)
  - Handwritten corrections were made that changed the sense of a requirement or an expected test result, but no discrepancy report or change request was filed (e.g., changing an expected result from indicator "Off" to "On"). In GDP, hand corrections are allowed without additional documentation only for obvious typographical errors, such as dropped or transposed letters (e.g., correcting "th" or "teh" to "the").
• Incomplete testing. Test scripts did not fully or adequately test the associated requirement.
• Ambiguity. Text could be interpreted more than one way, so it did not establish a single, unique requirement. The words "either" and "or" in a requirement are strong clues the text is ambiguous. (A simple mechanical screen for these wording problems appears at the end of this article.)

ADDITIONAL OBSERVATION CATEGORIES
Beyond these top nine categories, 13 other categories of observations were identified. These category definitions may seem to be somewhat subjective, but for this sort of analysis the objectivity of the definitions was less important than consistency in classifying the observations. For this reason, all the classifications were reviewed several times before locking in the data for the lessons-learned pivot tables. Even so, it was noted that between the Ambiguous and Vague Wording classifications, many observations could have fit in either one.
The following additional categories of deficiencies (i.e., ones that did not rise to the level of our most common findings but were still worth noting) were identified:
• Compound requirement. Requirements that were not unique; that is, the requirement statement actually stipulated two or more system characteristics. (When the predicate of a requirement sentence contains "and" or a series of commas, or when the requirement is presented as a compound sentence or series of bullets, it's probably a compound requirement. This deficiency was often coupled with traceability problems.)


Figure 2: Top document types. [Pareto chart: bars show the percent of observations found in each document type; a line shows the cumulative percent.]
• For your information. Here comments on the potential to improve a document or process were included. The issue that generated the comment may or may not have had an impact on a determination of substantial compliance. Remarks on particularly good examples of documentation or development practice were also included.
• Incomplete requirements. Findings in this category fell into the following four subcategories:
  - The requirement in question implied another requirement, possibly complementary, that needed to be explicit to ensure verification
  - Regulatory impact analysis and risk assessment indicated a need for requirements that were missing from the user requirement specification (URS)
  - Requirements in a software requirements specification (SRS), a software design specification (SDS), or a configuration specification (CS) were not sufficient to address the associated URS item. This deficiency was often associated with a broken trace
  - System and business process analyses indicated the software had functionality that was used but had not been described in the URS
• Rationale. Statements or assertions were made without supporting rationale or justification. Or, the rationale or justification for a particular statement or assertion was not persuasive.
• Lack of acceptance criteria. Test and validation plans did not establish objective criteria based on the outcomes of various tasks in the validation process, such as vendor audit, testing, and problem resolution. The plans did not include criteria for assessing the seriousness of deviations as a basis for the overall evaluation and acceptance or rejection of the test and validation results.
• Lack of process for resolving deviations. A plan, protocol, or script lacked a process for resolving deviations (e.g., failure to meet expected test results, discovery of unanticipated behavior, or deviations from GDPs).
• Questionable statement. A statement appeared to be inaccurate or incorrect.
• Redundant requirement. The same requirement appeared more than once in a specification document.
• Topical inconsistency. The text within a topic pertained to a different topic.
• Typo. Typographical errors were observed.
• Unsupported deviation. The summary document omitted reporting on differences between planned activities and those that were actually carried out.
• Not testable requirement. The requirement was not presented in objective, observable, or measurable terms. In other words, the requirement did not describe a system response or characteristic that a reasonable person could sense or measure.
• Violation. The text set up or highlighted a violation of procedures or regulations.

These categories should be considered nothing more than suggestions or starting points to create a list of observations. As experience is gained, the list may need to be revised to cull out some categories and/or identify new ones.
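Readers who want to reproduce this kind of Pareto breakdown for their own projects can do so with very little tooling. The following is a minimal sketch (Python is assumed here; the category names and counts are illustrative and are not the 1,720 observations discussed above). It tallies findings by category, sorts them, and reports cumulative percentages so the top problem areas stand out, as in Figure 1.

```python
from collections import Counter

# Illustrative observation log: one category label per finding
# (hypothetical data, not the actual project results).
observations = (
    ["Missing information"] * 120 + ["Inconsistency"] * 95 +
    ["Lack of needed detail"] * 80 + ["Traceability"] * 75 +
    ["Vague wording"] * 60 + ["Good documentation practice"] * 55 +
    ["Unverifiable test results"] * 50 + ["Incomplete testing"] * 45 +
    ["Ambiguity"] * 40 + ["Compound requirement"] * 20
)

def pareto_table(labels):
    """Return (category, count, percent, cumulative percent) rows, largest first."""
    counts = Counter(labels)
    total = sum(counts.values())
    cumulative = 0.0
    rows = []
    for category, count in counts.most_common():
        share = 100.0 * count / total
        cumulative += share
        rows.append((category, count, round(share, 1), round(cumulative, 1)))
    return rows

for category, count, pct, cum in pareto_table(observations):
    print(f"{category:32s} {count:4d} {pct:5.1f}% {cum:6.1f}%")
```

The same tally, fed from a spreadsheet export of real observations, is usually enough to decide where remediation effort will pay off first.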


Identifying the Most Vulnerable Documents and Records
Taking the next step to document the lessons learned from this project, the documents and records where the most frequent deficiencies were found were categorized. It was discovered that about 85% of findings were concentrated in six key documentation areas, as shown in Figure 2.
The following were the top types of flawed documentation:
• Specifications (including user requirements)
• Test scripts
• Validation plans
• Test plans
• Trace matrix
• Test results.

Although the exact order of problem areas may differ in any individual organization, it's likely these same six documentation areas will float to the top. From the author's experience, specification documents are usually the biggest pitfall for most companies.

FEWER VALIDATION PROBLEMS AND INSPECTION SUCCESS GO HAND-IN-HAND
After auditing many companies, large and small, and participating in countless remediation projects, it was found that the results described in this article are typical of companies worldwide.
More importantly, the author has seen first-hand that companies that reduce the frequency of these problems with focused remediation efforts are much more likely to weather future FDA inspections. It can be reasonably assumed the same would be true if the frequency of such problems were low in the first place.
It is recommended that companies use these results and definitions to assess their own validation projects, or devise their own categories and charts to pinpoint the company's most common problems. Either way, you'll have a major head start in better allocating validation resources and making needed improvements quickly.

REFERENCES
1. IEEE, IEEE Standard for Software Test Documentation, Std 829-1998, 16 Dec 1998.

ARTICLE ACRONYM LISTING
CS: Configuration Specification
FDA: US Food and Drug Administration
GDP: Good Documentation Practice
SDS: Software Design Specification
URS: User Requirement Specification
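Several of the wording problems cataloged above (vague terms, either/or ambiguity, compound requirements) can be screened for mechanically before a human review. The following is a minimal sketch of such a first-pass check (Python is assumed; the word lists and the sample requirement are illustrative, not a validated rule set).

```python
import re

VAGUE_TERMS = ["may", "possibly", "approximately", "as appropriate", "user friendly"]

def screen_requirement(text: str) -> list:
    """Return potential wording findings for one requirement statement."""
    lowered = text.lower()
    findings = []
    vague = [t for t in VAGUE_TERMS if re.search(rf"\b{re.escape(t)}\b", lowered)]
    if vague:
        findings.append(f"Vague wording: {vague}")
    if re.search(r"\beither\b|\bor\b", lowered):
        findings.append("Possible ambiguity: contains 'either' or 'or'")
    if re.search(r"\band\b", lowered) or lowered.count(",") >= 2:
        findings.append("Possible compound requirement: consider splitting")
    return findings

print(screen_requirement(
    "The system may archive or purge records and notify the user, as appropriate."
))
```

A screen like this does not replace review; it simply flags candidate statements so reviewers spend their time on the judgment calls.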

Originally published in the Summer 2009 issue of Journal of Validation Technology



Janis V. Olson

Accurately Identifying Your Requirements: Will Any Computer System be Right for You?

Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. This column presents information regarding regulatory requirements for the validation and qualification of computerized systems.
Your questions, comments, and suggestions are required to fulfill the objective for this column. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• A clear statement of requirements is fundamental to determining what you want and what you need
• Write your requirements so they are unambiguous, complete, consistent, and testable
• The quality of your computerized system will be a direct result of getting quality requirements written
• All system users should have input into defining the requirements
• Map the current process or processes the computerized system is designed to replace. Incorporate any regulatory, statutory, and/or standards requirements.
• Optimize the process or processes you want to use
• Write your intended uses and requirements for the system in terms of how you will be able to test that the requirements are satisfied
• Write requirements for how the system should not work
• Review all requirements with all levels of users.

INTRODUCTION
Requirements are the foundation for determining what you want and what you need. People, in general, do not write down their needs, wants, and intended uses of the things they buy. Some do extensive research by going shopping, reading information, or searching the Internet. Others buy the first thing that appears to meet their needs. Others buy what everyone else seems to have bought, thinking that if it meets other people's needs, it will satisfy them. Often, different people have different requirements and understanding of what is really needed. The only way to resolve the conflict when purchasing computer systems for regulated industries is through written requirements.
Writing requirements can be very difficult. Vague statements of goals and needs are often expressed. Statements like "user friendly," "easy to use," and "intuitive to the user" are often seen but rarely defined. Requirements must be written so they are unambiguous, complete, consistent, and testable.

DETERMINING THE REQUIREMENTS
The quality of your computerized system will be a direct result of getting quality requirements written. I have not used "user requirements" because those are only one part of all the requirements you need to document. Requirements should specify what the user and business need, not the abilities of the various products available.

ABOUT THE AUTHOR
Janis V. Olson (Halvorsen) is Senior Validation Consultant at EduQuest, Inc., a global team of FDA compliance experts. Sharon Strause, the column coordinator, is a Senior Consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. (For more author information, go to gxpandjvt.com/bios.)


This is the only way to assure the system chosen meets your real needs. Too often, I have seen companies buy a software package or tool to automate one of their critical systems only to find during installation and testing that the system does not meet their needs and does not have a critical (to them) capability. For example, I saw a company try to add, at great expense, the capability of a complaint system before their requirements had been established. A year later, the company gave up and bought a different software package just to handle complaints now that they understood their needs and processes. The total cost of ownership is affected by your ability to identify, right at the beginning, the product that meets the needs of your business and the users in the business.
The following are some steps to get you started in determining the requirements needed.

Have All The Users Of The System Represented
Users are defined as the people who will interact with the system. Users include those who input, change, and review data (i.e., users); receive reports from the system (i.e., users, managers); maintain the system (i.e., information technology department [IT]); manage and change the system (i.e., IT or super users); business owners; etc. Have focused meetings with users to understand their needs and how they see the system operating. Do not have meetings that only include one type of user. Cross-functional meetings are needed to assure that conflicting requirements are identified. Get the users to be specific about their needs and wants. Write down what is said and what the system is required to do.

Map Your Current Process Or Processes
No matter what the current process is, you must understand the flow and interactions both within the system and the interfaces to the system. The current processes may be manual, automated, or a combination of both. Use multiple layers of process mapping to show what is currently done. Include who does what, when, and how, including decision, review, and approval points. Include what is received and what is sent to other processes that are not in the scope of the new computerized system. Understand where the data come from, where the data are processed, and where the results go. Map not only the usual processes but also the exceptions to the current processes when problems arise. As a result, you may discover additional business requirements the new system will need to meet.

Determine Any Regulatory, Statutory, Or Standards Requirements
Write any regulatory, statutory, or standards requirements down individually and not by just referencing other documents or standards. These requirements must be stated in the way that you want them implemented in the computer system. For example, stating that a system must meet 21 CFR Part 11, Electronic Records; Electronic Signatures (ERES) regulations is not specific enough to assure that the system will meet these requirements. You must be specific. For example, some of the requirements for ERES compliance include the following (see reference):
• Each user will have their own user name and password
• The user's login user name and password will be the same as his electronic signature user name and password
• Identification of the individual doing work is from their login
• The computer system will check each user at login to determine the operation that can be done and the files that can be accessed
• All signatures require the user to enter both the user name and password when the user signs a review or approval of an operation. The login process is not linked to the signature (see the sketch below).
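As a concrete illustration of how specific such requirements need to be, the following is a minimal sketch of the signing rule in the last bullet above (Python is assumed; the user store, function names, and hashing scheme are invented for illustration and are not part of the regulation text). The point it demonstrates is that a signature event re-collects and re-verifies the user name and password rather than reusing the login session.

```python
import hashlib
import hmac

# Hypothetical user store: user name -> salted password hash (illustrative only).
USERS = {"jdoe": hashlib.sha256(b"salt" + b"s3cret").hexdigest()}

def _verify(user_name: str, password: str) -> bool:
    """Check a user name/password pair against the user store."""
    stored = USERS.get(user_name)
    supplied = hashlib.sha256(b"salt" + password.encode()).hexdigest()
    return stored is not None and hmac.compare_digest(stored, supplied)

def sign_record(record_id: str, user_name: str, password: str, meaning: str) -> dict:
    """Apply an electronic signature to a record.

    Credentials are re-verified at signing time; an existing login session is
    deliberately not sufficient (the login process is not linked to the signature).
    """
    if not _verify(user_name, password):
        raise PermissionError("Signature rejected: user name/password do not verify")
    return {"record": record_id, "signed_by": user_name, "meaning": meaning}

# Example: signing a review step requires both credentials again.
print(sign_record("BATCH-042", "jdoe", "s3cret", meaning="Reviewed"))
```

Writing the requirement at this level of detail makes it straightforward to test: a reviewer can attempt to sign with a live login session alone and confirm the system refuses.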


Determine The Process Or Processes
What are the efficiencies the new computerized system will be able to provide? If the current system is manual, the process identifies the person doing an operation by his name, number, initial, stamp, etc. that he must write or place on the paper and date. The computer can identify the person based on his login and can apply the date and time the operations are done. The computer can forward information (e.g., data, documents, requests for action, etc.) to the next person to review or approve without the user having to cause this to happen. The computer can also put data in several places, pre-populate fields with standard information, provide instructions to the user when required, etc. Any redundant operations in the current system may be eliminated by the computer if the process is designed correctly. Additionally, this is the time to optimize your process. One company developed a system to automate its documentation and tracking of corrective actions and preventive actions (CAPA) and had implemented over 60 electronic signatures from opening to closing of a single CAPA report. Needless to say, users of the system were extremely dissatisfied and said there was more work using the automated system than doing the same operations on paper.

Write Your Intended Uses And Requirements
Write your intended uses and requirements for the system in terms of how you will be able to test that the requirements are satisfied. Develop scenarios for how the system will be used. These scenarios can be used as part of the performance qualification of the computerized system. Scenarios are often easier for users to review to assure all of their needs are being met by the system. They will help you identify standard operating procedures that will need to be rewritten or written prior to performance qualification.

Requirements For How The System Should Not Work
Write requirements for how the system should not work. Ask the "What if?" question as many times as needed. Conduct a risk analysis for the system and identify mitigations for those risks. Mitigations for the risks identified become requirements of the system. The goal is to assure that the system will fail in a safe manner. Define a safe manner. Safe could mean that the data are not corrupted; that the data are checked for consistency prior to being accepted; the user receives a warning message and instructions on what to do next; the system flags the fields that have not been completed and are mandatory; etc. Again, develop scenarios for how the system will not behave and assure the scenarios are testable.

Review All The Requirements
The reviews should take place on multiple levels. The requirements must be reviewed to assure they are unambiguous, complete, consistent, and testable. Unambiguous requirements are interpreted the same way by each person that reviews them. One company had requirements that appeared, on first reading, to be well written and unambiguous. However, the following were misinterpreted by the system developer:
• Users will have user names and passwords to operate the system
• Users will be operators, supervisors, or quality personnel.
The resulting system was designed so there were only three user names and passwords the system would accept, one for each type of user, not one for each user. Unfortunately, this was discovered during operational qualification and did not meet the intended needs of the company because it was planning on using electronic records. The company had to continue to use its manual batch history records.
Complete requirements cover all aspects of what the system will and will not do. The design of the system will determine what is done by hardware, software, and people following procedures. All the users should review the requirements to assure that all of them have been covered in the requirements document. Consistent requirements do not conflict with one another. For example, one requirement stated that the user will enter the date when the complainant reported an issue. A second requirement stated that the computer will pre-populate the report date of the complaint with the date the complaint was entered in the system. The two requirements are inconsistent with one another. Neither in itself is wrong, but taken together, the two requirements cannot be fulfilled at the same time, and one must be changed. Testable requirements can be tested singularly and together to determine if they are met. For example, stating that the user will enter complainant information into the system without defining the type of information is not testable. As long as any information is entered, no matter what it is, the test would pass, even if there is not enough information to respond to the complainant. Generally, ambiguous requirements are not testable.

SUMMARY
Because the quality of a company's computer system can directly depend upon the quality of the established user requirements, it is important to be as specific as possible when creating a list of written requirements. Requirements should include all user needs and regulatory and standards requirements. Written requirements should be clear, complete, consistent, and testable. Establishing these requirements before a system is purchased can save a company money in the long run.

REFERENCE
FDA, Title 21 Food and Drugs, Chapter I, Food and Drug Administration, Department of Health and Human Services, Subchapter A, General, Part 11 Electronic Records; Electronic Signatures, April 1, 2009. JVT

Originally published in the Winter 2010 issue of Journal of Validation Technology



Barbara Nollau

Computer Systems Quality and Compliance vs. Software Validation

Welcome to Computer Systems Quality and Compliance.


This column discusses the quality and compliance aspects of computer systems
and aims to be useful to practitioners in these areas. We intend this column to be a
useful resource for daily work applications.
Quality and compliance considerations associated with computer systems are
relevant across the life sciences industries. Understanding the requirements and best
practice regarding computer systems is fundamental because much (if not all) of our
data and records are electronically created and maintained, and so many of our daily
operations are automated. Computer systems have rapidly evolved, and industry and
regulatory guidance regarding their use has evolved as well.
This column addresses computer systems quality and compliance with real life
scenarios and challenges in mind. It is our intent to present these topics clearly and in
a meaningful way so that our readers will have a basic understanding of principles, and
then be able to apply these principles in their daily work applications.
Reader comments and suggestions are needed to help us fulfill our objective for
this column. Suggestions for future discussion topics or questions to be addressed are
requested. Case studies illustrating computer systems quality and compliance issues
by readers are also most welcome. We need your help to make Computer Systems
Quality and Compliance a useful resource. Please send your comments and sugges-
tions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com or journal
coordinating editor Susan Haigney at shaigney@advanstar.com.

SUMMARY
The following are key points that should be considered in computer systems
quality and compliance:
An evolution has occurred regarding thinking and terminology from soft-
ware validation to computer systems quality and compliance
Computer systems include software, hardware, operating system, technical
infrastructure, use and maintenance processes, and the people who use
the systems
Computer system quality and compliance includes all the activities
associated with acquiring or developing and deploying a system and then
maintaining it until eventual retirement
A true quality system builds quality in because it is the right thing to do,
not because we are obligated to do so, because obligation typically doesn't
foster the same level of commitment
Computer quality and compliance best practice is to apply quality prin-
ciples and practices with respect to all the elements of the computing
environment across all phases of the system life cycle
When systems or technology services are purchased from outside vendors,
the client company must gain assurance that the supplier has built quality
into the product they are selling
Building quality into the system results in systems that are reliable and
compliant.


INTRODUCTION
This first issue of Computer Systems Quality and Compliance lays some foundational groundwork for the content that will be addressed in future issues. Here we examine the terms "computer systems quality and compliance" and "software validation," and examine why understanding these terms and others can make a difference in how we execute regulatory requirements and industry best practice.

SYSTEM VALIDATION
The term "software validation" has been used for decades. However, it is really not possible to validate software alone. Software must be installed on some hardware, with an operating system, and in many cases some level of technical infrastructure is also required. Additionally, there are processes associated with the use and maintenance of the system, and people who use and maintain it. All of these elements must be part of validation in order to provide a high degree of assurance that the system will do what it is supposed to do, not do what it is not supposed to do, and continue to operate that way in the future. Considering all of these elements of the computing environment, the term "system validation" is closer to the mark.
Additional regulation and guidance regarding the use of computer systems in regulated industry was introduced in the late 1990s and early 2000s. Industry's way of thinking also matured over time regarding the concept of building quality in rather than testing it in. This broadened our horizons wider still, and the computer compliance school of thought was born. This term reflects that there is a bigger picture that goes beyond the scope of what is traditionally known as validation: a bigger picture that includes all the activities associated with acquiring or developing and deploying a system and then maintaining it over time until eventual retirement.
Application of a quality systems approach led us to understand that compliance should not be our driver. Compliance should be an outcome of good quality, not the reason to do it. If we do something because it is a requirement, are our hearts really in it? We should build quality in because it is the right thing to do, not because we are obligated to do so. This difference is what separates a true quality system from a set of rules to which people are not really committed.

COMPREHENSIVE QUALITY AND COMPLIANCE BEST PRACTICE
Computer quality and compliance best practice is to apply quality principles and practices with respect to all the elements of the computing environment (e.g., software, hardware, infrastructure, people, processes) across all phases of the system lifecycle (e.g., planning, requirements, design, build, test, implement, maintain, retire). This way we ensure a comprehensive approach that builds quality in from the beginning and results in compliant outcomes.
In the past 10-15 years, software suppliers really began to recognize the needs of the life sciences industry. There is now a wide variety of commercial off-the-shelf software available in the market. Because of this, many life sciences companies are realizing that they no longer need to be in the software development business. They are, therefore, moving from a "build" to a "buy" philosophy regarding computer systems. This changes the face of building quality in a bit, but it doesn't eliminate the concept. Rather, the client company's job is now to gain assurance that the supplier has built quality into the product they are selling, and that we carry that baton through the implementation and maintenance of the system. This supplier assurance is gained via assessments of the quality system in place at said supplier, the longevity and history of the package they are developing and selling, as well as the ongoing collaborative relationship with the supplier. Similarly, building quality in also applies to the outsourcing of information services/information technology services and the use of application service providers. In these cases, we must gain assurance that the third party has built quality in via their own quality system. If there are deficiencies that would make the solution non-compliant, it is the client company's responsibility to either mitigate those deficiencies through additional testing and/or controls, or move on to a different supplier or third party.

INTERNAL PROGRAMS
In terms of a company's internal computer quality and compliance program, the following are key components:
• Software, hardware, and infrastructure procedures. There should be procedures in place to ensure that controls and quality attributes apply to not just system software, but also to hardware and to infrastructure. One example of this is qualification of the technical infrastructure and maintenance of the infrastructure under change control. Another example is having procedures in place to cover backup and restoration, disaster recovery and business continuity, and security, all measures that help ensure ongoing data availability and integrity.
• Processes and people. The processes and people associated with the system need controls and quality attributes applied as well. Examples of this are training for users of the system and for the personnel who maintain it, and having procedures in place that cover the proper use and maintenance of the system itself.

Additionally, the following quality practices should be applied across all phases of the system lifecycle:
• Planning. In the planning phase, a quality representative should be involved to ensure that activities like supplier assessments and creation of validation and quality-related deliverables are adequately planned.
• Requirements. In the requirements gathering stage,


any requirements to fulfill regulatory expectations or necessary quality controls and checkpoints should be included. Requirements should also be testable.
• Design and build. In the design and build phases, any required standards should be followed, and system configuration and/or code should be adequately documented for traceability and ease of maintenance.
• Test/validation. The test or validation phase is typically the phase associated with quality. However, this is just a confirming event meant to demonstrate the quality built in, and assure sustainable quality operation of the system.
• Maintenance. In the maintenance phase, practices such as change control and configuration management, problem reporting and resolution, and ongoing controlled operation of the system are all ways we sustain quality and the validated state of the system over time.
• System retirement. Finally, at system retirement, planning and execution of decommissioning activities must also have quality built in, to ensure proper disposition and accessibility of data and records, controlled transitions to other systems when applicable, and a compliant decoupling of the retired system from the infrastructure and any interfacing systems.

Building quality into the system across all the components of the computing environment, and throughout all the phases of the system life cycle, results in systems that are reliable and are also compliant with today's regulatory expectations.

ABOUT THE AUTHOR
Ms. Nollau is a Director of Quality Services at Abbott Vascular, responsible for validations, reliability engineering, supplier quality, microbiology, and document management. Ms. Nollau has 25 years of experience and increasing responsibility in the pharmaceutical and medical device industry, spanning the areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached via e-mail at barbara.nollau@av.abbott.com.

Originally published in the Winter 2009 issue of Journal of GXP Compliance



Farhad Forozesh

Computer Systems Change Control

Computer Systems Quality and Compliance discusses the quality and com-
pliance aspects of computer systems, and aims to be useful to practitioners in
these areas. We intend this column to be a useful resource for daily
work applications.
Reader comments, questions, and suggestions are needed to help us fulfill
our objective for this column. Please send your comments and suggestions to
column coordinator Barbara Nollau at barbara.nollau@av.abbott.com or journal
coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
In this issue of the column, the following key points are discussed:
Change control as good business practice
The importance of having a change control process in place
Regulatory compliance drivers for change control
Developing a change control procedure and pro-
cess for computerized systems
Determining the level of re-testing required
Different types of change control and value in consistency.

INTRODUCTION
Change control is a common term describing the process of managing
how changes are introduced into a controlled system. Experts agree
that most problems of software and computer systems are introduced
when changes are made either during development or during use of the
systems. Change control is required to ensure that validated systems
remain under control even as they undergo changes.
Changes to the system are likely to disqualify the original valida-
tion if not performed and tracked carefully. Lack of documentation for
changes and testing after changes is one of the most frequently cited
deviations during internal or external audits. A robust change control
process must be in place to prevent unfavorable or non-compliant out-
comes as a result of change to systems.

CHANGE CONTROL PROCESS


Computer systems are not static and they do require a robust main-
tenance program soon after the initial validation. A change control
procedure is critical to ensure that changes are assessed, documented,
performed, and tracked consistently across the organization. This
procedure should define the process to be followed for assessing and
implementing the changes.
The change control process is typically defined by proposing the
need for the change, pre-approval and planning, executing the change,
and final approval/implementing the change. Change completion is
then documented. Figure 1 describes the change control process.


Figure 1: Change control process flow. [Flow diagram: Proposed Change, then Pre-Approval and Planning, then Executing the Change, then Final Approval/Implementing the Change.]

Proposed Change
The change requestor formally requests a change to the system (usually via a form or online entry point). Change requests need to be evaluated to ensure they are appropriate and that the proposed change will not negatively impact any other aspect or capability of the system. Then it should be determined whether the change should be classified as an emergency or routine change. Some companies also have a third category for non-essential changes that may be batched. This classification will indicate the required timing of implementation and associated activities. It is essential that the change control procedure provide an expedited pathway for emergency changes.
Often, emergency changes are needed to correct software problems or restore processing operations quickly. Although the changes must be completed in a short timeline, they must be implemented in a well-controlled manner. Emergency changes should be subject to similar controls as routine changes. However, the process may be abbreviated relative to the change request, evaluation, and approval to ensure changes can be made quickly. The process should be designed to ensure affected parties complete detailed evaluations and documentation of the emergency change(s) as soon as possible after implementation. Whenever possible, emergency changes should be tested prior to implementation. If IT is unable to thoroughly test emergency modifications before installation, it is critical that they appropriately back up files and programs as well as have a back-out plan in place.

Pre-Approval and Planning
A cross-functional team should determine how the change might affect the system before the change is made. This cross-functional team should include the system owner (or business area representative delegated by the system owner) and other key contributors including, but not limited to, quality assurance (QA) and IT. Depending on the nature and assessed impact of the change, the level and rigor of documentation and testing will likely vary.
Approval to move forward with the change must occur before any changes to the system are made. In an urgent situation (emergency change) a change might be granted prior to completion of the formal change control process. In that case, the type of change must be documented in the same manner. The decision whether to accept or reject a change would be based on a number of rules. The fundamental logic should be as follows:
• Is the change unavoidable?
• Does the change increase the overall benefit to the organization?
• Is the project team able to make such a change?
• Is the change best done now, or would it be more beneficial to defer it?
• Is the change going to impact other areas or systems?

An objective process should be in place to determine the magnitude and complexity of the proposed change as well as the level of impact it will have on the system. This understanding will lead to the determination of the required documentation rigor. This impact determination will also help with determining the level of testing required for the system. Some companies categorize changes as major, minor, etc., which can enable more consistent decision-making if each category is managed consistently.
By reviewing the original validation requirements associated with the changing functionality and any related functionality, and evaluating any potential new risks that might be introduced through the changes to the system, the focus and level of re-testing can be determined. This is often referred to as a regression analysis. Additionally, the Traceability Matrix (TM) is a document that formally links requirements to design and testing throughout the validation process, and it can be a practical tool to help determine regression testing as well (a small sketch of this appears at the end of this section).
The regression analysis would indicate the functionality that requires regression testing as well as a solid rationale for excluding those functions that are not impacted by the change.
The following documents should be assessed for potential impact due to the change, and updates should be planned, where required:
• Validation package including user requirements specification (URS), technical requirements specification (TRS), TM, design qualification (DQ), installation qualification (IQ), operational qualification (OQ), performance qualification (PQ), and validation plan and report
• Design documentation
• Procedures for using and maintaining the system.

In some cases (e.g., for large or complex changes, or due to cumulative change over time), a complete rewrite of certain affected documents may be necessary in lieu of addenda or point revisions.
Changes should be planned and executed cross-functionally, minimally involving IT, QA, and the business area owning the system. Changes should be communicated to all impacted areas and functions.
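The traceability matrix lends itself to a simple mechanical first pass at the regression analysis described above. The sketch below (Python is assumed; the requirement IDs, test names, and matrix contents are invented for illustration) walks the requirement-to-test links for the functions touched by a change and returns the tests to re-execute, along with the requirements excluded on the rationale that they are not impacted.

```python
# Hypothetical traceability matrix: requirement ID -> test scripts that verify it.
TRACE_MATRIX = {
    "URS-001": ["TS-LOGIN-01"],
    "URS-002": ["TS-REPORT-03", "TS-REPORT-04"],
    "URS-003": ["TS-AUDIT-02"],
}

# Hypothetical impact map: requirement ID -> requirements it feeds or shares data with.
RELATED = {"URS-002": {"URS-003"}}

def regression_scope(changed_requirements):
    """Return (tests to re-run, requirements excluded from re-testing)."""
    impacted = set(changed_requirements)
    for req in changed_requirements:
        impacted |= RELATED.get(req, set())          # pull in related functionality
    tests = sorted(t for req in impacted for t in TRACE_MATRIX.get(req, []))
    excluded = sorted(set(TRACE_MATRIX) - impacted)  # rationale: not impacted by the change
    return tests, excluded

tests, excluded = regression_scope(["URS-002"])
print("Re-test:", tests)        # tests tied to changed and related requirements
print("Not impacted:", excluded)
```

The output of such a pass is a starting point for the cross-functional team, not a substitute for its judgment about newly introduced risks.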


Executing the Change
In the execution phase, the change is actually made in a staging environment so it can be tested before production implementation. The change (and other aspects of the system that may have been affected) is tested to ensure the system's accuracy, reliability, and consistent intended performance. The testing must be documented, and the results should either lead to corrections and additional testing, or confirm that the end result after the change is what was intended. The documentation associated with the change should also be completed.
Changes should initially be implemented away from the production environment of the validated system. This will ensure that no changes are made to the production environment until they have been fully qualified and found to be functioning as expected. Relative to computer systems, it is advisable to have several virtual environments defined in the architecture landscape. Typical environments are described in Figure 2 and discussed as follows:
• Development environment (sometimes referred to as "Sandbox"): a virtual environment where experimental coding/configuration takes place, as the developer/configurator is trying different solutions, doing preliminary unit testing, etc.
• System testing: a virtual environment used for preliminary systems testing conducted by IT
• Validation: a virtual environment that is frozen and representative of production, set up for validation testing, and controlled as unchangeable throughout validation testing
• Training (not always used by all companies for all systems): a virtual environment used for hands-on training on the new or revised system
• Production: the live business environment or instance of the system.

Figure 2: Virtual environments. [Diagram of the Sandbox, System Testing, Validation, Training, and Production environments, spanning IT and Quality.]

Testing should verify the following:
• System performs as expected after the changes were made
• System's original functionality continues to work after the changes were made
• New changes do not introduce errors that keep the system away from performing as intended.

Final Approval/Implementing the Change
Final approval to release the new version to production is granted based on successful test results and completion of the documentation package. If training is required, affected personnel (e.g., users, super-users, IT support) must either be trained before they are able to access and use the system or before the implementation into the production environment. Final approval is typically granted by the system owner and approval authorities from QA and IT. The new version of the system/software is then released to the production environment. This can be done via login script or other means.
It should be noted that in the event of an audit that includes inspection of any computer system used for a regulated purpose, inspectors will typically review the system documentation, including records of changes. This review will help them to determine the level of change and consistency in decision making and documentation, both within the system and across systems.
The change control documentation produced will demonstrate the ongoing validated state of the system. Changes must be controlled and well documented throughout the process.

CONCLUSION
The change control process is important to ensure compliance and avoid a potential risk and possibly a business liability. An objective decision-making process should be used to determine the level and complexity of the proposed change. The level of impact that the change might have should also be determined, and stemming from that, the required documentation rigor. Additionally, an objective process will enable consistent management of all types of changes.
A change control process is necessary to prevent inappropriate modifications or modifications that lead to adverse effects. Effective change control is an important aspect of maintaining the validated state of the system, enabling continuous improvement, and preventing compliance gaps.

REFERENCES
H. Ronald Berlack, Software Configuration Management, John Wiley and Sons, 1992.
Ofnisystems, "Change Control for Validated Systems," Ofnisystems.com, http://www.ofnisystems.com/Validation/Change_Control_for_Validated_Systems.htm, accessed 9/13/2010. GXP

ARTICLE ACRONYM LISTING
DQ: Design Qualification


IQ: Installation Qualification
IT: Information Technology
OQ: Operational Qualification
PQ: Performance Qualification
QA: Quality Assurance
TM: Traceability Matrix
TRS: Technical Requirements Specification
URS: User Requirements Specification

ABOUT THE AUTHOR
Farhad Forozesh has 13 years of experience in the pharmaceutical and medical device industry. Farhad is a senior validation engineer at Abbott Vascular responsible for coordinating validation activities (equipment/software validation projects) with all primary and support groups and providing technical leadership and guidance. Farhad can be contacted by e-mail at farhad.forozesh@av.abbott.com.

ABOUT THE COLUMN COORDINATOR
Barbara Nollau is a director of supplier and alliances quality at Abbott Vascular. She has 26 years of experience and increasing responsibility in the pharmaceutical and medical device industries, spanning the areas of manufacturing, quality assurance and compliance, validation, and information technology. Ms. Nollau can be reached by e-mail at Barbara.nollau@av.abbott.com.

Originally published in the Autumn 2009 issue of Journal of GXP Compliance



Frank Houston and Mark Weinglass

How to Right-Size Computer System Validation Based on Criticality and Complexity

Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. It brings information regarding regulatory requirements for the validation and qualification of computerized systems.
Reader questions, comments, and suggestions are required to fulfill the objective for this column. Case studies illustrating principles submitted by readers are welcome. Please send your comments to column coordinator Sharon Strause at SharonStrause@EduQuest.net or to journal coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• Validate for intended use utilizing criticality and complexity
• Use criticality and complexity to determine documentation deliverables for computer system validation
• Plan considering regulatory impact and validation deliverables utilizing criticality and complexity input.

INTRODUCTION
As we are all aware, validation of computerized systems can generate a lot of documents. Because today's systems are highly interconnected, it is not easy to determine when or where to stop validating. If we are not careful, we will end up validating the universe. A rational process for generating a list of target systems and validation deliverables will go a long way toward streamlining the validation process.
The approach we are advocating consists of the following four steps:
• Assess regulatory impact
• Assess criticality
• Assess complexity
• Plan validation deliverables.

ASSESS REGULATORY IMPACT: GXP AND NON-GXP ANALYSIS
Computerized systems have modules for a wide range of business and production activities. The functionality of each module may or may not affect data and decisions about product quality or safety.
To make certain all functions are validated if they affect GXPs and simultaneously to avoid unnecessary documentation, a GXP impact assessment is performed for each of the application's modules. This step establishes your target systems or functionalities.
The following computer system functions have regulatory impact and need to be analyzed further (see the sketch after this list):
1. Create, maintain, or preserve records or documentation required by GXP regulations (the system provides the information but not the answer)
2. Create, maintain, or preserve records or documentation needed for product quality and safety decisions (the system provides the information but not the answer)
3. Automation of GXP, product quality, or product safety decisions (the system provides the answer)
4. Output data to other system modules or external systems having any of the functions described in 1, 2, or 3
5. Processes input data from other system modules or external systems having any of the functions described in 1, 2, or 3.
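A simple way to capture this module-by-module screening is to record, for each module, which of the five functions above it performs; any module with at least one flag is carried forward as a validation target. The sketch below (Python is assumed; the module names and flags are invented for illustration, not drawn from any particular system) shows the idea.

```python
# Hypothetical module inventory: module -> which of the five GXP-impact
# functions (1-5 above) it performs.
MODULES = {
    "Batch record":   {1, 2},   # creates/maintains GXP records
    "Release engine": {3},      # automates a product quality decision
    "LIMS interface": {4, 5},   # exchanges data with GXP functions
    "Cafeteria menu": set(),    # no GXP impact
}

def gxp_assessment(modules):
    """Split modules into GXP targets and non-GXP, with a simple rationale."""
    targets, non_gxp = {}, {}
    for name, functions in modules.items():
        if functions:
            targets[name] = f"GXP impact via function(s) {sorted(functions)}"
        else:
            non_gxp[name] = "No GXP record, decision, or data-exchange function"
    return targets, non_gxp

targets, non_gxp = gxp_assessment(MODULES)
print("Validate:", targets)
print("Out of scope:", non_gxp)
```

The output is the deliverable described in this step: a list of GXP and non-GXP modules with a rationale for each conclusion, ready to be supplemented with predicate rule citations.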

ABOUT THE AUTHORS
Frank Houston is a senior validation consultant for EduQuest, Inc. His career includes digital design, clinical engineering, and biomedical engineering. Mr. Houston has done software quality auditing and consulting for clients of all sizes in both the medical device and pharmaceutical industries. Mark Weinglass is a senior validation consultant for EduQuest. He has over 25 years of professional experience in the design, development, and validation of computerized process instrumentation, control systems, medical devices, and related project management activities in the FDA-regulated industries. (For more author information, go to gxpandjvt.com/bios.)


Best practices for a GXP and non-GXP analysis include the following:
• Evaluate each function or module for GXP, product quality, and product safety impact
• List each function or module of the system as either GXP or non-GXP with rationale for each conclusion
• Cite the applicable section of the predicate rule (21 CFR xxx.xxx or other regulatory agency rules) for the GXP modules or functions
• Specify the affected product quality or product safety feature for the listed modules.

The list of GXP-related system functions or modules described above is the deliverable for this step of Criticality and Complexity analysis. If the system has regulatory impact, proceed with Criticality and Complexity assessment to plan for the validation deliverables.

ASSESS CRITICALITY
System criticality rests on the following three factors:
• Safety risk: prevention of harm
• Quality risk: meeting all documented requirements
• Business risk: cost and feasibility of production and service.

Critical functions can be identified and overall system criticality can be estimated by evaluating each computer system or function. It may be that only certain modules, functions, or parts of the system are covered by the GXP regulations, but errors or failures may have other negative effects on the business.
Determine the potential consequences of failure by asking the following questions:
• What could happen if the computer system fails to function as specified?
• What could happen if the failure goes undetected?

Consideration of failure to function as specified must include more than complete failure. Computational errors, such as incorrect calculations, must be considered as well.
In GXP regulated systems, the following attributes are most important:
• Correctness and accuracy of the data a system acquires, stores, or transmits
• Long-term integrity of GXP data stored by the system
• Correctness and consistency of automated decisions over the full range of input conditions.

The following are typical characteristics of critical functions in computer systems:
• The work process has no alternative methods to perform the needed functions
• Alternative methods for the work process are impractical or grossly inefficient
• The work process has no check or verification steps to detect failures and defects
• The system performs multiple GXP decision functions
• The system generates or acquires primary (original) data
• The system generates, stores, and preserves electronic records
• The system uses electronic signatures
• The system controls critical process parameters
• The system controls user access and privileges via levels of user authorization
• Work-around methods exist for the required functions, but they are noticeably less efficient than the automated method
• The work process has some checks and verification steps to detect failures, but the detection process has demonstrated a marginal capability index
• The system operates on or transmits electronic records.

ASSESS COMPLEXITY
A complex computer system has more opportunities for failure than a simple one. Therefore, it requires more effort to validate. The complexity of a computer system is not based solely on the complexity of the technology. Complexity depends on many factors including the following:
• Intricacy of the underlying work process
• Sophistication and interconnectedness of the computer programs involved
• Familiarity of the staff with the system or systems like it
• Extent of computer infrastructure changes needed to implement the system.

Determining the complexity of a computer system requires input from both the customer(s) and the supplier(s) of the system. The customer needs to understand the complexity of the underlying work process; that is, the system requirements. The supplier must understand how the system (or software) would perform the functions in order to meet these requirements; that is, what goes on inside the computer system.
Some characteristics of a complex computer system are as follows:
• Performs complicated algorithms or calculations
• Interacts with multiple computer systems or programs


• Performs extensive and complicated data input checking or control
• Processes numerous types of transactions
• Requires extensive support to maintain the system
• Involves large numbers of users
• Includes significant customization of a standard software package through configuration or addition and modification of the source code.

CRITICALITY, COMPLEXITY, AND RISK
Figure: Risk vs. criticality vs. complexity.
As the Figure shows, criticality and complexity combine somewhat like severity and probability do in risk assessment. In fact, this analysis is a good starting point for a systematic risk assessment. Take your number of Yes answers for criticality and multiply it times the number for complexity, and the result gives you a rough initial risk factor estimate to use for planning your validation deliverables.

PLANNING THE VALIDATION DELIVERABLES
It is important to remember that standard operating procedure (SOP) documents are never optional, and plans should not be used as substitutes for SOPs. The following tasks must be addressed in SOP documents:
• Software acquisition, development, and implementation
• Risk assessment
• Validation
• Supplier assessment (including audits)
• Change control
• Design
• Code review
• Testing.

To begin validation planning, consider the following questions.
In your system are there criticality issues with:
• Patient safety?
• Product quality?
• Production operations (usability or efficiency for example)?
In your system are there complexity issues with:
• The work process?
• The computer programs or the equipment to be used?
• Staff familiarity with the programs or equipment?
• Infrastructure changes needed?

Count up the number of Yes answers in each category and calculate a rough risk factor by multiplying them together.
The risk factor calculation should result in a number between 0 and 12. The lower the number, the more documents you can combine. For a simple spreadsheet, risk factor 0 to 2, one should be able to do nearly all the documentation needed within the spreadsheet itself with maybe one or two other documents or files to cover change control and decommissioning.
Validation records must cover the following:
• Development or acquisition planning
• Supplier assessment (up to and including supplier audit)
• User requirements
• Ongoing risk assessment
• Functional requirements
• Design documentation
• Design verification (including reviews)
• Qualification of software implementation, including the following tests as needed:
  - Installation
  - Operation
  - Performance
• Traceability
• Change control and maintenance of validation status.

Use the risk factor number to set an initial goal for the number of documents or files to produce as evidence of validation. Remember, SOPs do not count as validation records.
Documents that combine easily include the following:
• System development plan and validation plan
• Requirements documents
• Test plans, protocols, and associated reports
• Test report and validation report
• System implementation report and validation report
• Installation qualification and operational qualification.
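As a rough worked example of the scoring approach described above, the short sketch below multiplies the criticality and complexity Yes counts to get the 0 to 12 risk factor. The function name and the sample numbers are illustrative assumptions, not taken from the article (the article itself only calls out the 0 to 2 spreadsheet case).

def initial_risk_factor(criticality_yes: int, complexity_yes: int) -> int:
    """Rough planning estimate: Yes answers for criticality (0-3) times Yes answers for complexity (0-4)."""
    return criticality_yes * complexity_yes

# Example: patient safety and product quality concerns (2 criticality Yes answers) on a
# customized, interconnected application (3 complexity Yes answers) gives 2 * 3 = 6,
# well above the 0-2 range where a simple spreadsheet approach to documentation suffices.
print(initial_risk_factor(2, 3))  # 6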


A generalized validation procedure with a validation report form could be developed for the simplest, least critical systems. These are rarely found in practice, but they should be used more often. Many validations are fairly routine activities, and they do not require extensive plans and reams of documentation.

CONCLUSION
With some careful homework and a few rules of thumb, one can cut validation effort down to size. "Validate for Intended Use" becomes easier with good planning and use of the criticality, complexity, and risk processes. JVT

Originally published in the Autumn 2010 issue of Journal of Validation Technology


Practical Use of Automated Tools in Computer System Compliance
Jae Burnett

Computer Validation Forum discusses topics and issues associated with computer validation in order to provide useful resources for daily work applications. It brings information regarding regulatory requirements for the validation and qualification of computerized systems.
Reader questions, comments, and suggestions are required to fulfill the objective for this column. Case studies illustrating principles submitted by readers are welcome. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• This discussion addresses the use of enabling technology in computer system validation (CSV) projects to most efficiently achieve the validated state in a pragmatic, cost-effective manner
• Requirements definition management (RDM) and automated testing software are used regularly for the validation and verification of embedded software in the design and development process for medical devices
• GAMP 5 (March 2008) states that automated CSV testing tools can be used to improve test execution efficiency and effectiveness
• Automated CSV tools provide the most benefit for larger enterprise applications such as enterprise resource planning, document management systems, laboratory information management systems, corrective action and preventive action, and so on
• Organizations should consider a formalized validation plan for each tool or set of tools to describe the risk, use, and validation or qualification requirements to maximize benefits
• The organization's information technology (IT) strategic vision is one way to define how to identify, select, prioritize, plan, and implement automated tools for computer system validation. These IT initiatives can realize significant value by the adoption and integration with the computer system compliance process.

ABOUT THE AUTHOR
Ms. Jae Burnett is a senior manager at Deloitte & Touche, LLP with 10 years experience in the pharmaceutical, biotech, and medical device industries and extensive knowledge of required system controls and processes to comply with FDA regulations 21 CFR Parts 210, 211, 820 and Part 11. Sharon Strause, the column coordinator, is a senior consultant with EduQuest, Inc. Sharon may be reached at sastrause@aol.com. For more author information, go to gxpandjvt.com/bios.


INTRODUCTION
For those of us working in US Food and Drug Administration-regulated industries, computer system validation (CSV) has been the long-standing practice of establishing documented evidence that a specific process will produce, with a high degree of assurance, a product meeting its predetermined specifications and quality attributes. The FDA definition of validation rolls effortlessly off our tongues when those not familiar with the discipline ask. And, as we continue into a more detailed explanation of the validation lifecycle, the eyes of those who ask the question begin to glaze over as we cite regulatory references and enthusiastically dive deeper into the details of how validation is accomplished. Invariably, those discussions include terms such as controlled processes, risk assessment, documented requirements, and documented testing results that typically are met by the manual methods of CSV. The outcome of a validated computer system is for the benefit of the organization's use of an enabling technology in a regulated process. Typically, the organization or business unit is using technology to transform or improve manual or inefficient business processes. Yet, the process of CSV has historically been mostly manual and paper driven. However, the use of enabling technology in CSV projects can serve industry well as a way to achieve the validated state with gains in efficiency. Careful consideration and purposeful application of the appropriate technology tools for your organization can help you gain greater control and streamline processes in a pragmatic, cost-effective manner.

CSV EVOLUTION
Historically, the validation process for computerized systems can be time consuming and, if not focused properly, can become an exercise in documentation. The pharmaceutical, biotech, and medical device industries have made great progress in reducing unnecessary or minimal-value validation by adopting a risk-based approach to CSV. Most companies now recognize the value of using a risk-based approach as a means to identify the systems and system functions that fall under FDA predicate rules and are subject to validation requirements. The next major improvement for CSV is the adoption and implementation of enabling technology for use in the validation process. Automated tools supporting the management of requirements, configuration, change, and documentation, as well as automated testing, can be leveraged if integrated into the validation process appropriately. The industry is traditionally risk-averse, but the adoption of enabling technology for CSV is increasing as companies look to take advantage of the benefits it can offer.
Using automated tools to support the validation lifecycle is easier said than done. Other industries not subject to 21 CFR Part 11 have benefitted from using these tools to aid in the system development lifecycle for years. These industries, however, do not have concerns of electronic signatures, audit trail, qualification requirements, and formalized procedures. These additional requirements should not be seen as roadblocks to using automated tools in the CSV process, but they do need to be addressed. Requirements definition management (RDM) and automated testing software are used regularly for the validation and verification of embedded software in the design and development process for medical devices. The value of these automated tools is quite high considering the criticality, complexity, and volume of software components used in medical devices.
GAMP 5, A Risk-Based Approach to Compliant GXP Computerized Systems, released in March 2008, addresses automated testing. The current industry guide states, "Automated test execution tools can be used to improve test execution efficiency and effectiveness" (1). The guide continues, "Any use of automated test execution tools should be defined in the test strategy. Tools should be used in accordance with defined instructions and manuals as appropriate, and the tool should be held under Configuration Management." Commercial or established tools are normally considered to be GAMP Category 1. GAMP 5 goes on to explain that if an automated testing tool is used on a GXP regulated system, it becomes subject to specification and verification based on risk. While GAMP 5 doesn't focus on all types of automated tools, these principles should be applied when considering automated tools for GXP systems.

APPROPRIATE USE OF AUTOMATED CSV TOOLS
Automated tools are not the answer for all systems; the effort to qualify a tool used for validation of a smaller-scale system could actually be greater than performing and controlling the validation activities manually. Automated tools in CSV provide the most benefit for larger enterprise applications such as enterprise resource planning (ERP), document management systems (DMS), laboratory information management (LIMS), corrective action and preventive action (CAPA), and complaint handling. The validation effort for these types of systems can be significant, as can the on-going sustainment activities such as system maintenance and the introduction of changes via the change control process. For example, automated testing tools, such as HP Quality Center, provide a central repository for the qualification protocol test cases that can be easily reused in support of change control, and they offer functionality to easily capture and record test results. Defect management functions offered in automated testing software are another benefit to the validation process and allow for a real-time view of the status of defects, resolution activities, and results of retesting.

COMMERCIAL AUTOMATED CSV SOFTWARE
Automated testing software does present certain real challenges when used as part of the validation testing for regulated computer systems. HP Quality Center does not have built-in electronic signature capabilities. Life science companies can use a third-party software component to provide the capability or can consider the formal approval and control of test cases outside of the system, essentially a hybrid approach between automation and manual processes. Formalized procedures should also be established to define use of the tool in the validation process and detail any manual controls.
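To make the test-repository and traceability ideas above concrete, here is a minimal sketch of how a team might record the link from a requirement to a test case, its result, any defects, and a manual approval reference for tools without built-in electronic signatures. The classes and field names are illustrative assumptions, not features of HP Quality Center or any specific product.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    test_id: str
    requirement_id: str            # traceability back to the user/functional requirement
    result: Optional[str] = None   # "pass" / "fail" once executed
    defect_ids: List[str] = field(default_factory=list)
    approved_by: Optional[str] = None  # wet-ink or third-party e-signature reference

# Example: one requirement traced to one executed, approved test case
tc = TestCase(test_id="OQ-014", requirement_id="URS-4.2", result="pass",
              approved_by="QA reviewer, signed on paper 2009-08-01")
print(tc.requirement_id, "->", tc.test_id, tc.result)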


Another automated tool becoming more widely leveraged for ERP computer system validation and compliance is SAP's Solution Manager (2). The system is a centralized tool for supporting the SAP ERP software suite installation. Several components and functionality of Solution Manager are leveraged for the validation and on-going compliance of SAP. Solution Manager's functionality includes implementation and configuration information, change management, testing, and application support, among others. Taking advantage of the integrated functionality to support the compliance activities can provide big benefits to a life science company. For example, leveraging Solution Manager as a repository for application configuration provides the traceability to specifications. The change request component of Solution Manager, also known as ChaRM, links the change request, approval, test, and migration of the change through the SAP system landscape with its integration with the transport management system. SAP also offers adapters with third-party applications (3, 4), such as HP Quality Center, adding additional testing functionality. Full traceability between the change request and the production system build is realized. Life science companies recognize the business and compliance value of this tool and make plans early in the lifecycle of their SAP implementations to validate or qualify Solution Manager.

STRATEGY FOR USE OF AUTOMATED CSV TOOLS
Before an automated tool can be used in the CSV process or other compliance activities of a GXP-regulated system, planning and assessment of any tool should be considered. The functionality of the tool and its intended use will determine the extent of validation or qualification requirements. Organizations should consider a formalized validation plan for each tool or set of tools to describe the risk, use, and validation or qualification requirements. Operating procedures should also be in place to detail system administration, configuration management, and any other control processes.
As automated tools become more widely used, life science companies can take advantage of the benefits and leverage the technology for CSV and compliance. A pragmatic approach to integrating automated tools in the validation process should be taken. The effort to qualify a tool for validation of smaller-scale systems could be greater than the effort to validate manually. Identification and selection of enabling tools should be carefully considered. Companies need to have a clear vision of how these tools are used to sustain system compliance and provide benefit to the organization. A computer system compliance roadmap that complements the organization's IT strategic vision is one way to define how to identify, select, prioritize, plan, and implement automated tools for computer system validation. The compliance roadmap aligns with the IT strategic plan and offers a method to lay out the compliance activities and tools for computer system validation and operational controls for GXP systems. Organizations that involve members from the IT, business, and compliance groups will benefit from early assessment and planning for enabling tools for GXP systems. Many organizations already have a level of competency for automated tools used for supporting non-GXP systems.

CONCLUSION
Automated tools can have a real impact on computer system compliance and serve as a way to gain greater control and efficiencies. Life science companies should consider the various tools supporting the management of requirements, configuration, change control, documentation, and automated testing as real options. These IT initiatives can realize significant value by the adoption and integration within the computer system compliance process.

REFERENCES
1. ISPE, GAMP 5, A Risk-Based Approach to Compliant GxP Computerized Systems, ISPE, page 207, 2008.
2. Components and Tools of SAP Netweaver: SAP Solution Manager, http://www.sap.com/usa/platform/netweaver/components/solutionmanager/index.epx.
3. SAP Solution Manager Adapters, http://www.asug.com/Search/SearchResults/tabid/211/Scope/All/Default.aspx?Search=solution+manager+adapters&ResultTypes=38,102,2.
4. Pharmaceutical Online, "Genilogix Announces Availability of Validation Accelerator with e-Signature for the Latest Version of HP Quality Center," Pharmaceuticalonline.com, December 15, 2008. http://www.pharmaceuticalonline.com/article.mvc/Availability-Of-Validation-Accelerator-0001. JVT

NOTE: This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor.
Deloitte, its affiliates, and related entities shall not be responsible for any loss sustained by any person who relies on this publication.

ARTICLE ACRONYM LISTING
CAPA  Corrective Action and Preventive Action
CSV  Computer System Validation
DMS  Document Management Systems
ERP  Enterprise Resource Planning
FDA  US Food and Drug Administration
IT  Information Technology
LIMS  Laboratory Information Management Systems
RDM  Requirements Definition Management
Originally published in the Autumn 2009 issue of Journal of Validation Technology


Selecting and Partnering with a Vendor for a Qualified Software Product
Sharon Strause

Computer Validation Forum discusses topics associated with computer validation in order to provide useful resources for daily work applications. It brings information regarding regulatory requirements for the validation and qualification of computerized systems.
Reader questions, comments, and suggestions are required to fulfill the objective for this column. Please send your comments to column coordinator Sharon Strause at sastrause@aol.com or to journal coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• Benefits, drawbacks, and concerns of developing software code in-house versus outside contractors are discussed
• Main objectives of vendor management for software development are discussed
• Types of vendor audits and their appropriate use are described
• Audit preparation is a key component to formulating a plan and maintaining control of a vendor project.

INTRODUCTION
The software vendor market is highly competitive in today's world. What decisions do you need to consider in choosing the right software developer for your company's needs without sacrificing the compliance required by the US Food and Drug Administration for computer system validation and supplier (vendor) management? This is just one of many questions that you need to consider when selecting a vendor for either a software package that is customizable to your requirements or choosing to write the code to your requirements in-house.
This article explores at a high level the questions and decisions that you will need to make during the process of software development by an outside vendor or as a software development project utilizing your in-house programmers.

IN-HOUSE OR OUTSIDE VENDOR
Let's begin by exploring the benefits, drawbacks, and concerns of developing software code in-house versus contracting for these services from an outside vendor.

Benefits of In-House Development
In-house development provides the following:
• Defined policies
• Standard operating procedures
• Guidelines
• Accountability to senior management for the project
• Resources and budget
• Available personnel with a knowledge of the business
• Team approach for the project.

Drawbacks of In-House Development
Drawbacks of in-house development include:
• Personnel may not be available, especially if the programming staff is smaller and focused on a particular programming language
• The technical expertise required for the project may not be available
• Resources may not be used efficiently
• Long-term maintenance issue (this should be addressed carefully).

ABOUT THE AUTHOR
Sharon Strause is a senior consultant at EduQuest, Inc. working in quality assurance compliance and computer system validation. She may be reached by e-mail at SharonStrause@EduQuest.net. For more author information, go to gxpandjvt.com/bios.


Benefits of Software Development by a Vendor
Using an outside vendor can provide the following benefits:
• Internal resource availability
• Technical matter experts' availability
• Experience with multiple implementation approaches
• Expertise and knowledge.

Drawbacks of Software Development by a Vendor
The following are some drawbacks to using a vendor for software development:
• Not accountable to company management (just what is in the contract)
• Delays due to communication, lack of knowledge of company policies and procedures, conflicts with their own policies and procedures, or lack of knowledge of company operations
• Budget and resources may be fixed (dependent on contract terms)
• Team approach may not be evident (i.e., "we" versus "them").

Company Concerns with Vendor
Companies do have concerns with vendors that need to be addressed as a part of any contract, but they also play a role in determining whether an outside vendor will be utilized. Concerns include the following:
• Determining that the vendor has the personnel with the expertise required
• How will the project be communicated so that all parties understand their roles?
• Can the vendor work independently or will they require constant communication?
• Can the vendor deliver a functioning system within the time and budget and meet all the internal quality assurance (QA) standards required for the project?
• What about the accountability level?

Vendor Concerns with the Company
A vendor may have its own concerns, as follows:
• Can the project be completed on time and meet the terms of the contract?
• Who will coordinate the plan and keep work in the pipeline, assuring that procedures, guidelines, and regulatory requirements are met?
• A vendor has multiple clients and must be able to service all of them, which means the vendor needs to be flexible to meet different standards and requirements as well as regulatory expectations.

OBJECTIVES FOR UTILIZING A VENDOR FOR SOFTWARE DEVELOPMENT
Now that we've seen both sides of the argument for software development, let's determine what actions would be necessary to utilize an outside vendor for software development, rather than doing that work in-house.
The following are four main objectives of vendor management for software:
• Selecting the right vendor
• Working with the chosen vendor
• Keeping control of a software development project (who does what?)
• Developing a vendor partnership.

Selecting the Right Vendor
There are a few ways to approach finding a qualified vendor. First, check within your own company to see what other vendors have been utilized and the lessons learned from that company's contract. Second, check with affiliate organizations, like the American Society for Quality or the Parenteral Drug Association. Third, you can use industry networking resources or industry publications and journals. Fourth, you can ask other vendors for recommendations.
Once you've chosen the vendor, it's time to audit the vendor. The following are three types of audits that you should use with a vendor:
• Pre-selection audit. This audit determines who the vendor will be based on a set of criteria.
• In-process audit. This audit determines how the contact, communication, and coding are proceeding.
• Post-development audit. This audit determines maintenance requirements.

Working with the Chosen Vendor
Once a vendor has been chosen, a contract should be developed between the vendor and the company. The contract should include the following:
• The contract should be formal and signed before the work starts. Usually this is a normal function of the vendor management process and purchasing control.
• The contract should have terms and conditions (i.e., type of service, identification of deliverables and associated timelines, requirements for personnel, requirements for documentation, quality and regulatory requirements, etc.).
• The contract should have a section on distribution of work (i.e., company and vendor and associated personnel at each).
• The contract should have quality checkpoints. These could be the in-process audits or documentation deliverables.
• The contract should have a cost and payment schedule.

Key vendor deliverables established as a part of the contract include the following:


• Design and development documentation. If the company is going to do the maintenance of the system, this will be critical. If the vendor were doing the maintenance, this documentation would be part of the in-process and post-development audit review.
• Test plans and results documentation. The vendor would retain this, and it would be reviewed by the company during the in-process audit reviews and any post-development audit reviews.
• System and user manuals with release notes and quality program documentation.
• Training plan and materials. This would be developed with the company.
• Knowledge-transfer process (if maintenance is going to be the responsibility of the company).

Keeping Control of a Software Development Project
It is important for the company to keep control at all times during the development project. If a good contract has been completed, this should be easy. If not, you will have missing or incomplete documentation; varying quality standards on the code itself as well as the deliverables; more in-house work will be required; and there will be some hostility between the vendor and the company because of missed deadlines, missing functionality, and a system over budget. Good project management is key to keeping control in a software development project.

Developing a Vendor Partnership
Developing a vendor partnership provides leveraging opportunities for the company. There can be shared work between vendor personnel and company personnel. The company doesn't have to do extra work when the project is delivered by the vendor. The vendor does what they do best and the company does the same. Both will develop a common language for terminology and deliverables. There will be an inter-dependent work relationship, a reduction in the time needed for a project, and a reduction in the cost required for a project.
Preferred providers give support and development resources on a continuing basis. The vendor will learn the company's environment in order to better understand the company's business requirements. All lessons learned can be applied to future projects and especially for on-going support of the projects. A preferred provider means that the company gets first priority for vendor resources and a more consistent look and feel to the information management systems being utilized by the company, which is always helpful in the regulatory environment that everyone operates in today. There should be better compatibility of the systems so that the enterprise works well and efficiently.

AUDIT REQUIREMENTS AND PREPARATION
The following are three types of audits that you should use with a vendor:
• A pre-selection audit determines who the vendor will be, based on a set of criteria
• An in-process audit monitors the contact, communication, and coding of the project
• A post-development audit determines maintenance requirements.

Preparation for All Audits
Preparation for audits is a key component to formulating a plan and maintaining control of an audit. Preparation for the three types of audits will be similar. It begins with a schedule, establishing an agenda, a date and time for the audit, the personnel involved in the audit, the requirements for review, and the audit results following completion of the audit. The requirements for review will change with each type of audit. Audits should show that the vendor is operating in a quality manner and that project deliverables are complete and accurate.
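Before turning to the individual audit types below, here is a minimal sketch of how the preparation elements just listed (schedule, agenda, date and time, personnel, requirements for review, results) might be captured as a reusable checklist. The class, field names, and sample values are illustrative assumptions only, not part of the article.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AuditPlan:
    audit_type: str                 # "pre-selection", "in-process", or "post-development"
    vendor: str
    date: str
    agenda: List[str] = field(default_factory=list)
    auditors: List[str] = field(default_factory=list)       # e.g., QA lead, IT specialist, business user
    review_requirements: List[str] = field(default_factory=list)  # changes with each audit type
    results: str = ""               # completed after the audit

plan = AuditPlan(audit_type="pre-selection",
                 vendor="Acme Software (hypothetical)",
                 date="2010-03-15",
                 agenda=["Quality system overview", "Development methodology walkthrough"],
                 auditors=["QA lead", "IT technical specialist"],
                 review_requirements=["Quality manual", "SDLC procedures", "Resumes and training records"])
print(plan.audit_type, plan.vendor, len(plan.review_requirements))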


Pre-Selection Audit
This audit is a critical one because it should reveal the most important areas for the company to understand regarding the usage of this vendor. It should include questions on the following:
• Vendor stability, both financial and the number of years working in the industry
• An organization chart should be requested to see where quality fits into the vendor's management structure. Is quality a separate department or a function of one of the managers?
• Procedures covering the software development lifecycle, quality manual, quality policy, disaster recovery, document management, etc. (reviewed and assessed against the company's procedures and regulatory requirements)
• Development methodology. How does development occur? What safeguards are in place to ensure sections of code are secure? What types of testing are completed?
• Measurement systems should be reviewed (i.e., customer issues, bug fixes, etc.)
• Resource availability and technical expertise. Does the vendor have enough personnel to do your project as well as others in the timelines that you require? Can you review resumes of the personnel to see what types of education and years of experience the developers of the vendor have?
• Training of the personnel. Is any regulatory training included?
• Industry knowledge or your company's specific knowledge (you must understand what you might need to train)
• Will they fit with your company? Can you in your discussions determine whether open communication will be possible and factual information pass between the company and the vendor?

In-Process Audits
These audits are performed during the process of development. Depending on the criticality of the development, more than one audit may occur. These audits have a QA person to lead and conduct the audit, a technical specialist from the information technology department to review the technical issues, and a business person for the user needs of the code being developed. The following should be considered during an in-process audit:
• Review the deliverables for the project and any corrections. Are you staying on the established schedule or must negotiation take place? Is the documentation in place? Does the code demonstrate the user requirements?

Post-Development Audits
These audits are usually completed after the software code project has been delivered and is in place in the company. They are usually a result of either enhancement or changes that need to be completed for the code or if the vendor is providing the support of the system.
If the vendor becomes a valued partner, the audit would take place as a part of the company's audit schedule for vendor management.

CONCLUSION
Developing a partnership with a vendor begins by selecting a qualified vendor that is determined by a pre-selection audit to ensure that the vendor is capable of providing the services you require. A partnership establishes expectations for both parties; examines methodologies for differences; identifies specific deliverables required in the contract; identifies the roles and responsibility for the work and key milestones; reviews checkpoints with key contacts for communication; and allows time for implementation, validation, and review.
Most important, however, is that the company and vendor treat each other as a valued partner working toward a common goal: a quality, regulatory-secure software development project.

REFERENCES
FDA, 21 CFR 11, Electronic Records, Electronic Signatures, 62 Federal Register 13464, March 20, 1997.
FDA, 21 CFR 210, Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs: General, 43 Federal Register 45076, September 29, 1978.
FDA, 21 CFR 211, Current Good Manufacturing Practice for Finished Pharmaceuticals, 43 Federal Register 45077, September 29, 1978.
FDA, 21 CFR 820, Quality System Regulation, 61 Federal Register 52654, October 7, 1996. JVT

Originally published in the Summer 2010 issue of Journal of Validation Technology


Information Security: A Critical Business Function
Robert Smith

Computer Systems Quality and Compliance discusses practical aspects of computer systems and provides useful information to compliance professionals. We
intend this column to be a relevant resource for daily work applications.
Reader comments, questions, and suggestions are needed to help us fulfill our
objective for this column. Case studies illustrating computer systems quality and
compliance issues by readers are most welcome. Please send your comments and
suggestions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com
or journal coordinating editor Susan Haigney at shaigney@advanstar.com.

INTRODUCTION
Businesses function in an electronic world where potentially sensitive
information and data are stored on computers and networks. These same
networks may be vulnerable to attacks that could result in corruption of data
or loss of property. Information security should be an important part of any
business practice.
This article describes a hypothetical breach of computer security. It
describes how easily a corporate computer system may be accessed, both by
unauthorized internal personnel and by an outside hacker. The results of
such a breach may be disastrous. And it may be surprising how easily these
breaches can be accomplished. Suggestions for preventing these types of
problems are provided.

A BUSINESS NIGHTMARE
NewGen43 is a (hypothetical) pharmaceutical biotech company located in the US. Sally is NewGen43's complaint-handling lead. NewGen43's management has become concerned about late medical device reports (MDRs). Sally has been with NewGen43 for 15 years and she likes it there. Sally is not sure why new management was brought in for her department. As far as she can tell, she was doing fine. She thinks her new boss is giving her, and her team, an incredibly hard time over late MDRs; she had even been given a warning! Sally is troubled.
SoftBio Systems is a software development company located in Estonia. The company was founded and is led by Gunter. Their primary customer is TopBioPharma, a competitor to NewGen43. Gunter and his coworkers in Estonia are making more money than they have ever imagined. The deal with TopBioPharma has been incredibly lucrative for them. When they set up shop, they never imagined five years of development work setting up the software for clinical trials would follow. By Estonian standards they were rich. So Gunter and his team were distraught when their contact at TopBioPharma called to say they were terminating their contract with SoftBio Systems. It seemed that NewGen43, a rival to TopBioPharma, was expected to complete its final trial and would have compelling outcomes that would give TopBioPharma poor prospects at best. Gunter asked when the submission was planned: about four months, he was told.


A Well-Intentioned Insider
Bob, Sally's husband, could not believe that after 15 years Sally's job, or at least the raise they were counting on for that new RV, was at stake. He asked her to explain the problem she was having at work. She said, "It's the software, it's very hard to use. My team wants to do a good job, but the system is old and has all these rules we have to follow. That's why we keep filing late." Bob was a pretty good software developer and had some experience with software like the complaint system Sally used. They decided after dinner Sally would log in remotely (VPN) and give Bob a demo. She had to fix a complaint anyway. Sally showed Bob the system as she worked on the complaint. Then she said, "Look, now I have to log out and back in to change roles." Bob laughed and pointed at the URL; it read:

http://complaints.newgen43.com/process/ref=complaint45689?role=supervisor

Bob said, "Log back out and back in like you did before." Sally did. Now the URL read:

http://complaints.newgen43.com/process/ref=complaint45689?role=handler

Bob pointed out the role in the URL. "See," Bob said, "the system is granting you permissions based on what it sees in the URL." Bob asked Sally what the other roles were. Sally said, "I'd love to be admin, they can fix almost anything." "Let's try editing the URL," suggested Bob. Sally edited it to read:

http://complaints.newgen43.com/process/ref=complaint45689?role=admin

The administration screen appeared. Sally could not believe her eyes. She threw her arms around Bob and said, "I can fix anything now! I can easily hit the MDR on-time targets now." Bob said, "Don't get carried away!" But Sally could see that her team was on its way.
At first Sally felt a little guilty changing the filing dates using the administrator access. She started looking for a new job, just in case. She was not sure she could keep changing the dates. But slowly it got easier. Her boss praised her and her team. She said Sally and her team were role models, that other teams could do the same. She even got a small bonus. Sally and Bob bought that new RV. There was no way she could stop now.

Trouble from the Outside: Attack
Meanwhile in Estonia, Gunter and crew were plotting. They decided that they had nothing to lose. The global economy was bad and the chance of finding another lucrative contract soon was nil. So they decided to try to sabotage NewGen43's trial so they could keep their contract with TopBioPharma. Using social business networking sites they developed a list of people they could target to get them into NewGen43, and in particular they targeted a number of people that were looking for jobs. The next step of their attack was to get some of those people to compromise their work computers. It was easy to cull a list of work e-mail addresses. They sent each person a carefully worded recruiting e-mail.
One of these e-mails found its way to Sally. She was already feeling pretty good about her new status at NewGen43. But what was the harm in looking at the "Top Tier Pharma Company seeks Senior Manager, top salary + bonus + signing bonus" job posting? She clicked on the link and provided all the information they asked for. After all, if her profile was selected she would get a free iPod Touch. She was a little irked by the security warning that kept popping up, but she wanted to complete that profile. She was relieved when she was done; maybe she would get the iPod!
Gunter exclaimed, "We got one!" His team went to work. They installed software that would allow them to control Sally's computer on the iPod and on a CD they sent as well. They had it packaged and on its way that day.
A little over a week later Sally got a nice letter congratulating her on her accomplishments and thanking her for submitting her profile. She immediately plugged in the new iPod and installed the software.
Later that day Gunter was scanning NewGen43's network. He found Sally's system secure, but was able to install a hacking tool to infect other systems. He compromised a few systems, but in general NewGen43's IT team had done a good job. Then Gunter noticed something he could not believe. Sally was suddenly connected to the complaint management system as Admin. He used that connection to connect to the complaint database. He quickly learned the complaint system had a programmatic link to the clinical system. He pulled that code back to his system. Then he used the database's command shell to infect the database server with his own remote access. He set it to call him over a standard web port every few hours.
Castle keys. Gunter reverse-engineered the code he pulled back and found NewGen43's clinical system login ID and password. It took him a few hours, but he wrote some interesting programs. The first was to change the code in the complaint system that talked to the clinical system to insert small random errors as well as insert bogus complaints, tricking the clinical system into thinking that there were additional failures. These changes were subtle. His goal, after all, was to derail approval by corrupting the trial data.
Subtle manipulation. Next, he added a program to the clinical database that made small but insidious changes. His intent here was to do a small amount of damage over the next several weeks to months.


His program would change certain key data randomly, but viably, so as not to be immediately detected. He knew what he was doing and what results would disrupt the trial. So slowly the trial population's blood pressure dropped, pulse rates went up, blood iron levels rose, and so did HDL.
Two months passed and TopBioPharma called Gunter. "We are going to keep the program going. NewGen43 just pulled out of a conference where they were going to present their trial results, and our R&D team decided to start the next trial. We'll send you a purchase order for the next phase of the project." Gunter was happy, but not greedy.
Erasing his tracks. Gunter quickly connected back in to NewGen43. He deleted his programs and cleaned up as best he could, but knew a few traces would be left behind. He then inserted a common virus that had a payload that would encrypt the disk, including the database. This would also secure any local evidence of his programs' tampering. He knew the company would restore from a backup, but that didn't matter, as long as he could erase his tracks.
Next, he used the other systems he compromised to launch a widespread attack inside the NewGen43 network, installing a common botnet (a way for external hackers to control computers that are not theirs). He did this so that any investigation would point to a run-of-the-mill compromise of the system and not trigger any alarms.
Finally, he backed out of the complaint system and infected Sally's computer with a destructive virus, knowing the IT staff would baseline the system (erase the disk and install all new software), thus covering his last probable track.
NewGen43's IT staff responded quickly to the virus outbreaks, cleaning the infected systems. They saw iTunes on Sally's computer, and she told them she won the iPod in a contest. They found the infection on it, but saw it just installed a remote control program that looked like the others they had been dealing with. They cleaned it for her and gave it back with a warning not to install unapproved software in the future.

Trying to Recover
Over the next weeks, NewGen43's clinical and regulatory teams realized something had gone very wrong. They kept restoring older and older versions of the data, but could not piece enough data together to confidently proceed. They had an electronic system and scraps of paper that could be used to see that some data was wrong. But other data was right!
NewGen43's stock dropped 22% upon the news that they would restart their trial. It dropped another 10% when word spread that the US Food and Drug Administration was auditing them for inconsistencies in their MDR filing practices.

SO WHAT HAPPENED?
Every element of the story presented in this article is completely plausible using off-the-internet hacking tools. The imaginary Gunter is not a top computer scientist. In fact, the skill to perform this attack would be considered moderate to advanced intermediate. So what happened?
We have all learned from television and big-screen crime dramas that we need a motive. In this case there are two key motives. First, Sally's motive: she just wanted to keep her job. She loves her company and her job. She just had a clash with a new manager over a few percentage points on late MDRs. She never intended to hurt the company; she was just scared.
Gunter was a reputable software consultant who had no idea how, in this bad economy, he would replace the kind of lucrative contract TopBioPharma represented. He knew that TopBioPharma made good products and he was sure people would be just as well off with TopBioPharma's drug vs. the product made by NewGen43. In his mind, patients weren't hurt, he kept his contract, and TopBioPharma stayed a lucrative customer.
Could this scenario happen at your company? Do you have an employee that values their job? Could they, for what they think are innocent reasons, take advantage of a vulnerability in a system to help them keep that job, to get a raise, or get a bonus? Is there a Sally in your organization?
Do you have a supplier or contractor similar to SoftBio Systems? Does one of your competitors? Is there someone who depends on a revenue stream that is large enough to induce them to attack you? Keep in mind that governments are compromised for what amounts to trinkets and pocket change.

Taking Advantage of a Security Weakness
Bob, Sally's husband, had some skill and he knew enough to exploit a weakness in NewGen43's complaint system. The method Bob used (URL tampering and altering unsecured security information) is not esoteric. This type of attack is on the top 10 vulnerability lists of two security organizations. The attack Bob used is really two issues in one. By setting the role in the URL, the application did not sufficiently protect credentials. Also, the URL alteration is a type of web parameter tampering. How does this happen? For most organizations, developing software that works at all is hard, and developing secure software is even more difficult. There is evidence of this everywhere. That is how Sally became an admin. The administrator connection is what gave Gunter the access to compromise the complaint system and from there, the clinical system. Keep in mind Gunter did not care at all about the complaint system. It was only a way to gain access to the clinical system. It was the complaint system that gave him the key to his damaging attack.
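To make the weakness in the story concrete, the sketch below contrasts the two patterns: trusting a role carried in the URL versus looking the role up server-side from the authenticated session. The function names, role values, and user store are hypothetical illustrations, not code from the article or from any real system.

# Vulnerable pattern (simplified): the server trusts a role carried in the URL,
# e.g. .../process?ref=complaint45689&role=admin, so anyone who edits the query
# string is granted that role.
def get_role_insecure(query_params: dict) -> str:
    return query_params.get("role", "handler")

# Safer pattern: the role is looked up server-side from the authenticated user's
# account; nothing the browser sends can elevate it.
SERVER_SIDE_ROLES = {"sally": "handler"}   # illustrative user store

def get_role_secure(session_user: str) -> str:
    return SERVER_SIDE_ROLES.get(session_user, "none")

print(get_role_insecure({"ref": "complaint45689", "role": "admin"}))  # "admin" - escalation
print(get_role_secure("sally"))                                       # "handler"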


Social Engineering: The Attack Method
The iPod trick is one example of social engineering. We read about these attacks all the time. Social engineering attacks range from a thief getting a kind person to hold the door open while they carry out an armful of laptops, to a person in a uniform standing in front of an ATM and taking deposits because the ATM is down; many people just hand over the envelope.
Hacker turned security researcher Kevin Mitnick is famous for his social engineering skills. In his book, The Art of Deception (1), Mitnick states, "Social engineering uses influence and persuasion to deceive people by convincing them that the social engineer is someone he isn't, or by manipulation. As a result, the social engineer is able to take advantage of people to obtain information with or without the use of technology."

Vulnerabilities
In the scenario presented in this article, the complaint system software held the login ID and password for the clinical system. Far-fetched? No. The hard-coded credentials problem is also on the list of top 10 vulnerabilities. This happens all the time. It is easy to just stuff credentials in-line with code; this is called hard coding. It takes zero extra lines of code to do this. To make credentials secure and configurable is a lot more work, maybe 100 times more by the time all the scenarios are tested. If credentials are hard coded, over time this weakness gets worse and worse as more people and more systems gain access to those hard-coded, never-changing credentials.
The next point in our fable relies on another top 10 vulnerability: elevated privileges. Developers like to run at the highest privilege level. It's sort of like having the keys to the castle: no worries, we can go wherever we want. But good security requires the opposite: least privilege. Least privilege means only the absolute minimum to do the one thing the program needs to do at that moment. It's hard to develop and hard to test. It takes time and costs money, hence its persistent presence on the top 10 list. But this is precisely how Gunter gets unrestricted access to the complaint system.
Botnets and command and control may sound like something that is cutting edge and difficult. But it is really easy to use. There are lots of websites that offer the software that any competent administrator or programmer can easily use right off the shelf. But it gets better. There are hackers who will build you whatever you like for $50 to $250! In May 2009, Wired.com (2) reported that there are bots active on 12 million IP addresses. (An IP address, or Internet Protocol address, is like the phone number for your computer system.) By trailing his attack with common botnet and virus droppings, our fictitious Gunter covered his tracks. The IT staff erased all the evidence for him.
By the time the clinical team realized their data was bad due to Gunter's slow, careful corruption, they had no way to prepare a trial submission.
The Wired.com posting (2) also reported that the University of California at Santa Barbara observed one botnet, Torpig, for 10 days and observed 70 gigabytes of data being stolen from computers remotely controlled by the botnet, including financial data. The harvested data included 1.2 million Windows passwords and over 1 million e-mail items, such as e-mail addresses and login credentials.
Wired.com quotes the University report (2) as stating, "In ten days, Torpig obtained the credentials of 8,310 accounts at 410 different [financial] institutions." The researchers continued, "The top targeted institutions were PayPal (1,770 accounts), Poste Italiane (765), Capital One (314), E*Trade (304), and Chase (217)."
The lesson here is to not underestimate the ease of these attacks or how simply an IT team could mistake a targeted attack (what Gunter did) for a run-of-the-mill botnet attack. Gunter was clever and used social engineering on the IT team, tricking them into thinking they were fighting a botnet and using a common erase-and-replace strategy, thus covering his tracks.
Now some astute readers might point out that there are products and techniques to thwart these attacks. They are right, but in this author's experience those are rarely deployed and staffed by sufficiently trained personnel to be consistently effective. A proof point is that governments, banks, and financial institutions that do have highly competent technical staffs and great tools still have determined attackers that get through their defenses.
It is important that readers understand business risk and the value of information security. It is easy to break in. It is easy to compromise systems. It is really easy to social engineer people. Do you understand these risks, or do your advisors? Have you mitigated those risks? Does Sally work somewhere in your organization? Does Gunter work for a competitor? Are you sure?
The good guys have to protect all possible points of attack. The bad guys (even well-intentioned ones) need only find one unprotected or inadequately protected point to get in. Once in, for most organizations, it's game over.

RECOMMENDATIONS
System-Administrator, Audit, Network Security (SANS) is a globally trusted source for information security training, certification, and research that recommends protecting your organization with approaches called defensive walls (3). The following is a brief explanation of each wall that will help create awareness of what a comprehensive program looks like.


Defensive Wall 1: Proactive Software Assurance
This level of defense relates to the following:
• How software is developed
• How software is tested
• How software is evaluated
• Security skills and training for your developers and testers.

This is often the most difficult as it affects the system most profoundly when the system is envisioned and designed. It is exceptionally difficult to add this wall later.

Defensive Wall 2: Blocking Attacks
This level of defense focuses on the following tools that aid in preventing and detecting suspicious activity:
• Intrusion prevention (IPS) and intrusion detection (IDS)
• Wireless intrusion prevention (WIPS)
• Network behavior analysis
• Network monitoring.

Wall 2 also includes the following tools:
• Firewalls
• Enterprise antivirus
• Unified threat management
• Secure web gateways
• Secure messaging gateways
• Anti-spam tools
• Web application firewalls
• Managed security services.

Defensive Wall 3: Blocking on the System Under Attack
Defensive wall 3 includes tools like the following:
• Endpoint security
• Network access control (NAC)
• System integrity checking
• Application control
• Configuration hardening.

Defensive Wall 4: Eliminating Security Vulnerabilities
This wall includes the following tools:
• Network discovery
• Vulnerability management
• Penetration testing and ethical hacking
• Patch and security configuration management
• Compliance with your organization's security policies.

Defensive Wall 5: Safely Supporting Authorized Users
This wall includes tools that support the following:
• Identity and access management
• Mobile data protection and storage encryption
• Storage and backup encryption
• Content monitoring/data leak prevention
• Digital rights management
• Virtual private networks (VPNs).

Defensive Wall 6: Tools to Manage Security and Maximize Effectiveness
Defensive wall 6 includes tools for the following:
• Log management
• Security information and event management
• Media sanitization
• Mobile device recovery and erasure
• Security skills development
• Security awareness training
• Forensics tools
• Governance, risk, and compliance management
• Disaster recovery and business continuity.

Most organizations in the life sciences arena will need some elements from all the defensive walls. This is a principle called defense in depth. Organizations with highly valuable data or critical processes controlled by computers will need most elements from all the walls. Organizations with low value data and no critical system will not need them all, because the cost of protecting the assets would exceed their value. The key to protecting your information and systems assets is to classify them, understand your organization's appetite for risk, and then take steps that adequately and cost effectively protect the asset.
All the defensive walls are important. How strong to make each will vary based on the risk and the value. Based on this author's experience, the one thing that people have a hard time understanding is that otherwise good people can do bad things. These people are both, as our story indicates, on the inside and on the outside of the organization. For many, this may be too much to digest. It would be in a company's best interest to hire a consultant with practical experience and ideas for assessing your tolerance for risk and mapping it to a sensible approach to information security. The author recommends that the consultant be a Certified Information System Security Professional. This is the most difficult credential to earn. Global Information Assurance Certification is an acceptable credential too. But seek a certified professional who can give clear examples of how they can be practical. You do not want to spend $50,000 protecting a $5,000 asset.
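As a rough illustration of the classify-and-protect point above, the sketch below screens a proposed control against the value at risk. The formula, numbers, and threshold are illustrative assumptions only, not guidance from SANS or from the author.

def control_is_justified(asset_value: float, likelihood: float, control_cost: float) -> bool:
    """Very rough screen: spend on a control only if the expected loss exceeds its cost."""
    expected_loss = asset_value * likelihood   # simple annualized-loss style estimate
    return control_cost < expected_loss

# Example echoing the article's point: a $50,000 control for a $5,000 asset is not
# justified even if compromise is certain; for a high-value asset it may well be.
print(control_is_justified(asset_value=5_000, likelihood=1.0, control_cost=50_000))    # False
print(control_is_justified(asset_value=500_000, likelihood=0.2, control_cost=50_000))  # True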


REFERENCES
1. Mitnick, Kevin D. and Simon, William L., The Art of Deception: Controlling the Human Element of Security, Wiley, October 2002.
2. Zetter, Kim, "Botnets Took Control of 12 Million New IPs this Year," Wired.com, May 5, 2009, http://www.wired.com/threatlevel/2009/05/botnets-took-control-of-12-million-new-ips-this-year/, accessed 9/14/09.
3. SANS, What Works in Internet Security, http://www.sans.org/whatworks/, accessed on 9/14/09.

RECOMMENDED READING
The author recommends the book, Geekonomics, The True Cost of Insecure Software, by David Rice as a great example of how widespread this problem is today. It is written for a broad audience, not a technical one. Mr. Rice offers compelling examples of the state and scope of the overall information security landscape. http://www.geekonomicsbook.com/. GXP

GLOSSARY
Command Shell. A shell is a piece of software that provides an interface for users. Shells generally fall into one of two categories: command-line and graphical. Command-line shells provide a command-line interface (CLI) to the system. Users type key words and symbols to get the command shell to perform tasks.
Medical Device Report (MDR). An FDA required report. These reports are always important, but at times can be critical to meeting regulations and agency goals of protecting the public.
Port (web port). A number, like an extension to a main phone number, used for two devices to connect. When you connect to a website you generally do so on Port 80. There are potentially thousands of ports on each system. Smart attackers usually use well-known and popular ports. They are busy and thus hide suspicious activity well.
Uniform Resource Locator (URL). Also known as a web address. www.google.com is an example.
Virtual Private Network (VPN). A network inside a network that is created for a private use. A VPN rides on some existing infrastructure (like wires) but has been secured so it is private.

ABOUT THE AUTHOR
Robert Smith is an application technical lead responsible for quality systems software development at Abbott Vascular. Robert has 25 years of software development experience including VC start-ups funded by The Mayfield Fund, Granite Capital, and Wasatch Venture Fund, and holds CISSP and PMP credentials. Robert can be reached by e-mail at robert.smithii@av.abbott.com.
Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau can be reached at Barbara.nollau@av.abbott.com.

Originally published in the Autumn 2009 issue of Journal of GXP Compliance



Disaster Recovery and Business Continuity
Barbara Nollau

Computer Systems Quality and Compliance discusses practical aspects of computer systems and provides useful information to compliance professionals. We intend this column to be a relevant resource for daily work applications. Reader comments, questions, and suggestions are needed to help us fulfill our objective for this column. Suggestions for future discussion topics or questions to be addressed are requested. Case studies illustrating computer systems quality and compliance issues by readers are also most welcome. Please send your comments and suggestions to column coordinator Barbara Nollau at barbara.nollau@av.abbott.com or coordinating editor Susan Haigney at shaigney@advanstar.com.

KEY POINTS
The following key points are discussed in this article:
• In today's environment of technology and automation, it is important to understand disaster recovery (DR), business continuity (BC), and contingency plans (CP) and how they all work together to ensure continuity and integrity of systems and availability of data and records
• System owners and technology professionals should understand how these plans should be developed and when/how to exercise them
• Having a DR plan in place is important to the compliance of computer system validation and Part 11 for regulated systems
• The DR team and the associated roles and responsibilities should be clearly defined and understood
• Disaster identification, notification and coordination processes, communication plans, alternate computing facilities management, return to normal operations, plan testing, and maintenance procedures are all required elements of a robust DR program
• Minimally, a company should have a functional plan that addresses all of the processes required to restore technology, an individual responsible for that plan, and a disaster response team at the ready.
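
The restoration-priority triage discussed later in this article (under "Disaster Recovery Plans") lends itself to a simple illustration. The sketch below is purely hypothetical: the system names, recovery-time figures, and the restoration_order helper are illustrative only and are not part of the original column; a real DR plan would also weigh workaround sustainability, hot-site constraints, and business priorities.

```python
from graphlib import TopologicalSorter

# Hypothetical inventory: system -> (recovery time objective in hours, prerequisite systems).
inventory = {
    "network":       (4,  []),
    "erp":           (8,  ["network"]),
    "email":         (24, ["network"]),
    "document_mgmt": (48, ["network", "erp"]),
}

def restoration_order(systems):
    """Return a restoration sequence: prerequisites first, shorter RTOs before longer ones."""
    deps = {name: set(prereqs) for name, (_, prereqs) in systems.items()}
    sorter = TopologicalSorter(deps)
    sorter.prepare()
    order = []
    while sorter.is_active():
        # Among systems whose prerequisites are already restored, recover the most urgent first.
        for name in sorted(sorter.get_ready(), key=lambda n: systems[n][0]):
            order.append(name)
            sorter.done(name)
    return order

if __name__ == "__main__":
    print(restoration_order(inventory))
    # -> ['network', 'erp', 'email', 'document_mgmt']
```

In this hypothetical example the network is restored first because every other system depends on it, and ties among ready systems are broken by the shorter recovery time objective.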

INTRODUCTION
I attended a disaster recovery conference a while back, and one of the speakers said, "If you want to see how real experts plan disaster recovery, go to Puerto Rico. Why? Look at the number of hurricanes they deal with on an annual basis. They'd better know what they are doing from a disaster recovery standpoint!" I never forgot that statement, and I've been interested in best practice relative to disaster recovery ever since.
In this issue of the column, we will examine the terms disaster recovery, business continuity, and contingency planning. Understanding these terms and implementing these measures are important for the integrity and compliance of the systems we use. We will further explore the disaster recovery (DR) element to gain a deeper understanding of what is required.

THE MEANING OF DISASTER
Webster's defines the word disaster as "great distress, destruction, or misfortune." A disaster is an event that is catastrophic to the business, meaning people can't work, or even worse. An example of a disaster in this context is an earthquake that destroys an entire facility. A smaller event may also be considered a disaster in some cases, for example a fire in a data center that brings all computing capability in the company down. A disaster can be defined as any unplanned event that prevents an entire organization from functioning as intended or causes damage to people or facilities (e.g., fire, explosion, or extensive building damage).
A disaster can have a significant, direct impact on a firm's ability to continue business processing. There may be an inability to develop submissions or collect clinical trial data, delayed or limited ability to get information to the field or process sales data, or the inability to manufacture, pack, ship, or track product, samples, and promotional material. The ability to sustain time-sensitive processes such as payroll may also be hindered, affecting financial relationships. The enterprise may be unable to communicate internally or with customers, and there could be residual outcomes such as non-compliance with regulations and lack of alignment with a parent company and partners. Some of the effects of these outcomes are financial in nature (lost revenue from inability to ship product, loss of sales from delayed submissions, loss of worker productivity, or damaged credit rating from inability to pay bills). The company's reputation with customers, employees, partners, or other stakeholders may be damaged.
There is a difference between a disaster and an outage or fault, which is the temporary loss of some or all services (e.g., hard drive failure, power outage, loss of network connection). Localized system outages and brief periods of system downtime (i.e., a document control system down for a day or e-mail unavailable for several hours) are not considered disasters and are, therefore, treated differently, usually with simple contingency plans. What constitutes a true disaster for a company should be defined up front, including determining criteria. This must be understood ahead of time, so it is clear what conditions will lead to invocation of the DR plan. Depending on the magnitude of a disaster, invocation of the broader business continuity (BC) plan may or may not be warranted (DR and one or several functional area BCs may suffice). Disaster recovery is designed to recover from a true disaster, not an outage or fault.

ELEMENTS OF THE DR/BC/CP PROGRAM
Now that we have reviewed what constitutes a disaster and how that differs from an outage, we need to gain an understanding of the elements of a DR/BC/contingency plan (CP) program, how they work together, and for what conditions each element is used. The elements and hierarchy of the program are shown as follows (see Figure 1):
• Enterprise business continuity (EBC). A broad program that covers all aspects of the business (e.g., process, technical, physical, human, etc.). Focuses on keeping the business viable in the event of a disaster.
• Disaster recovery (DR). A program focused on technology recovery in the event of a disaster, an element of EBC.
• Functional business continuity (BC) plan. A functional area- or business area-specific plan focused on keeping business processes moving in the event of a disaster, an element of EBC.
• Contingency plans (CP) for system downtime. A functional area- or business area-specific process used as a workaround during non-disaster system outages, usually contained in an operating procedure.

Figure 1: Elements and hierarchy of a DR/BC/CP program. [Diagram: Enterprise Business Continuity is the broadest level and encompasses Disaster Recovery and Functional Business Continuity, with Short Term Contingency at the lowest level.]

The broadest level of BC (enterprise level) covers facilities, human resources, safety, equipment and furniture, communications (internal and external), and invocation of lower level plans. Disaster recovery is focused on the technology only and covers the recovery facility (on-site, hot site, or cold site), computer hardware, operating systems, networking, and other infrastructure, application software, databases, and records. Functional BC plans are lower level plans specific to a functional area or given business process. They are usually put in place for critical business processes and cover the manual workarounds to be used until technology is recovered. These workarounds may involve the use of log books, cell phones, hard copy documents, etc. in place of the technology that is unavailable. Finally, contingency plans for system downtime are similar to functional area business continuity; however, they cover localized outages only (e.g., one department, one system, etc.). They are usually feasible for short durations only, assume some sort of infrastructure being in place, and typically involve paper-based manual workarounds.
Developing and maintaining a tested DR/BC program is important to the computer validation process and to compliance with 21 CFR Part 11, Electronic Records and Signatures. A commonly accepted definition for validation is "establishing documented evidence which provides a high degree of assurance that a specific process [system] will consistently produce a product meeting its predetermined specifications and quality attributes." In order to address the "consistently" portion of the definition, the following should be verified as in place and tested as part of system validation:
• Disaster recovery plan
• Backup plan
• Business continuity plan
• Contingency plan for system downtime.

Maintaining Compliance
Another validation-related consideration is the maintenance of the validated state of regulated systems and infrastructure. In the case of a major disruption to service that requires restoration in a completely different environment and/or replacement of major components, measures must be taken to ensure the validated state of the system is maintained. A disaster, and subsequent DR, interrupts the qualified state of the IT infrastructure. Once the environment is restored, some level of re-qualification must be performed. The level of re-qualification should be based on risk (risk level of the affected system(s), level of change, and planned sustainability of change). The re-qualification criteria should be pre-determined and documented in the DR plan.
Requirements listed in 21 CFR Part 11 (1) that are related to, amongst other controls, DR are the ability to generate accurate and complete copies for review and inspection, and that records must be retrievable throughout the required retention time. In the case of a disaster, without a DR plan, we cannot say that we are able to produce accurate and complete copies or that records will be retrievable during that time.

DISASTER RECOVERY PLANS
A disaster recovery program is more than just how to restore systems and data. The plan must include disaster identification, notification and coordination processes, communication plans, alternate computing facilities management, processes to return to normal operations, and DR plan testing and maintenance procedures. It must be a functional plan that addresses all of the processes required to restore technology, and it must have a defined owner responsible for maintenance of the plan on an ongoing basis. A disaster response team must be identified and at the ready.
When developing the plan, it is important to determine the priority order of restoration across infrastructure and systems. One of the inputs to determining this is pre-requisite technology (e.g., the network must be restored before applications that rely on networked communications are restored). A second input is the required level of uptime for each system. Systems requiring 24/7/365 uptime will need to be restored before those that don't have such stringent uptime requirements. Another factor to consider is the sustainability of the defined workarounds (i.e., how long the manual workaround can realistically suffice without causing bigger problems such as unmanageable backlogs, etc.). The person developing the DR plan must collect this information about all technology elements, perform a triage activity, resolve any conflicts in the case of systems with dependencies or the same uptime requirements or conflicting priorities, and then determine the overall order of restoration required and document it in the plan. This information should be communicated back to the business area system owners so everyone is aligned with the planned order of restoration in the case of a disaster. This is important because recovery time expectations must be managed. Business area system owners whose systems are lower in recovery order must understand this fact and the drivers for that order.

DISASTER RESPONSE TEAM
The disaster response team must be identified ahead of time.
Roles, responsibilities, and backups must be defined, documented, and understood. Figure 2 shows an example DR team organization.

Figure 2: Example DR team organization. [Organization chart: the IT Management Lead, supported by staff functions (HR, Legal, Purchasing, Security, Facilities), oversees the IT Disaster Recovery Team Lead. Reporting or coordinating roles include the Hot Site Coordination Lead, Voice Recovery Lead, System Support, Connectivity & Facilities, Execution Platforms, Application Restore & Qualification, Application Support Team, Client Platform Team, and Operations Recovery Team, with the Recovery Execution Team and Platform 1/2/3 Recovery Teams performing restoration. Each role is staffed with a principal and a backup.]

Typical roles and responsibilities for personnel involved in DR are as follows.

DR Team Lead
The team leader's role and responsibilities include the following:
• Facilitates the disaster recovery process
• Ensures the workability of the plan by working through assigned teams
• Maintains and distributes the final copy of the plan
• Conducts impact studies
• Develops recovery strategies and response procedures
• Coordinates testing
• Monitors team response in actual disaster situations.

IT Management Lead
The IT management leader's role and responsibilities include the following:
• Assembles team leaders at the command center
• Places hot site on ALERT and makes formal disaster declaration
• Monitors the initial assessment activities
• Makes decision, based on initial assessment, to activate the DR plan and subsequent recovery teams
• Monitors the hot site recovery and the home site restoration efforts
• Establishes and ensures the receipt of updates from the hot site coordination team lead on a regular basis
• Keeps senior management informed of the progress of the recovery effort
• Facilitates planning for return to a new or repaired facility.

Hot Site Coordination Lead
The hot site coordination leader's role and responsibilities include the following:
• Assembles hot site coordination team members at the command center
• Briefs, organizes, schedules, and mobilizes all subordinate recovery teams
• Oversees the preparation and restoration activities of all hot site environments
• Coordinates the identification, retrieval, and distribution of all off-site disaster recovery backup tapes and vital records
• Updates the IT management lead of restoration progress on a regular basis
• Receives and responds to restoration progress reports from all associated recovery teams
• Assists with planning for return to a new or repaired facility.
Platform Recovery Team(s)
The platform recovery team's role and responsibilities include:
• Confirms the given platform (e.g., Unix, Windows, etc.) required hardware inventory at the hot site
• Updates the execution platforms lead on a regular basis
• Oversees and verifies the proper restoration of the given platform environment
• Ensures the execution of any required qualification for the given platform.

Application Restore and Qualification Team
The application restore and qualification team's role and responsibilities include:
• Coordinates recovery of applications in accordance with enterprise recovery prioritizations
• Verifies the integrity and accuracy of the restored critical application files
• Determines and coordinates the steps necessary to update and synchronize the restored files to their status as of the disaster occurrence
• Determines status of work-in-process at the time of the interruption
• Provides centralized coordination for all departmental unit concerns and processing requests
• Provides application-related assistance and staffing, if needed, to the other teams during the recovery period
• Communicates ongoing application changes to the computer operations team for evaluation of the impact on the contracted hot site recovery location
• Serves as the liaison between the IT organization and the application support teams for the recovery efforts
• Ensures the execution of any required application qualification.

Connectivity and Facilities Team
The role and responsibilities of the connectivity and facilities team include the following:
• Provides guidance and oversight to the voice recovery team
• Provides guidance and oversight to the recovery execution team in relation to connectivity restoration
• Ensures the completion of any required platform qualification
• Provides regular updates on progress of voice and connectivity recovery activities to the IT management team
• Assists in the planning for return to a new or repaired facility.

Execution Platforms Team
The role and responsibilities of the execution platforms team include the following:
• Coordinates the activities of the platform-specific recovery teams
• Reports the status of recovery activities to the IT management lead
• Assists in the planning for return to a new or repaired facility.

Client Platform Team
The client platform team's role and responsibilities include the following:
• Coordinates the acquisition of client device components as needed to recover and return to normal state
• Reports the status of recovery activities to the IT management lead
• Assists in the planning for return to a new or repaired facility.

Recovery Execution Team
The recovery execution team's role and responsibilities include the following:
• Obtains the appropriate backup tapes from the hot site coordination team
• Performs the restoration of the specific platform environments
• Reports the status of recovery activities to the hot site coordination lead
• Works with the platform recovery teams to ensure proper restoration.

Application Support Lead
The application support lead's role and responsibilities include the following:
• Coordinates the activities of the application support teams to enable end user problem resolution and assistance throughout the recovery period
• Maintains communications with end users.

Voice Operations Team
The role and responsibilities of the voice operations team include the following:
• Provides the necessary voice operations support for the initial and ongoing needs of the recovery effort
• Provides the operational support required to generate and maintain the voice hardware and system software needed during recovery
• Establishes and maintains a voice communications network capability for the critical internal and external user groups.

Operations Recovery Team
The operations recovery team's role and responsibilities
include the following:
• Provides centralized coordination for all help desk requests
• Provides end user problem resolution and assistance throughout the recovery period
• Maintains communications with end users
• Communicates the prepared disaster statement
• Coordinates the setup and staffing of required operations at the hot site.

RECOVERY FACILITIES
The type of facility required for the DR operation must also be determined based on business requirements. A hot site is needed if fast recovery of data and connectivity is required and taking the time to actually rebuild the technology platform prior to recovery is not feasible. In the case of a hot site, hardware will already be on hand, and mobile computing resources and desk space for critical staff are available. The network is designed to be able to quickly connect all unaffected systems to the hot site, and telecommunications carriers are prepared to switch those capabilities to the hot site. The hot site is typically provided by a third-party service provider contracted by IT and provides these services on a subscription basis, governed by a contract. The subscription also typically covers periodic drilling of the DR plan using the hot site. Some corporations choose to designate one of their own locations as a hot site for the others; however, these locations must also be tested and drilled.
A cold site is used for build and recovery of data and connectivity in a situation where time is not as critical. Many DR plans use a hot site for immediate recovery of business critical systems and then move to a cold site to rebuild lower priority platforms. A cold site is much less expensive than a hot site, because it is really only providing a facility. This space must be outfitted at the time of need by the subscribing company, and the arrangement should include quick-ship agreements with vendors because there is no equipment on hand. This option is certainly less costly but, if used solely, significantly slows recovery time.
Whichever type of recovery facility is selected, choose a location that will likely not be affected by the same disaster, but that is still within a reasonable travel distance and time. The storage location for backups must be accessible within a reasonable time and effort and/or an arrangement must be in place for quick-ship to the recovery site. With respect to storage of the DR plan, keep a copy of the plan in several locations (e.g., company facility, recovery site, in possession of the DR lead).

MAINTAINING THE DISASTER RECOVERY PLAN
Once developed, the DR capability must be tested initially and then drilled periodically. Drills typically identify snags, which should result in updates to the DR plan. A drill doesn't always have to be a full-blown simulation of the actual process; there can be segmented drills (for selected portions of the technology/selected systems) at the DR location, and in some cases a "conference room drill" (one in which the process is walked through procedurally) can suffice. It is not recommended to ONLY perform these abbreviated options, however. Hot site contracts typically include several drills per year, of which the company should take advantage.
Some common (and easily avoidable) mistakes with respect to DR execution are such things as missing or forgotten software product keys, outdated contact information for key personnel or service providers/vendors, not assigning backups for DR team roles, and blank or corrupt backup tapes. One of the most frustrating mishaps is discovering that the DR plan was maintained in electronic form only and is, therefore, not available when needed.
One person should be assigned the overall responsibility for maintenance of the DR plan (normally the DR lead). The plan should be updated when drill results dictate a change, when there are system implementations or retirements, and when significant changes are made to systems that would affect their recovery method. The DR lead must maintain the plan master copy and ensure that all copies of the plan are the most recent version and that old versions are destroyed. Additionally, the DR lead must maintain any sensitive combinations, passwords, etc. that will be required during DR but cannot be put into the plan.

DEVELOPING A DISASTER RECOVERY PLAN
If you do not have a DR plan in your company, it is advisable to develop one. Steps to do so are as follows:
• Stakeholder support. Identify management stakeholders and gain support and funding by creating a business case for why it is needed. This can sometimes be a tough sell because DR is similar to insurance and it is sometimes difficult to imagine needing such a thing. Be persistent.
• Project requirements. After approval and support to proceed, gather uptime and recovery time requirements and technical requirements and constraints from the business and from IT subject matter experts.
• Project team. Form a team to define a plan to balance the recovery time requirements with relative priority and available resources, and use a risk-based approach to determine the overall recovery order.
• Gap analysis and remediation. Identify any gaps and remediate them.
• Disaster recovery plan. Draft the plan, review with stakeholders, finalize the plan, and conduct a drill.
Revise the plan as required.

CONCLUSION
This article discusses disaster recovery, business continuity, and contingency planning and how understanding and implementing these measures are important for the integrity and compliance of the systems in today's environment of technology and automation.
System owners and technology professionals should understand how these plans should be developed and when and how to exercise them. System owners should have a DR plan in place, and all team roles and responsibilities should be clearly defined. A company should have a functional plan that addresses all of the processes required to restore technology, an individual responsible for that plan, and a disaster response team at the ready.

REFERENCE
1. FDA, HHS, Code of Federal Regulations, Title 21 - Food and Drugs, Chapter I - Food and Drug Administration, Department of Health and Human Services, Subchapter A - General, Part 11, Electronic Records; Electronic Signatures.

ARTICLE ACRONYM LISTING
BC: Business Continuity
CP: Contingency Plan
DR: Disaster Recovery
EBC: Enterprise Business Continuity

ABOUT THE AUTHOR
Barbara Nollau, column coordinator, is director of quality services at Abbott Vascular. She is responsible for validations, reliability engineering, supplier quality, microbiology, and document management at Abbott Vascular. Ms. Nollau has 25 years of experience and increasing responsibility in pharmaceutical and medical device industries, spanning areas of manufacturing, quality assurance/compliance, and information services/information technology. Ms. Nollau can be reached at barbara.nollau@av.abbott.com.

Originally published in the Summer 2009 issue of Journal of GXP Compliance



System Definition: Defining the Intended Use for a System
By Robert W. Stotz, Ph.D.

INTRODUCTION
The author first met Mr. Chapman in June 1987 as a new member of the PhRMA's (formerly the PMA's) Computer Systems Validation Committee (CSVC), which had reconvened to address the source code issue [1] and eventually launch the "Staying Current" series of articles [2]. This was the start of a long-term collaboration between Mr. Chapman and the author that included a number of years on the CSVC followed by co-authoring, as part of the Parenteral Drug Association (PDA) Committee on Validation of Computer-Related Systems, the computer-related system requirements section of their Technical Report No. 18 [3] and subsequently an article [4] published in the October-November 1992 Journal of Parenteral Science and Technology entitled "Validation of Automated Systems - System Definition." The following is an update of that article.

BACKGROUND
The term associated with the document that defines the intended use for a system has become a confusing one because it depends upon individual and/or company preferences and the chosen lifecycle model. For the CSVC's System Development Life Cycle (SDLC) model [5] (Figure 1), defining a system's function and structure, i.e., system definition, is equivalent to intended use for both new and existing systems. In the PDA's Technical Report No. 18 lifecycle (Figure 2) the equivalent document is the Computer-Related System Requirements, in the GAMP 4 Guide for Validation of Automated Systems [6] (Figure 3) it is the User Requirements Specification, and in the Institute of Validation Technology's Proposed Validation Standard VS-2 [7] lifecycle, which is similar to the PDA's model, it is Functional Requirements.
Although variations exist, all versions of the lifecycle models such as those below involve the same fundamentals, are compatible with each other, and all have contributed significantly to understanding how to cope with defining requirements for systems operating in an environment subject to regulations. For the purpose of this article, system definition = computer-related system requirements = user requirements specifications = functional requirements = the intended use for a system.

THE NEED FOR SYSTEM DEFINITION
The following three paragraphs quoted from the original article have proven to be as true today as they were fifteen years ago:

An automated (computerized) system can be defined as an assembly of multiple units consisting of one or more microprocessors, and associated hardware and software that controls and/or monitors without human intervention a specific set of sequential activities such as a plant process,
laboratory function, or data processing operation. Defining that system in terms of its requirements (what the system must do) and specifications (how the system will meet its requirements) are the first, and probably most important, steps in building a quality system. Clear definition of requirements and specifications results in systems that are more straightforward to construct, easier to operate, better documented, and more reliable. As a result of being more reliable, they are easier and less costly to maintain. If outside vendors are involved, vendor/user relationships improve and vendors are better able to determine and meet user needs.

Figure 1: Upper Part of PhRMA's System Development Life Cycle. [Diagram: for a new computer system the path runs Define System Function/Structure, then Define Software, Develop Software, Verify Software in parallel with Design/Specify Hardware, Install Hardware, Qualify Hardware; for an existing computer system the path runs Define System Function/Structure, Qualify System, Review Operating Experience.]

Defining and validating automated systems require close teamwork and effective communications among many, diverse disciplines. This multidisciplinary team should include system users and others involved with its design and implementation, and subsequent maintenance. The eventual users of the system are often overlooked at the early planning stage of system development. This oversight often results in automated systems that are difficult to operate and costly to maintain.

The multidisciplinary team may consist of representatives from most, if not all, of the following disciplines: manufacturing, automation engineering, technology development, quality assurance and quality control,
information services, systems analysis, programming, and other software, hardware, and equipment consultants. Teamwork is essential since it is almost impossible for one person or one discipline to have all the expertise required to develop today's automated system and also assure its quality.

Figure 2: Upper Part of PDA's Life Cycle. [Diagram: step 1, Plan Validation Activities, draws on validation policies, a validation project plan, and validation SOPs; step 2, Define Computer-Related System Requirements, produces functional requirements, computer-related system requirements, and design requirements.]

Failure to adequately define the intended use of a system at the beginning of a project has been, and continues to be, universally recognized as the most frequent reason for failure involving computer system design and/or validation. In the 15 years that have elapsed since the original article published the importance of a multidisciplinary approach to clearly defining a system's intended use, it has become even more evident. For example, the following was excerpted from a 1995 article [8] on the subject:

The significance of system definition is acknowledged by the Food and Drug Administration. Indeed, when inspecting a computer-related system, FDA officials most often request system definition documentation, along with a project validation plan. In May 1993, Sam Clark, a former FDA administrator and an expert on national computer systems validation, reinforced this point. During a roundtable discussion of computer systems validation, he commented that failure to adequately define computer systems is the most common problem found in FDA inspections. Former FDA investigator Ron Tetzlaff agrees. In the second of a three-part series of articles [9] entitled "GMP Documentation Requirements
for Automated Systems," he stated that specifications are reliable predictors of GMP documentation problems. Tetzlaff went on to say that it may seem obvious that specifications should be complete and meaningful, but many firms have been unsuccessful in their efforts to define them. There are several reasons that this task is so difficult, including the many variables, diverse operations, and controls that can function independently or be interrelated.

Note: Tetzlaff defines specifications as written documents that clearly and completely describe what the system is supposed to do. Specifications apply to both hardware and software and describe applicable functions, requirements, and procedures. This definition is consistent with the term "system definition" used in this article.

Figure 3: GAMP 4 Basic Framework for Specification and Qualification. [Diagram: the User Requirements Specification is verified by Performance Qualification, the Functional Specification by Operational Qualification, and the Design Specification by Installation Qualification, with System Build at the base.]

The following FDA events (it is recognized that there have also been significant events in the international regulatory and professional organization sectors that have impacted the topic of system definition, but to keep this article to a manageable size, its focus is limited to FDA events) since the original article published in 1992 have further emphasized the importance of system definition:
• 21 CFR Parts 808, 812, and 820, Medical Devices; Current Good Manufacturing Practice (CGMP); Final Rule, published in October 1996.
• 21 CFR Part 11 became effective in August 1997; policy guide 7153.17 was issued in July 1999, followed by five Part 11 guidance documents in 2001/2002. The policy guide and five guidance documents were subsequently withdrawn in February 2003 and replaced in September 2003 with Docket No. 2003D-0060, Guidance for Industry, Part 11, Electronic Records; Electronic Signatures - Scope and Application.
• FDA published their systems-based inspectional program (Compliance Program Guidance Manual Program 7356.002) in February 2002, and in September 2004 a draft guidance, subsequently replaced by the final guidance in September 2006, both entitled "Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations," that defines the role of quality systems in the pharmaceutical current good manufacturing practice regulations. Both the draft and the final guidance were developed by the quality systems working group (now the Council on Pharmaceutical Quality) formed as part of the Pharmaceutical CGMPs for the 21st Century: A Risk-Based Approach initiative.
• FDA issued their new GMP initiative in August 2002 that described an increased focus on those aspects of manufacturing that pose the greatest potential risk, and their intent to integrate quality systems and risk management approaches into its existing programs with the goal of encouraging industry to adopt modern and innovative manufacturing technologies. The final report on the new initiative published in September 2004.
• Publication of several guides/guidances relevant to computer systems, such as "Design Control Guidance for Medical Device Manufacturers" in March 1997, "Off-The-Shelf Software Use in Medical Devices" in September 1999, and "General Principles of Software Validation; Final Guidance for Industry and FDA Staff" in January 2002.

Section 820.3(z) of the medical devices CGMP defines validation as "confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use can be consistently fulfilled," and 820.30(c), covering design input, states in part:

Each manufacturer shall establish and maintain procedures to ensure that the design requirements relating to a device are appropriate and address the intended use of the device, including the needs of the user and patient... The design input requirements shall be documented and shall be reviewed and approved by a designated individual(s). The approval, including the date and signature of the individual(s) approving the requirements, shall be documented.

A common error found in many system definition documents is a description of a system's capabilities, often extracted from vendor-provided information, rather than a definition of intended use. The impact of this type of error is particularly acute relative to Part 11 requirements when a system has extensive capabilities for generating or maintaining electronic records and/or utilizing electronic signatures and only a portion of these capabilities are intended to be used. The end result is wasted time and resources in extensively testing a system's capabilities rather than the portion of those capabilities that are intended to be used.
The Facilities and Equipment section of the Quality Systems Approach to Pharmaceutical Current Good Manufacturing Practice Regulations guidance states: "Under a quality system, the technical experts (e.g., engineers, development scientists), who have an understanding of pharmaceutical science, risk factors, and manufacturing processes related to the product, are responsible for defining specific facility and equipment requirements..." The Glossary section defines validation
as: "Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled."
The Quality Systems guidance also addresses outsourced services. The section titled Control Outsourced Operations states in part:

Under a quality system, the manufacturer should ensure that a contract firm is qualified before signing a contract with that firm. The contract firm's personnel should be adequately trained and monitored for performance according to their quality system, and the contract firm's and contracting manufacturer's quality standards should not conflict. It is critical in a quality system to ensure that the management of the contractor be familiar with the specific requirements of the contract.

Although the FDA's new GMP initiative and the final report on the new initiative do not specifically address defining the intended use of an automated system/equipment, their impact on system definition is obvious. One cannot focus on those aspects of manufacturing that pose the greatest potential risk without first defining the intended use of the automated system/equipment utilized in the manufacturing process.
The waterfall design process depicted in the Design Control Guidance for Medical Device Manufacturers shows the process proceeding in a logical sequence of phases or stages starting with user needs being incorporated into the design input. The guidance goes on to state:

Each design input is converted into a new design output; each output is verified as conforming to its input; and it then becomes the design input for another step in the design process. In this manner, the design input requirements are translated into a device design conforming to those requirements.

The guidance also states: "Development of a solid foundation of requirements is the single most important design control activity," and "If essential requirements are not identified until validation, expensive redesign and rework may be necessary before a design can be released to production."
Eventually the final medical device is validated against the user needs. As stated in the guidance: "Basically, requirements are developed, and a device is designed to meet those requirements." The Design Control Guidance for Medical Device Manufacturers also states that design input requirements fall into three categories, and virtually every product will have requirements of all three types:

• Functional requirements specify what the device does, focusing on the operational capabilities of the device and processing of inputs and the resultant outputs.
• Performance requirements specify how much or how well the device must perform, addressing issues such as speed, strength, response times, accuracy, limits of operation, etc. This includes a quantitative characterization of the use environment, including, for example, temperature, humidity, shock, vibration, and electromagnetic compatibility. Requirements concerning device reliability and safety also fit into this category.
• Interface requirements specify characteristics of the device which are critical to compatibility with external systems; specifically, those
characteristics which are mandated by external systems and outside the control of the developers. One interface which is important in every case is the user and/or patient interface.

The FDA guidance on Off-The-Shelf Software Use in Medical Devices provides a series of six questions, with additional questions following each of the primary six, to help define the basic documentation requirements for OTS software. The following is an adaptation of those questions that can be used as an aid in defining the intended use of OTS software.

1. What is it?
For each component of OTS software used, specify the following:
• Title and manufacturer of the software
• Version level, release date, patch number, and upgrade designation as appropriate
• Any software documentation that will be provided to the end user
• Why this OTS software is appropriate for its intended use.

2. What are the computer system specifications for the OTS software?
For what configuration will the OTS software be validated? Specify the following:
• Hardware specifications: processor (manufacturer, speed, and features), RAM (memory size), hard disk size, other storage, communications, display, etc.
• Software specifications: operating system, drivers, utilities, etc. The software requirements specification (SRS) listing for each item should contain the name (e.g., Windows 95, Excel, Sun OS, etc.), specific version levels (e.g., 4.1, 5.0, etc.), and a complete list of any patches that have been provided by the OTS software manufacturer.

3. How will you assure appropriate actions are taken by the end user?
• What aspects of the OTS software and system can (and/or must) be installed/configured?
• What steps are permitted (or must be taken) to install and/or configure the product?
• How often will the configuration need to be changed?
• What education and training are suggested or required for the user of the OTS software?
• What measures have been designed into the computer system to prevent the operation of any non-specified OTS software, e.g., word processors, games?

4. What does the OTS software do?
What function does the OTS software provide in the computer system? Specify the following:
• What is the OTS software intended to do? The design documentation should specify exactly which OTS components will be included in the design of the computer system. Specify to what extent OTS software is involved in error control and messaging in the computer system error control.
• What are the links with other software, including software outside the computer system (not reviewed as part of this or another application)? The design documentation should include a complete description of the linkage between the computer system software and any outside software (e.g., networks).

5. How will you know the OTS software works?
Describe testing, verification, and validation of the OTS software.
• Software test, verification, and validation plans should identify the exact OTS software (title and version) that is to be used in the computer system. When the OTS software is tested it should be integrated and tested using the specific OTS software that will be delivered to the end user.
• Is there a current list of OTS software problems (bugs) and access to updates?

6. How will you keep track of (control) the OTS software?
An appropriate plan should answer the following questions:
• What measures have been designed into the computer system to prevent the introduction of incorrect versions? On startup, ideally, the computer system should check to verify that all software is the correct title, version level, and configuration. If the correct software is not loaded, the computer system should warn the operator and shut down to a safe state.
• How will you maintain the OTS software configuration?
• Where and how will you store the OTS software?
• How will you ensure proper installation of the OTS software?
• How will you ensure proper maintenance and lifecycle support for the OTS software?

The FDA guidance on General Principles of Software Validation describes how certain provisions of the medical device Quality System regulation, which became effective in June 1997, apply to software and describes the agency's current approach to evaluating a software validation system. Validation of software is a requirement of the medical device Quality System regulation, i.e., Title 21 Code of Federal Regulations (CFR) Part 820, and applies to software used as components in medical devices, to software that is itself a medical device, and to software used in production of the device or in implementation of the device manufacturer's quality system. Although the guidance is directed at the medical device industry, it is based on generally recognized software validation principles and can, therefore, be applied to any software.
Section 2.4 of this guidance (Regulatory Requirements for Software Validation) states in part: "All production and/or quality system software, even if purchased off-the-shelf, should have documented requirements that fully define its intended use, and information against which testing results and other evidence can be compared, to show that the software is validated for its intended use." The guidance defines a requirement as "any need or expectation for a system or for its software," and goes on to state: "Requirements reflect the stated or implied needs of the customer, and may be market-based, contractual, or statutory, as well as an organization's internal requirements. There can be many different kinds of requirements (e.g., design, functional, implementation, interface, performance, or physical requirements). Software requirements are typically derived from the system requirements for those aspects of system functionality that have been allocated to software. Software requirements are typically stated in functional terms and are defined, refined, and updated as a development project progresses. Success in accurately and completely documenting software requirements is a crucial factor in successful validation of the resulting software."

DEFINING REQUIREMENTS
It should be clear at this point that the first and most vital step in defining an automated system is the definition of its requirements, i.e., its intended use. The requirements are the foundation for the system specifications and all subsequent design documents. One cannot prove that a system does what it is intended to do if just what it is intended to do has not been clearly defined. The requirements define what the system is to do rather than how it will perform a given task.
Definition of a system's requirements frequently begins with a preliminary concept of the required (and desired) functions of the new system. Through an iterative process with input from the system's users and others involved with the design and implementation of the system, the requirements are further refined in terms of required functions (needs or "musts"), desired functions ("wants"), data to be processed, design constraints, performance and documentation requirements,
and validation criteria. The desired functions or "wants" should be prioritized. The ability to understand both the activities being automated as well as the needs of the individuals or operators who will be using the system is necessary in defining the requirements. In many cases, these needs may not be known at the beginning of the project, but they must nevertheless be anticipated to the greatest degree possible.
A rigorous review and verification process is required in defining the requirements of a system that not only considers the needs of the end-user(s) but also includes a clear understanding of the operating environment that is to surround the proposed system. Configurations that might satisfy the requirements should be considered in terms of cost; availability of required technology, facilities, equipment, and effectively trained personnel; interface with current systems (e.g., enterprise resource planning, ERP); legal liabilities; etc. Prospective vendors can also be contacted for additional information.
Requirements can be developed using a top-down process. General requirements for the automated system are established first, and then more detailed requirements are developed. In large projects, defining the requirements of each logical entity may be required. A typical requirements document could contain the following: an overview of the project and its objectives; expected benefits; and financial, time, and manpower constraints. The requirements document should describe the required and desired control functions; sources and characteristics of the input data; data manipulation and output requirements; technical, electrical, and mechanical requirements; human interfaces; desired timetable for completion of important milestones in the project; and the basis for system evaluation and validation (i.e., a summary of the general approach to validation of the automated system). Each device and/or piece of equipment included in, or controlled by, the automated system should be described in the requirements document. Block diagrams or sketches that show the physical location of the components of the system are also helpful and should be included. The requirements document should describe the sequence, timing, and scheduling of operations.
The document should also include security requirements; safety considerations; specific hardware and software implementation requirements; and level of education, training, and experience of each person who will interact with the system. Personnel (i.e., in-house experts, consultants, etc.) required or available for each part of the project, and a description of environmental factors, should be included as well. Graphical information such as system flow charts and diagrams that show the impact of the new system on existing manufacturing functions and corporate databases is useful in communication of requirements. Definition of the requirements (intended use) for an automated system should not be taken lightly. The quality and ease of maintenance of the system depend on the care taken at this point in the planning phase of the project.
A typical requirements document [10] contains the following:
• Overview of the project and its objectives, expected benefits, as well as constraints caused by finances, time, and human resources
• Required and desired control functions
• Sources and characteristics of input data
• Data manipulation and output requirements
• Technical, electrical, and mechanical requirements
• Spare capacity
• Human/Machine Interfaces (HMIs)
• Schedule for desired completion of important milestones in the project
• Basis for system evaluation (in terms of performance requirements) and validation (a summary of the general approach to be used for validation of the system)
• Devices, equipment, and/or databases included in, or controlled by, the system
• Block diagrams or sketches showing the physical location of the components of the system.

Because the requirements document describes the sequence, timing, and scheduling of operations, it should also include the following:
• Security requirements
• Safety considerations
• Specific hardware and software implementation requirements
• Level of education, training, and experience necessary for anyone interacting with the system
• Personnel (e.g., in-house subject matter experts and consultants) required or available for each phase of the project
• Description of environmental factors
• Graphical information, such as system flow charts and diagrams, that demonstrates the impact of the new system on existing manufacturing functions and corporate databases.

All activities and functions controlled, monitored, or reported by the system, as well as their interrelationships and sequencing, should be identified in the requirements document. Allocate functions of the system to general types of hardware, firmware, and/or software. Make sure to rank the system's overall structure according to higher and lower level activities, the discrete functions making up each activity, and the interdependencies of the functions and activities.
The requirements document should also include flow charts and diagrams that translate requirements and project objectives into inputs, functions, and outputs. Diagrams should reference the source of each input and the destination of each output to indicate their relationships with system functions. The hierarchy of activities and functions should be clearly identified.

COMPONENTS OF THE REQUIREMENTS DOCUMENT
The Project Overview discusses the objectives and expected benefits of implementing the system, the nature of the project, the components of the automated activities, the amount and type of operational support needed, future requirements that might affect system design, and any standards and/or design constraints to which the system must adhere. This section should include alternate approaches that also would produce desired results, and methods of control, data acquisition, storage, reporting, and analysis.
The Scope of Responsibilities identifies hardware and services provided by the vendor, end user, and third party contractors. This section should contain the following:
• Processing requirements for signal conversion
• Control algorithms (i.e., the controlling actions of the system and the parameters to be controlled)
• Data manipulation necessary to support display or reporting functions
• Number and format of reports
• Archival of data and reports
• Application-specific programs that may be required (e.g., production or assay scheduling, batch recipes, assay methods, and production tracking)
• Required utility programs (those associated with, or used by, the operating system for back-up; the restarting of the system following an unplanned shut down; tools for configuring, programming, and editing; and diagnostic and troubleshooting aids necessary for maintenance of the system).

The Scope of Responsibilities also describes field hardware and human interfaces. Field hardware includes the following items (as well as their physical location and input/output requirements):
• Instruments (including intelligent instruments which provide early warning of potential failures and significantly reduce maintenance costs for the proposed system)
• Transducers
• Sensors
• Valves
• Activators
• Actuators (wired to the system).

Human interfaces encompass the following:
• Number of operators
Robert W. Stotz, Ph.D.

Figure 4
Requirements/Specifications and PhRMA's Lifecycle Model
(Flowchart; boxes include: User Requirements; Functional Description; System Definition; Contact Prospective Vendors; Define System Requirements; Functional Requirements; Design Requirements; Requests for Proposal; Vendor Selection; System Specification; Define/Specify Software and Hardware Specifications.)


• Quantity and type of data to be entered into the system
• Output to be displayed and/or printed
• Networking requirements (e.g., definition of communication protocols, polling response times, error recovery, and link redundancy) with other systems

Security addresses requirements for protecting against unauthorized use, levels of security, virus scanning, and logging of access to the system. Electrical and mechanical requirements include the following:
• Power sources and characteristics
• Maintenance of system operation during a power failure
• Atmospheric conditions at the site
• System operation hazards (e.g., electromagnetic fields; corrosive or explosive chemicals, gases, or dusts; or vibration)

Documentation specifies the documentation that a vendor is expected to provide. Generally, a vendor is responsible for all documentation until installation of the system. System qualification usually is executed on the installed system by either the firm or a third party, although the vendor may assist in the preparation of protocols and training of personnel.
Vendor documentation should be clear. For example, management of the firm using the system should have no difficulty explaining the documentation during the course of a regulatory agency inspection. In other words, end users must demonstrate a thorough understanding of the system's procedures and controls and a firm command of the quality of their finished product.
Training is performed to ensure the proper operation and maintenance of the system. Everyone who uses the system must be trained adequately, and this instruction must be documented. This section of the requirements document should outline the type and amount of instruction required, as well as the materials to be provided by the vendor.

Qualification/Validation Requirements define vendor qualification, system qualification before and after installation (i.e., factory and site acceptance testing, FAT and SAT), system support, and system evaluation and acceptance. Vendor qualification refers to the items incorporated in an audit or assessment of vendor operations, including:
• Successful market experience and awareness of applicable regulations in the industry where the system will be installed
• Financial stability
• Documentation of system or software development
• Adherence to software quality assurance standards and procedures
• Change and revision control
• Assurances of pre- and post-installation support

System qualification before installation (FAT) should identify the methods that will ensure that the purchased system meets, and is installed according to, specifications. In addition, it should detail the supporting documentation (e.g., installation, operator, and maintenance manuals) to be supplied by the vendor and the timeframe for providing this documentation.
System qualification after installation (SAT and/or installation and operational qualification, IQ/OQ) generally is the responsibility of the firm using the system. It should be noted, however, that there may be a need for vendor participation. Any requirements in this area should be outlined in the requirements document.
System support refers to requirements for ongoing vendor assistance with hardware and software employed for various reasons, including:
• Correction of problems
• Implementation and testing of changes
• Warranty periods
• Availability of spares
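For illustration only - this sketch is not part of the original article - requirement items like those enumerated above are easier to manage, and later to trace to specifications and acceptance tests, when they are captured as structured records. All identifiers, field names, and example text below are hypothetical.

```python
# Illustrative sketch only: capturing requirements-document items as structured
# records so each one can later be traced to a specification and a test.
# All IDs, field names, and example text are hypothetical.

requirements = [
    {"id": "URS-014", "category": "Security",
     "statement": "Access to the system shall be restricted to authorized, trained users.",
     "spec_ref": "SPEC-031", "test_ref": "OQ-007"},
    {"id": "URS-022", "category": "System support",
     "statement": "The vendor shall correct reported problems within the agreed warranty period.",
     "spec_ref": "SPEC-045", "test_ref": None},  # verified by vendor audit rather than by a test
]

def untraced(reqs):
    """Return the IDs of requirements that lack a specification or test reference."""
    return [r["id"] for r in reqs if not r["spec_ref"] or not r["test_ref"]]

print("Requirements needing attention:", untraced(requirements))
```

Kept in this form, the list can be filtered by category when assembling the requirements document and checked for gaps before acceptance testing begins.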


In system evaluation and acceptance, the formal mechanism for judging the performance of the new system and the minimum requirements for acceptance should be identified.

Requirements versus Specifications

Each of the above lifecycle models (Figures 1-3) shows two separate and distinct steps in defining the attributes of an automated system. The first step defines the system's requirements and the second its specifications. Although the level of detail can vary, the requirements must establish the criteria for system design and testing, while also allowing for flexibility in the selection of specific hardware, software, and vendors. On the other hand, specifications provide highly detailed definitions of specific hardware components and their functions, software considerations, and the system's interaction with its operating environment, i.e., specifications define in detail how the system will meet the requirements described in the requirements document.
Figure 4 shows the lifecycle relationships and separation between requirements and specifications using PhRMA's SDLC model. The process of system definition starts with a high-level description (User Requirements) of what the new system must do to be acceptable for its intended use. Depending on the complexity of the new system, a narrative description of its intended use (Functional Description) may be extracted from the User Requirements and used to solicit information from prospective vendors on systems, technologies, and/or system components (hardware and software) that could be utilized in the development and construction of the new system. Subsequently, this information can be formulated into the Functional Requirements (i.e., prioritized required and desired functions) and Design Requirements (i.e., the new system's architecture, its operating environment, design and/or software development standards to be followed, etc.) sections of the System Requirements document. The System Requirements document, in conjunction with the selected vendors, is then used to generate a separate System Specifications document.
Despite the above discussion, experience has shown that the definitions of requirements and specifications are often incorrectly combined into a single document. Done sometimes by design and other times through the evolution of the requirements document, this practice often results in oversights of user needs and the mixing of requirements with specifications. If all or a major part of the automated system is supplied by outside vendors, more the rule than the exception with today's more complex systems, a separate requirements document is required to convey user requirements. Providing a detailed specifications document to potential vendors may lead them to rule out some viable solutions or attempt to satisfy the specification with an expensive, customized system.

CONCLUSION

Defining an automated system in terms of its requirements, i.e., what a system is intended to do, is the first, and most important, step in building a quality system. A clear definition of requirements, and specifications based on these requirements, results in systems that are more straightforward to construct, easier to operate, better documented, and more reliable. These systems are subsequently simpler and less costly to maintain, and vendors are better able to determine and meet user needs.
The development of a system's requirements (intended use), as well as its specifications, is an iterative process that requires effective communication among diverse disciplines. Too often the system user either is neglected or fails to participate adequately in these phases of the project. Invariably, the result is an inferior system which is difficult to learn, confusing to use, and expensive to maintain.
Although defining system requirements and system specifications are closely related, they should be defined at two distinct points in the lifecycle. This two-step process may seem lengthy, tedious, and simply not worth the extra effort; however, taking these additional steps consistently proves to be time well spent, making validation a value-added process rather than an


unending series of costly events.
The importance of clearly defining the function and structure of an automated system in terms of its requirements cannot be overemphasized. The time spent in the early planning stages of a project will save hours in the subsequent design, implementation, and maintenance of the system, when the cost of correcting or adding a feature grows exponentially.

REFERENCES
1. Chapman, K.G., J.R. Harris, A.R. Bluhm, and J.J. Errico, "Source Code Availability and Vendor-User Relationships," Pharm. Technol., 11(12), 24-35 (1987).
2. PMA's Computer Systems Validation Committee, K.G. Chapman and J.R. Harris, principal authors, "Computer System Validation - Staying Current: Introduction," Pharm. Technol., 13(5), 60-66 (1989).
3. Technical Report No. 18, "Validation of Computer-Related Systems," PDA Journal of Pharmaceutical Science and Technology, Supplement 49(S1) (1995).
4. Stotz, R.W. and K.G. Chapman, "Validation of Automated Systems - System Definition," Journal of Parenteral Science and Technology, 46(5), 156-160, September/October 1992.
5. PMA's Computer Systems Validation Committee (CSVC), "Validation Concepts for Computers Used in the Manufacturing of Drug Products," Pharm. Technol., 10(5), 24-34 (1986).
6. International Society of Pharmaceutical Engineering, GAMP Guide for Validation of Automated Systems, GAMP 4, December 2001.
7. "Proposed Validation Standard VS-2: Computer-Related System Validation," Journal of Validation Technology, 6(2), 502-521 (2000).
8. Stotz, R.W., "System Definition: The Oft Neglected Life Cycle Module, Part 1," Journal of Validation Technology, 1(3), 28-32 (1995).
9. Tetzlaff, R.F., "GMP Documentation Requirements for Automated Systems: Part II," Pharm. Technol., 16(4), 60-72 (1992).
10. Stotz, R.W., "System Definition: The Oft Neglected Life Cycle Module, Part 2," Journal of Validation Technology, 1(4), 24-29 (1995).

Article Acronym Listing
CFR: Code of Federal Regulations
CGMP: Current Good Manufacturing Practice
CSVC: Computer Systems Validation Committee
ERP: Enterprise Resource Planning
FAT: Factory Acceptance Testing
FDA: Food and Drug Administration (U.S.)
GAMP: Good Automated Manufacturing Practice
HMI: Human/Machine Interface
IQ: Installation Qualification
OQ: Operational Qualification
OS: Operating System
OTS: Off-The-Shelf
PDA: Parenteral Drug Association
PhRMA: Pharmaceutical Research and Manufacturers Association
PMA: Pharmaceutical Manufacturers Association
RAM: Random Access Memory
SAT: Site Acceptance Testing
SDLC: System Development Life Cycle
SRS: Software Requirements Specification

ABOUT THE AUTHOR
Robert W. Stotz, Ph.D., has more than 28 years experience in the pharmaceutical and healthcare industry, and is President of Validation Compliance Inc. (VCI) located in Exton, Pennsylvania. Dr. Stotz accumulated more than 11 years experience at The Upjohn Company (now Pfizer) in Kalamazoo, Michigan USA, culminating as Validation Manager for Upjohn's worldwide validation efforts, and nearly seventeen years in the validation services industry.
Dr. Stotz works with many multi-national pharmaceutical and healthcare manufacturers in all aspects of operations (particularly computer systems) and validation, from concept through to system/facility qualification and start-up. He has been actively involved with validation issues for more than twenty-seven years and was a member of the Pharmaceutical Research and Manufacturers of America's (PhRMA's, formerly PMA's) Computer Systems Validation Committee for several years. He was also a member of the PDA's Computer Validation Committee that published PDA Technical Report No. 18 on Validation of Computer-Related Systems, and has presented and published several papers on the subject of validation. Dr. Stotz holds a doctoral degree from the University of Florida and B.S. and M.S. degrees from the University of Toledo. He can be reached at (610) 594-2182.

Originally published in the Autumn 2007 issue of Journal of GXP Compliance


Lessons Learned in a Non-Regulated Software Validation Project
By Brian Shoemaker, Ph.D.

INTRODUCTION

Not all software validation projects are created equal.
Though this statement is obvious to any team leader or consultant who has executed software validation in an FDA-regulated company, it can be difficult to grasp for managers more concerned with production schedules and challenges than with validation, and especially for managers in companies not directly regulated by the FDA. Risk can be higher or lower; team understanding of the need to validate can be greater or less, and technical specifics of the software in question may present a variety of challenges.
Because of these differences, every validation project presents a unique set of lessons. In the case described here, the company (call them "UnderOver Widgets") is not directly regulated by the FDA, but manufactures and supplies specialty products to a number of medical device companies. Though UnderOver's customers are entirely medical device companies, their technology falls within a larger industry not subject to FDA regulations. Driving the validation project was the medical device customers' demand that UnderOver become certified to ISO 13485. UnderOver - the daughter company of a group in their manufacturing field and geographical area - used software extensively in their manufacturing processes and quality systems. Many of the applications had been inherited or adapted when the company was spun out from its parent; but validation (a requirement specifically called out in ISO 13485) was to them a new concept.
This "fresh start" situation proved to be extremely fertile ground for a mutual learning experience. Key lessons, for the validation consultant as well as the UnderOver team, fell in three major areas: technical, organizational, and human-relations.

TECHNICAL LESSONS: VITAL, BUT ONLY ONE COMPONENT

In software or IT projects, the devil is always in the details - and technical details provided some interesting lessons on this project.

Do Not Hesitate to Adapt Time-Honored Processes to Fit the Situation
Nearly all the applications covered in the UnderOver project had been in place from months to years; fifteen of the twenty-one (listed in Figure 1) had been developed in-house.
The project plan was to follow the standard "V" model (Figure 2) to the extent appropriate or possible. Where applications were clearly off-the-shelf (such as the gauge calibration tracking program), validation would consist of developing end-user requirements and tests to demonstrate those requirements.
The in-house applications were another matter. No development documentation existed for any of UnderOver's custom software. Something would need to be created in the middle of the "V", between User Requirements and acceptance testing. Complete development documents would


Figure 1
List of Applications in the UnderOver Project
(Columns in the original table: Name; Function; and a checkmark under Access, VBasic, Notes, OTS, or Other for each application.)

Devns - Product deviation approval
ProdQC - Product QC test specifications and results
Labels - Barcode label printing (production machines, scrap codes)
CO_Sys - In-house change order application
RD_Track - New product development tracking
Com_ERP - Commercial ERP
Prod1Spec - Product type 1 manufacturing sheet system
TraceRetrieve - Product lot traceability application
DocIndex - Document master index
Tester1 - Test instrument station software
GaugeCal - Gauge calibration tracking
Plan_Sampl - QC sampling planner
ProdSched - Production scheduling
Tester2 - Product testing data acquisition
Training - Employee training database
ManuFlow - Barcode-enabled shop floor workflow
ProdSys - Production setup sheets: revision/approval, lot-specific printing
Patterns - Pattern design (output files placed directly on production eqpt)
Prod2Spec - Product type 2 manufacturing sheet system
TraceInfo - Enter key information for product-lot traceability
PP_Sched - Schedule and track post-production processing of product lots


Figure 2
The Standard V Model

be unnecessary and impractical, since the applications were already in use. The decision was to document the "as built" design of these applications, to serve as a baseline for subsequent change control.
Because thirteen of the custom applications were either Microsoft Access or Lotus Notes databases, documenting their design required more than an annotated code listing. Fortunately, several tools will generate complete reports of all tables, forms, queries, reports, modules (program code), and macros (if any) in an Access application; a similar utility exists within the Lotus Notes development environment. These outputs could be automatically generated and printed to PDF, and archived to capture the complete design of the database applications.
It also proved necessary to adapt the concept of installation qualification (IQ), to provide useful information for this project. The applications were already in place, so a detailed procedure to confirm that they were being installed correctly would have no meaning. However, documenting the specifics of how and where the applications were installed would have considerable value for future software maintenance. Instead of performing IQ, a so-called Configuration Specification was created for each application, to document anything a programmer or information technology (IT) specialist would need to know in order to reinstall, maintain, or decommission the application. Figure 3 lists examples of the types of information to include in these Configuration Specifications.
In both the "as built" design documentation and the Configuration Specifications in place of installation qualification, this project bent the classical validation products - and in so doing, fulfilled the project purpose.

Be Ready to Delve into Technical Specifics, even if these Should Be the Province of Developers and Architects
Consider the shop-floor workflow application (dubbed ManuFlow for this discussion). This system consisted of an off-the-shelf container


Figure 3
Information to Include in Configuration Specifications
• Version of any underlying system (Access, Excel, Lotus Notes)
• Configuration options (where applicable, such items as default file-save directories, user security settings, or compatibility switches)
• Computer or network location where the application is installed
• Data files or database tables the application reads or writes
• External data required for security (if applicable)
• Database links, if any, needed for the application to function
• Resources required on the user's station (e.g., client-side program, browser plug-in, mapped drives, or shortcuts)
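As a purely illustrative sketch (not taken from the project itself), the Figure 3 items map naturally onto a simple record that can be completed for each application and held under change control. The application name, field names, and example values below are hypothetical.

```python
# Hypothetical Configuration Specification record based on the Figure 3 categories;
# the application name and all example values are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConfigurationSpecification:
    application: str                      # name of the application being documented
    underlying_system_version: str        # version of Access, Excel, or Notes it runs on
    configuration_options: List[str]      # default directories, security settings, switches
    install_location: str                 # computer or network location of the application
    data_sources: List[str]               # files or tables the application reads or writes
    security_data: str = ""               # external data required for security, if any
    database_links: List[str] = field(default_factory=list)
    user_station_resources: List[str] = field(default_factory=list)  # clients, plug-ins, mapped drives

example = ConfigurationSpecification(
    application="ExampleTracker",
    underlying_system_version="Microsoft Access 2003",
    configuration_options=["default file-save directory on the shared quality drive"],
    install_location=r"\\fileserver\quality\ExampleTracker.mdb",
    data_sources=["tblItems", "tblHistory"],
    user_station_resources=["Access runtime", "mapped drive Q:"],
)
```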

with drivers for barcode scanners and a desktop interface, but within which any given company had to build its own suite of workflow scripts. The container was a commercial off-the-shelf application, but all the functionality of the system resided in the custom-developed scripts. Complete User Requirements for these scripts were virtually impossible to build from user interviews, since (a) the manufacturing floor users were too close to the functioning system, and had difficulty expressing what the ManuFlow application should do; and (b) the scripts, which had been inherited from the parent company, underwent extensive revision (in part to clean out unused scripts and code) in the course of the project.
Determining both the user requirements and overall design of the workflow script suite became an exercise in reverse engineering. The IT Director provided automatically-generated flowchart diagrams for all of the scripts. From these diagrams, the connectivity of the scripts could be determined (which ones comprised the main menu, which ones were called by the main selections, and so on down the hierarchy - see Figure 4), and the general actions occurring in each script could be puzzled out. The developer provided brief synopses of the scripts, but deducing the important logic tests and user inputs required studying the flowcharts in detail. This kind of down-in-the-code study is not typically expected of a validation consultant, who may or may not be familiar with the program language, but for this project it was vital.

Help Solve Specific Technical Issues Where Necessary
Several times in the course of the UnderOver project, it was necessary to help the project team see that a certain output was manageable, and not some insurmountable obstacle.
Creating design documentation was a prime example. The UnderOver team leader at first quailed at the task of documenting design of the Access databases. After researching Access documentation tools, it was possible to recommend several possibilities, and to list the essential information such a tool would need to provide. With these suggestions, what seemed unattainable became a fairly straightforward task.
Once the Access examples had been generated, the Lotus Notes developer could see the type of information that would be needed, and employed built-in developer tools to create equivalent outputs for the Notes applications.
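For illustration only, and not the documentation tool actually recommended in the project: the table-level portion of such an "as built" report can be generated for an Access database through ODBC. The sketch below assumes the pyodbc package and the Microsoft Access ODBC driver are available; the file path is hypothetical, and forms, queries, reports, and code modules would still require an Access-side documenter.

```python
# Rough sketch: dump table and column structure from an Access database via ODBC
# as a starting point for "as built" design documentation. Illustrative only.
import pyodbc

def dump_structure(mdb_path, report_path):
    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + mdb_path
    )
    cursor = conn.cursor()
    with open(report_path, "w") as report:
        for table in cursor.tables(tableType="TABLE"):
            report.write(f"Table: {table.table_name}\n")
            for col in cursor.columns(table=table.table_name):
                report.write(f"    {col.column_name}  ({col.type_name})\n")
    conn.close()

if __name__ == "__main__":
    dump_structure(r"C:\apps\ExampleTracker.mdb", "ExampleTracker_structure.txt")
```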


Figure 4
Connectivity of the ManuFlow Scripts

Learn the Client Systems, and Adapt to the Client's Tools
All software in the project in some way affected the design, manufacturing, or quality control of UnderOver products. Systems involved many different Access databases; some were free-standing, but others interacted with the resource-planning system (labeled Com_ERP in Figure 1). Scheduling affected work-order creation; product work-orders would drive demand for intermediates made from the incoming material, and so on (Figure 5).
The Lotus Notes applications were similarly interdependent: at the end of a development project, RD_Track could create a change order in CO_Sys to release the new widget into production, and preliminary setup sheets generated in RD_Track could be ported directly into ProdSys, as shown in Figure 6.

Do What Is Necessary to Understand Fundamentals of the Client's Technology
Inevitably, understanding the software required comprehending the terminology and the manufacturing processes. Learning about the manufacturer's processes proved to be one of the project's more intriguing challenges. Incoming material arrived in units with one name, and had to be repackaged to an intermediate form for final


Figure 5
Interdependence of ERP / Access Databases
(Diagram of the links among ManuFlow, Plan_Sampl, Com_ERP, ProdSched, and PP_Sched.)

production, using a setup with a specific name. In the manufacturing process, a specifically-named action indicated both the process of removing product from the machine, and creating a lot. Figure 7 depicts the overall flow of UnderOver's production processes. Each form and each process had a unique name, specific to the industry.
Besides the technology of UnderOver's product, more mundane terms provided a chance for misunderstanding. Where many companies maintain standard operating procedure documents they call SOPs, UnderOver's quality system consisted of Quality System Procedures and Work Instructions - mention of SOPs proved confusing. Once this difference was discovered, project plans and weekly reports were modified to refer to quality documents using UnderOver's terms.
Though the technical lessons were helpful in moving the project forward, it was clear from the earliest requirements discussions that more than technical learning would be necessary for success.

ORGANIZATIONAL LESSONS: HELPFUL FOR THE PROJECT AND BEYOND

Several lessons from the UnderOver project, here called organizational lessons, fell more in the realm of planning, tracking, and communication.

From the Beginning, Be Aware of What Is Driving the Validation Project in this Context
Although UnderOver is not subject to FDA regulation, its medical device customers are. Everything in the project - documenting software requirements and design, establishing software change control processes, creating a problem-reporting mechanism - could be justified for classical process improvement, and in the long run, process improvement and its effect on efficiency and product quality were the real reasons for the project. However, behind all these good reasons was a single immediate motivation. The customers had demanded ISO 13485 certification,


which spells out validation of software. UnderOver's business was at stake, and with it the team members' jobs.

Figure 6
Interdependence of Lotus Notes Databases
(Diagram: at the end of a development project, RD_Track generates a new product change order in CO_Sys and a new product setup sheet that feeds ProdSys.)

Develop a Clear Plan and Show Progress against it
A written plan, where the team agreed on the applications to be included and on the roles of the members, started off the project. All training sessions included a flowchart of the overall project, with the current point clearly indicated (Figure 8). At project's end, the final report referenced this initial plan, noting not only what had been completed, but also the deviations (applications removed and added, and approaches modified in the course of the validation).
Though it was helpful to provide high-level reports, after a few weeks, several team members asked to see a more complete overview of progress (a component of the next lesson).

Communicate Often
Keeping the team members informed was essential throughout the project. How far the work had come and how far it had yet to go, when the next site visit was scheduled, what issues needed to be addressed: all were featured in a brief weekly update. After completing several User Requirements documents, a table was added to the weekly report to list the applications being validated and the status of each document (requirements, configuration specification, design, test procedures). A glance at the table quickly told team members both the work which had been completed and the tasks still ahead.
Eventually, the UnderOver team leader asked that the company president be copied on these weekly reports. This visibility to top management as well as to the project team helped immensely in planning activities, keeping focus on the issues, and allowing team members to see and celebrate how far they had come.

Educate, Educate, Educate
A Computer System Validation (CSV) training session started the project, and a refresher session followed at the eight-week time point. As team members came into the project, they reviewed these presentations. Just prior to the team executing their software test procedures, still another training session provided them specifics of how to perform and document the testing, while reminding them of the project's basis and of the overall plan.
These training sessions were helpful, but far from sufficient. Team members, though dedicated, were simply not accustomed to the extent of documentation required by the standard and by their customers. Every conversation with one of the team members became an opportunity to explore questions and clarify the need for validation - and these one-on-one discussions proved essential in helping the team reach the right comfort level with the project.

Trust, but Verify
This lesson is unique to the testing phase, but definitely not unique to UnderOver's project. No matter how carefully the instructions and the expected results are described in a test procedure, it is always possible to misinterpret them.
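The review of test discrepancies described here (and tabulated later in Figure 9) is easier when each executed step records both the tester's initial call and the reviewed disposition. The sketch below is illustrative only and is not taken from the project; all field names and values are hypothetical.

```python
# Hypothetical structure for documenting an executed test step, including the
# reviewer's disposition when an initial "fail" is later judged to be a
# requirements or tester error. Illustrative only; field names are invented.
test_step = {
    "application": "ExampleApp",
    "section": 4,
    "step": 7,
    "instruction": "Print the selected subset of barcode label codes.",
    "expected": "Only the selected codes are printed.",
    "actual": "All codes on the list were printed.",
    "initial_result": "fail",
    "review_result": "pass",          # e.g., requirement judged erroneous on review
    "review_comment": "Requirements error; selecting a subset is not required.",
}

def final_disposition(step):
    """The reviewed result, if one was recorded, supersedes the tester's initial call."""
    return step.get("review_result") or step["initial_result"]

print(final_disposition(test_step))   # -> pass
```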


Figure 7
Overview: Phases of UnderOver's Manufacturing
(Flow diagram: Incoming material → Intermediate (Preparation) → Rough Product (Manufacturing) → Finished Product (Post-processing).)

In a number of cases, the tester believed that actual results disagreed with expected results and had to be marked as "fail"; some of these were true failures, some resulted from performing the test incorrectly, and some showed that the corresponding requirement was erroneous or had not been implemented. Figure 9 lists several cases where an initial "fail" result was changed to "pass" (or "not applicable") on review. Of twenty-five cases marked "fail," only eight remained after review (of course, all were explained in the project final report).
Happily, the UnderOver tests did not include any cases where the tester counted a result as a "pass" but in fact the result should have been marked "fail." These have arisen in other projects, and are often very difficult to communicate to the tester or developer (to the point of causing disagreements within a project team).

INTERPERSONAL LESSONS: DIFFICULT TO DEFINE, BUT VITAL TO SUCCESS

Computer software is working machinery created from pure "thought stuff" - both ethereal and practical, but completely objective. Human beings need to use that machinery, however - and working with a team of human beings on the task of showing that the thought stuff is correctly built, taught several powerful lessons in relating to human beings.

Learn How to Listen
This project team taught the validation consultant some crucial listening skills. During the very first site visit, as key users described the Prod2Spec application (see Figure 1), important statements for the User Requirements emerged either as logical conclusions or as implicit assumptions. Restating these apparent requirements ("So what you're saying is that the program needs to keep track of XYZ - is that correct?") allowed refining the points that would be documented. In only one case could the person interviewed express an application's user requirements without assistance - and that person had developed the application.
Simply listening also contributed to the testing phase. More than one team member asked for help while executing a test procedure; these requests identified a number of script errors or true application failures.


Figure 8
The UnderOver Project Plan, in Flowchart Format
(Flowchart of steps: Systems Inventory → Finalize Validation Plan → Applications List and Risk Analysis → Conduct CSV Training → Document User Requirements, Risk, and Configs (with Review/Revise) → Develop Tests and Traceability Matrix → Perform Tests; Document Issues/Results → Develop SOPs → Assemble Project Closure Report. Deliverables shown include the Validation Plan, Requirements/Configs, Tests and TM, Training Record, Test Results, SOPs, and Closure Report.)

In one case, the frustrated tester working on the ProdSched test pointed out a confusing section, talked through a series of steps, opened the dialogs described in the test, and answered her own question without the test coordinator ever saying a word! Her comment: "I guess all I really needed was to have you listen to me."

Each Individual on a Small Team Has a Different Communication Style
Some team members could write a requirements document with very little input. Others could describe their application verbally and provide nearly enough information to deduce the requirements. Still others had to show the user interface, live or as screen captures, in order to explain what they expected the program to do for them.
Similarly, some could read the numbered User Requirements and judge whether these were correct; others needed to talk through the document with reference to the program itself, to provide feedback. Each application, and each team member, required a different type of communication to make sure the documents were correct - and in the end, those few errors which remained either in the User Requirements or in the test documents were the result of still less-than-perfect communication.

Show Patience as Timeframe and Priorities Change
Throughout the project, the UnderOver team was clearly stretched. The ten participants were nearly the entire salaried staff at the company; all of them needed to address not only their everyday work but other priorities in addition to software validation.


Figure 9
List of Initial Test Failures and Reasons for Changing to "Pass"

Applicn | Sec | Step | Reason | Comment | Revise
Labels | 4 | 7 | No way to select just eight codes; printing the codes gives everything on the list. | Requirements error. | Pass
CO_Sys | 4 | 15 | Rejected change notice is not automatically closed. | Requirements error. | Pass
CO_Sys | 4 | 41 | After a change order is approved, a user cannot edit a task assigned to another user. | Requirements error: only the approver should be permitted to make such modifications once the change is approved. | Pass
ProdSys | 4 | 8 | Production type 1 setup sheet was created, but data were not pulled in from sample request. | Tester error. Using a different command successfully created the setup sheet with the sample request data. | Pass
DocIndex | 1 | 2 | Work Instruction for DocIndex needs to cover some topics in more detail. | Changed to Pass with comment that more exact detail was needed in the work instruction. | Pass
Tester2 | 3 | 4 | Application does not give option to print a single result or results for a specific sample. | Requirements error. The "all results" option is not essential to the application. | Pass
Training | 4 | 12 | Cannot change employee ID. | Requirements error. Each employee has a unique ID, which cannot be changed. A new ID is a new employee. | Pass
TraceInfo | 4 | 5 | Application permits saving incoming material lot with empty vendor ID. | Script error: instead of getting an error message, program gives a blank screen and does not allow proceeding any further. | Pass
PP_Sched | 3 | 14 | Validity of stockroom/bin location is not verified by the program. | Requirement not implemented in current release of the program. | N/A
PP_Sched | 5 | 9 | Test was intended to show that deleting a work-order (not yet started) from PP_Sched also deletes the corresponding manufacturing order in ERP. | Requirement not implemented in current release of the program. | N/A


Customers visited the plant, quality audits needed to be performed, production reviews were held on a regular schedule, and other areas of training had to be addressed. One team member's husband fought a losing battle with cancer through most of the project, and died just as the testing was to begin.
Against this backdrop, the initial project timeline proved unworkable. No amount of chastising via email would help team members provide timely feedback on requirements documents or test procedures, and site visits were only practical every few months. Indeed, roughly six months after project kickoff, all work on the project ceased for six weeks.
Friendly prodding sometimes yielded results, but when the validation project stalled, the only reasonable response was patience. The team's silence did not mean that the project had been abandoned - rather, that other issues had taken the foreground for a time. Patience and confidence paid off; when the project resumed, the team's focus was even sharper than before.

Lead the Team Where Necessary, but Let the Client Team Leader Address Internal Issues
An internal manager led the UnderOver team, providing resources where needed and keeping the top management apprised of the progress. On technical and organizational issues - how to organize the requirements documents, whether a test procedure would be workable, outlining the needed procedural documents, updating the project schedule and keeping all informed of progress - the validation consultant worked directly with UnderOver team members, and in time with the contract software developers. In nearly all cases, this direct interaction worked exceedingly well.
No project is free from snags, however. One application proved even more difficult to characterize than the rest, perhaps because there had been no opportunity to meet the keeper of that program in person at the outset of the project. Early information was helpful - a flowchart, a number of screenshots, some amount of explanation - but filling in the holes became problematic. When obtaining information became difficult and communication strained, work on this application was set aside for several months. The validation consultant could only use collegial influence, and had no way to coerce this individual or the programmer, so the matter was referred to UnderOver's internal team leader.
Getting cooperation took time, but referring the issue to internal management was precisely the right choice. Beginning with a surprise telephone call one Friday afternoon, the floodgates opened: screen captures arrived, questions were answered, and this corner of the validation project was back in motion.

THE ULTIMATE SUCCESS: LESSONS LEARNED ON BOTH SIDES

From start to finish, the UnderOver project took roughly nineteen calendar months (bearing in mind the hiatus mentioned above). During that time, several new employees came on board at UnderOver, several applications underwent major changes, and specific programs were added to and removed from the project.
Validating these twenty-one programs (Figure 1) helped the UnderOver team members see software not as a magic genie, always doing the master's bidding flawlessly, but as an engineered product, designed to serve a purpose but limited by the developer's fallible understanding. Getting down to basics - writing down what a program should do, then testing to be sure that the program works as intended - has encouraged these team members to watch for possible future errors. From there the software problem reporting work instruction gives them a mechanism for reporting those errors.
For the validation consultant, this project was at least as much of a learning experience. How to listen, how to persuade, how to determine different communication styles, and how to keep an entire team informed as a large project moved forward: all were skills the UnderOver interaction helped sharpen.

NOTE: Access, Excel, and Visio are trademarked products of Microsoft Corp. Lotus Notes (also called "Notes") is a trademarked product of Lotus Development Corporation.


ABOUT THE AUTHOR

Brian Shoemaker, Ph.D., is owner and principal consultant of ShoeBar Associates, which offers consulting services and training in computer system validation, software quality assurance methodology, and electronic records and signatures. He has been responsible for validation of software in a variety of FDA-regulated settings, from the embedded applications driving immunodiagnostics instruments to custom applications for clinical-trial data management. He has also designed and instituted quality systems for software development.
Dr. Shoemaker served CSS Informatics (previously PPD Informatics) as Quality Assurance Manager and later as validation consultant. His work revolved around clinical data management systems, clinical safety data systems, and software heavily used in the clinical-trials market.
Previously, Dr. Shoemaker was QA/Validation Manager at Doxis, Inc., a software company that provided flexible, 21 CFR Part 11 compliant, document-based data capture tools for operations such as manufacturing, packaging, or inspection in the regulated industry. As Systems Engineering Manager at Behring Diagnostics (previously PB Diagnostics; a manufacturer of clinical immunoassay systems), he was responsible for embedded-software validation, instrument design assessment, and interfacing with assay development, instrument manufacturing, field service, and quality-assurance groups. His awareness of software quality and validation issues began with his development of instrument interface and data analysis applications in support of his R&D work at Technicon Instruments, and earlier at Miles Inc.
Brian earned his Ph.D. in chemistry from the University of Illinois; he holds the ASQ Software Quality Engineer certification and can be reached at: bshoemaker@shoebarassoc.com.

Article Acronym Listing
CSV: Computer System Validation
ERP: Enterprise Resource Planning
FDA: Food and Drug Administration (U.S.)
IQ: Installation Qualification
ISO: International Organization for Standardization
IT: Information Technology
MS: Microsoft
OTS: Off-The-Shelf
SOP: Standard Operating Procedure

Originally published in the Autumn 2007 issue of Journal of GXP Compliance


Computer Validation in the New-Clear Age
Jacques Mourrain

ABSTRACT
Few would argue that the principles and processes of Validation (Big V) have undergone some transformative alterations over the last few decades, and more recently as we entered the 21st Century. In fact, it was in the very name of the 21st Century that the US Food and Drug Administration provided the nudge (1). In this article, I review what could be seen as the three great forces that today form the discourse (language, acronyms, assumptions) on Validation; three discernible influences or factors that are shaping the universe of Validation. For one, we can witness a move toward a more probabilistic as opposed to deterministic paradigm loosely articulated under the terms of a "risk-based" approach. Secondly, at least in the world of computer validation, the collision of Validation & Verification (V&V) with the x-Qs has opened a new space of dialogue between disciplines, which in the past did not have much occasion to talk. Finally, our third influence is the ever-changing landscape of quality system discourses: from total quality management (TQM) to capability maturity model (CMM). FDA's Quality System Initiative has provided a new framework from which to view Validation. As a consequence of these three forces, the field of computer validation has become a repository of poorly-articulated acronyms (FMEA, QbD, UAT), some hybrid expressions (Lean CMM), as well as a curious new-speak ("valudation" and "lean validation").
The concepts (ideology) and practices (rites/rituals) of validation have historically elicited fear and awe among the uninitiated. For those who have not experienced the rites of passage (redlining a P&ID), validation is perceived as an obscure (perhaps dark) science/séance. Today, we increasingly run the risk of promoting such misunderstandings, when in fact validation boils down to something quite simple. As the antidote to these speculative discourses, I propose, in this article, a return to the basics of Validation; basics that are primordial if we are to effectively navigate the new-speak of Validation: Validation in the new-clear age.

INTRODUCTION
In 2001, I wrote a "Hitchhiker's Guide to the Universe of Validation." It was a tongue-in-cheek introduction to the culture of validation: an ethnography of quality engineering. It was written for the uninitiated, that is to say, for those who had not yet had the pleasure of stringing thermocouples. The document highlighted, what seemed at the time, the crucial concerns of our industry: Part 11, not surprisingly, at the center of the mix.
But much has transpired in the field of validation since then, especially within the field of computer systems validation. For that reason I feel compelled to write or, more accurately, compile varying thoughts and opinions on the topic of computer validation.
The premise behind this collection of thoughts is that validation, today, is substantially different from what it was in the formative 1970s, or even more recently as practiced in the name of Electronic Records Electronic Signature (ERES, a.k.a., Part 11) compliance (2). Without anticipating too much of what follows, I can safely say that today we find ourselves at a crossroads whose historical outcome has yet to be written (3).
In fact, one can discern at least three influences or factors that are shaping our universe. The first such force is a move toward a more probabilistic as opposed to deterministic paradigm loosely articulated under the terms of a "risk-based" approach. Ever since FDA opened the door to a risk-informed approach to validation in 2003 and tied it (at least thematically, if not effectively) to GMPs for the 21st Century, there has been a shift from a structured, determined, causal worldview (design qualification [DQ] leads to installation qualification [IQ] which begets operation qualification [OQ] which...) to a genealogy of many worlds and parallel universes (4). Risk-based validation has come to embrace the quantum mechanical insight that you can't have both position and velocity, without some level of uncertainty: a new world where stochastic modeling (probability of failure) is a better gauge than causal, mechanical determinacy.
better gauge than causal, mechanical determinacy. The

ABOUT THE AUTHOR
Jacques Mourrain, Ph.D., is the Director of Corporate Compliance at Genentech Inc. in South San Francisco,
California. He can be reached via e-mail at mourrain@gene.com.


The second moment occurred when FDA sanctioned the V&V model and subsequently undermined the sanctity of the x-Qs (IQ, OQ, PQ, etc.) (5). This opened a new space of dialogue between disciplines and discourse (i.e., IT and Quality) that did not have much occasion to interact. As such, translation devices were created to build bridges between the names for things: for example, "regression" (in the IT sense) and "qualification" (not in the IT sense). As a consequence of this expanding universe, the lifecycle model has come to permeate the discourse of validation. The canonical terms of validation have slowly given way to the new-speak of user acceptance testing (UAT) and regression, where even the word "performance" in performance qualification (PQ) has suffered a shift. The third influence is the ever-changing landscape of quality system discourses: from TQM to CMM. FDA's Quality System Initiative, to which we can add Six Sigma and quality by design (QbD), has provided a new framework from which to view validation. In these terms, validation is conceived as a quality system as opposed to a qualification activity. This approach has expanded the scope of validation to include upstream development activities and downstream maintenance controls. To paraphrase a brainteaser from a 2002 FDA guidance document (5): the demonstration that a system is validated extends beyond validation in the strict sense of the term. Validation (Big V) exceeds validation (IQ/OQ/PQ activities, testing). Validation has been propelled beyond the strict sense of the term, and has obtained connotative nuances. The little v of testing (x-Q) has become the big V of quality control's process and procedure.
In summary, today we find ourselves at the crossroad of three great discourses: a theory of probability, a lifecycle model, and a systems theory approach. The conjuncture of these three idioms, I will argue, has not yet been completely and thoughtfully fleshed out. As a consequence of this incompleteness, the field of computer validation has become a repository of poorly-articulated acronyms (failure mode and effects analysis [FMEA], critical control points [CCP], UAT, QbD), some hybrid expressions (Lean CMM), as well as a curious new-speak ("valudation," "lean validation," and "risk-based validation") (6). The concepts and practices of validation have historically elicited fear and awe among the uninitiated. For those who have not experienced the rites of passage (redlining a process and instrument diagram [P&ID]), validation is perceived as an obscure science. Today, at this juncture we increasingly run the risk of promoting such misunderstandings (7), when in fact validation boils down to something quite simple.

WHERE'S THE BEEF? FAT AND LEAN VALIDATION
In the 1990s the buzzword was "streamlining validation," as if somehow the principles of aerodynamics and the coefficient of drag (Cd) could be applied to improve the performance of validation. Turnover packages, construction qualification, and factory acceptance testing (FAT) were designed such that repetition and redundancies (in verification and testing) could be avoided, and where overlaps could be leveraged. The prophecies were grand, the idea was simple: follow good engineering practices (GEP) and the area of validation will subsequently diminish (perhaps be totally eliminated). Was the implication that we had been following "poor engineering practices" (PEP) prior to this point in time? Expectations were high: 90% of validation would be performed by the vendor under the banner of FAT (8). The impulse to leverage development testing or FAT is deeply rooted. Arguments vary from the economic (high costs of validation) to the ubiquitous timeline imperatives. Some arguments are compelling in their simplicity: "Look, the equipment or system already works upon implementation, so why do we need additional testing?"
But project management is founded on the holy trinity Cost-Time-Quality; three factors caught in a universal balance of power. When one expands, the other two must adapt accordingly in order for the triad to maintain its integrity and for the universal balance to be equilibrated. In the past, the discourse on streamlining validation was often at the expense of the quality role, through benign neglect and silence. Today, perhaps ironically and prophetically, it is in the very name of quality that the discourse is re-surfacing. Since the principles of time and money have never swayed the regulatory agencies, it is reasonable that the idiom or principles of quality should intervene. In fact it is the very regulatory body (FDA) that has opened the door with its call to integrate quality systems and risk management into current manufacturing processes as the model for GMPs in the 21st Century. And the publication of ICH Q9 (Risk Management) (9) and Q10 (Quality Systems) (10) has reinvigorated the old cry to streamline validation.
But the method is no longer predominantly GEPs. Sound Scientific Principles (SSPs) are now the call to arms. Perhaps as amorphous and all-encompassing as GEPs, the SSPs are never defined. By SSP do we mean parsimony, Occam's Razor, Falsificationism (Vienna Circle), and gedanken experiments? For who would argue with science (besides creationists) as the basis for a demonstration and confirmation (a.k.a., validation)? We need to understand the process and the critical control points, we are told. We should monitor and control the parameters that impact quality. Define the design space. The new-speak of QbD-driven, lean-validation would have us believe that in the dark ages of validation séance, we were testing in a vacuum. Is the implication here that if you test against a design specification, you


have elevated the exercise to a scientific enterprise? If this new-speak of validation is to be more than a sound byte in the language game of obfuscation, it will need to be re-grounded in the foundational principles of validation.
The discourse on the science of quality and risk, specifically in the area of computer systems, has been supplemented by a third term: the lifecycle concept. Since the publication of the FDA General Principles of Software Validation (5, Section 3.1.2), it has been generally recognized that "a conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle." As such, the final conclusion that software is validated is grounded in a determination that is "somewhat broader than the scope of validation in the strict sense of that term" (5, Section 2).
In the strict sense of the term, validation has historically been understood as the three (or more) Qs: installation, operation, and performance qualification (IOPQ). IOPQ engineers have traditionally not ventured much into the realm of design. Although a few forays by validation into the domain of design and development have occurred, leading to such aberrations as construction qualification or design qualification packages, for the most part validation has been content to operate within its holy trinity of acronyms. But the quality systems approach to device software development, with its design review requirements, has slowly come to influence the rest of the FDA-regulated software development arena. The software development life cycle concepts, around for some time now in the software engineering disciplines (Institute of Electrical and Electronics Engineers [IEEE], Software Engineering Institute [SEI]), have come to frame much of the validation being performed today. In the biopharma industry, the good automated manufacturing practice (GAMP) model (11) has certainly influenced this direction with its v-model concept. And yet despite the history, despite the guidance, despite the principles, there continues to be an unhappy marriage between the V&V and the IQ/OQ/PQ approaches. While it is true that the scope of validation can no longer be confined to testing (the x-Qs) - and has expanded to cover upstream activities (design reviews) and downstream processes (maintenance) - this should not imply that, since validation (object little v) is everyone's responsibility, it will be absorbed in design. The conflation of FAT, or development testing, QC or verification with validation is bound to fail for one simple reason: testing is Janus-faced (12). FAT (or development testing) has a well-defined purpose: find problems before the product goes out the door. A successful exercise will find an abundance of issues that will be punch-listed and resolved. This testing faces inwards, towards itself, so to speak. For the exercise of validation (a demonstration that the system performs reliably), the goal is not to have issues surface at all. In fact, problems during validation are not bugs, they are called deviations. This is not simply a semantic sleight of hand intended to justify the accompanying paperwork. A problem during validation testing must be assessed regarding impact on any previous testing, the criticality of the problem regarding the business process (intended use) must be evaluated, and the root cause of the problem (for it might be the tip of an iceberg) must be investigated. Imposing this overhead during the development phase of a project, or conversely taking FAT (at face value) as Validation, would transform our Janus face into a schizophrenic. In fact this illustrates the classic definition of the double bind: find as many problems as you can while demonstrating that the system is reliable, robust, and problem free.
This, I fear, is the risk of under-estimating validation in the broad sense of the term, and of conflating testing with validation. When validation becomes sublimated in design and development, it risks becoming a parody of itself. As a consequence, one could easily imagine the emergence of validation tropes or styles. One example would be metonymic validation, where the part (partial regression) is taken for the whole (the validated state). Metonymic validation could be applied to application upgrades, by selecting functionality that is intended to represent the system as a whole. Perhaps another variation on the theme would be metaphoric validation. Here the terms (language and conditions) of validation are adopted to provide the allusion of the state of control. FAT, site acceptance testing (SAT), and turn over package (TOP) can be infused with the essence of this state with some minor rituals such as pre-approvals or quality reviews. These rituals bring with them a whole language game which transforms the mundane into the sacred. The key to a successful metaphoric validation is to maintain the vivid imagery (the validation effect) throughout the implementation lifecycle and to represent change and flux (breakdowns as breakthroughs) as the underlying substratum of a stable foundation.
In fact, it will not be long before the parody of validation is confused with the act of validation proper: a simulacrum, more real than reality itself. We will know when validation has become truly post-modern when the demonstration that a system satisfies its intended use is achieved by simply pointing to the absence of evidence to the contrary; or when the existence or presence of an installed application is merely confirmed through the existentialist cry "I am here" (a.k.a. the splash screen/scream). That will be the day when distinctions between retro- or pro-spective give way to the post-spective (or

At the risk of adopting such an apocalyptic tone, I believe the parody of validation is where we are headed today under our three great constellations. There is no denying that the rules of the validation game are shifting. Concepts from risk management (FMEA, hazard and operability analysis [HAZOP]), quality systems (Six Sigma, QbD), and software quality engineering (Software Development LifeCycle [SDLC], UAT) are impacting how we scope, plan, and execute validation projects. And yet the simple, unreflective adoption of these principles and concepts may run the risk of transforming them into empty signifiers as they become sublimated in a shallow call to arms and hybridized acronyms (14). This is a quasi-paradigm change which cannot be subsumed, absorbed, or defused under the banner of "streamlining validation," "Lean Validation," or "QbD validation." Such approaches miss the point and the mark by not understanding the history of the word validation.

VALIDATION DEFINED: A LINGUISTIC DECOMPOSITION
Despite a long and illustrious history (sterilization of parenterals circa 1970s), the concept of validation and its corresponding acronyms (the x-Qs) continue to be twisted, debated, and maligned. For this reason, and as a recourse, I would like to start with the canonical paragraph that has traditionally defined validation (from a regulatory perspective). The paragraph to which I am referring appeared in the 1987 FDA Guideline on General Principles of Process Validation:

"Validation is establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality characteristics." (15)

Much has been written on this passage and its extensibility. Although the guidance is explicitly directed at process validation (the process of manufacturing drug product/substance), one can apply the associative principle of language to this paragraph, where process is replaced with system. The quote, then, takes on universal applicability. I am no transformational grammarian or professional linguist, yet I would like to start by decomposing this complex paragraph into singular components in order to unpack its embedded meanings. I do this because I feel that few have analyzed its meanings, while many have spoken its words. One diagram might look like the following figure.

Thus, the paragraph can be reduced to its componential elements or phrases to read:
• Validation is establishing evidence (documented)
• Evidence provides assurance (high degree) (16)
• The process produces a product (consistently)
• The product meets specifications and characteristics (pre-determined)
• Validation is the process that demonstrates the above points.

From this analysis or reduction we can understand and compile the holy trinity of validation as the relationship auditable-reliable-expected, represented as follows:
• Establishing documented evidence (i.e., auditable)
• Providing a high degree of assurance of consistency (i.e., reliable)
• Having pre-determined specifications and quality attributes (i.e., expected).

The first component of this trilogy speaks directly to what validation is: documented evidence. Here the tone of the judicial (evidence) intersects with the judicious (documented). Validation is evidence of a documentary nature. It allows itself to be audited and permits a reconstruction of a process or system. The FDA adage "if it isn't written, it is simply rumor" is evident in this passage. But documentation in itself does not accomplish anything if it does not allow us to conclude two things. First, the documentation must lead to the conclusion that there exists a high degree of assurance that the process or system is reliable, consistent, and dependable throughout its operating range and under worst-case situations (an elaboration from the Guideline). Secondly, the object of this demonstration and documentation is to confirm or verify predetermined specifications (i.e., the design basis). The object of validation is the process: a process which produces a product. The product has predefined characteristics, which validation (as an act of establishing) documents. Validation is the act of demonstrating that what "ought to be" (specifications) in fact (i.e., as documented) "is" and will continue to be. It is not conclusive (according to FDA) to merely demonstrate "what is" from "what happens to be" (a tautology prevalent in naïve inductive generalization) (17). One must demonstrate that "what is" is in fact "what should have been" (pre-determined). One must know beforehand, through independent review of the design, what is expected to occur. Here is an area where the promise of QbD has an opportunity to serve the interests of validation.

Validation is a process; some have said a journey. Although we may speak of "the validated state" as if it possessed certain established characteristics, validation is no thing (no object), but the articulation, the nexus, or the conjuncture of the auditable, the reliable, and the expected: the synthesis of the three laws. Since validation takes its referent, its grounding, and its measure from the concept of quality, which itself is no thing, it cannot be circumscribed by its physical manifestations.

Figure: Validation diagram.

Validation is not measured by the binder, the page, or the kilo; although that has certainly been used as a strategy when the terms were misunderstood. When clarity is lacking, the best strategy is to obfuscate, thus raising the bar and upping the ante. Many validation packages are, in this sense, a bluff and a confidence game.

DEFINING THE VALIDATION SCOPE
The first question to be asked of any validation project (once we have understood the terms of validation) is what philosophers call the ontological question, and it takes the form of "What is?" This question is particularly important for a computer system/application implementation. Manufacturing equipment (e.g., a lyophilizer) might not provide the best illustration of the challenges in properly coming to terms with this ontological question. After all, the boundaries of a piece of equipment are usually demarcated by the utility connections at the skid (and by the skid itself as physical frame). Or again, the P&ID clearly defines the system boundaries, often conveniently on a single drawing. What the thing is can be effectively walked down, empirically validated. Not so for a computer system, where defining the system and its boundary can be an art form; if not properly executed, things can get very ugly. In fact, the problem of validation today is defining practical boundaries or scope for the effort at hand (planning). The era of the stand-alone processor, where the extension of hardware could easily define the limits of The System, and consequently the boundaries of validation, is but a fleeting memory. With enterprise applications, storage area networks (SAN), virtual servers, CITRIX, and inter-NETed applications, the proper definition of the term system becomes crucial. In fact, the elements of computerness (18) will vary with respect to where this line is drawn. Even if the line is a decision, defining the system-ness of the system (i.e., the qualities of being a system) is the first step in the act of characterization. Complexity, control, and perhaps even elements of criticality will vary in response to how we have circumscribed the system and the boundaries we have defined.

So, for example, does the extract transform load (ETL) integration between the enterprise resource planning (ERP) and manufacturing execution system (MES) become part of the ERP or the MES boundary? Should network and infrastructure components (e.g., switches, routers, clusters, SAN) and support software (e.g., CITRIX, Perl) be incorporated in an infrastructure qualification (i.e., leveraged by individual systems) or included in the boundary of the system proper? How dissimilar in design (i.e., commercial-off-the-shelf [COTS] vs. custom) and sourcing (e.g., software vendor, in-house development) can applications be before the monolithic (The Manufacturing System = DCS + MES + ERP) approach exceeds its coefficient of elasticity and becomes unmanageable?

All of these are scoping questions that have a lasting impact on the maintainability of the validation throughout the lifecycle of the application.

Certainly, in this day of hyper-integration, a single system (e.g., a laboratory information management system [LIMS]) is only six degrees of separation from all other GMP applications. This fact does not, however, legitimate the lumper's desire to conflate disparate components into one hegemonic classification. It is not uncommon to see systems designed as four functional components (a COTS automation component, a custom integration application, reporting tools, and a data warehouse), validated in two parts (automation and information), and maintained (change management) as three (with, of course, overlap to other systems not originally identified in the project plan). In the absence of an integrated approach to the scoping of an application (between design, use, and maintenance), the SDLC deliverables (quality records) become dissociated and no longer traceable. Appropriately scoping a validation project requires input from at least three functions: the technical function to determine the parts (i.e., hardware and software) and their interactions (i.e., interfaces/integrations); the user community to define the intended use and operational environment (i.e., people and procedures); and the quality organization (i.e., validation, change control) to manage the paper trail (i.e., change control, revalidation, discrepancies) from baseline deployment. If system design (hardware and software) does not provide compelling reasons to establish boundaries around an application, then one should turn to intended use. Boundaries established around intended use make the exercise of validation more effective and defensible (in so far as validation is often equated with the demonstration of intended use) (19). Yet in the era of enterprise applications, which cut across multiple functions, business processes, and predicates, it is often difficult to demarcate clean boundaries based on intended use. This is where one turns to the quality systems. Change control (and discrepancy management) should also influence the choice of a boundary. Once baselined and deployed to production, a validated application will require perfective (upgrades) and corrective (bug fixes) maintenance. All of the associated records (e.g., design changes, code, testing, change control, corrective action and preventive action [CAPA]) will need to be maintained and immutably linked to the defined system.

SIX Cs VS. SIX SIGMA
Scoping is the first step in defining what a system is by establishing the boundaries of a system. But scoping does not necessarily provide an approach to validation. It delimits the territory but does not describe it. The next step is what could be called the characterization of the system. The purpose of characterization is to assist in the development of a detailed description of a computerized system (its intended use, integrations, and dependencies) in order to establish the basis for a risk-informed approach to validation. Its intent is four-fold: to a) locate a computerized system within its regulatory environment (applicable regulations), b) describe its intended use (business process), c) outline its architectural design, and d) map the system data flow (integrations and dependencies). This information can be gathered and analyzed for the purpose of documenting the technical, business, and regulatory risks associated with an electronic records management or process automation system.

Characterizing a computer system (for the purpose of validation) can be accomplished through the elaboration of three domains: intended use and regulatory context, system design, and context of operation. These three domains capture what is fundamental about a computer system, what I have come to call computerness. The following briefly describes what is involved in the act of characterization:

Intended use and regulatory context. In this domain, information regarding the business process, predicate rules, and data criticality is described. The purpose of this domain is to clearly document how the system is used to support a business function and/or regulated activity and to determine the extent to which controls over e-records/e-signatures need to be implemented. The use of a computer system has multiple dimensions that can impact its successful implementation in a regulated environment. The first dimension, which we could call GXP impact, can be broadly assessed as the degree to which the application/system influences or affects the quality, safety, purity, or effectiveness of a product; or, by extension, how that application affects or impacts the statements/claims to safety, quality, purity, and effectiveness. These claims can be found on product labels, certificates of analysis (CofAs), and safety reports. Determining the impact of a system on product/information quality defines the Criticality of use.

The second dimension of use, the predicated use, defines a system in relationship to the records identified in the Code of Federal Regulations (CFRs) and company standard operating procedures (SOPs). How a system is used to create, modify, store, or transmit such records needs to be defined. In addition, in complex systems, the predicated use also outlines the functional boundaries of a system, which may have corresponding organizational structures (roles to features). Defining the predicated records and business processes that are satisfied or controlled by a system provides the Context of use.

System design. The purpose of this domain is to characterize the risks to application data derived from the technical design of the system. This entails a review of system/architecture diagrams, integrations, and dependencies, as well as the flow of data (input/output [I/O]). The design of a system is a significant contributor to the ability of an application to satisfy its intended use.

Risks associated with design may be related to performance, user interface, or platform stability/compatibility. The first dimension of system design is Complexity. System complexity comes in many forms, including technical and organizational. But complexity is not simply a function of the number of I/O points or branches in an algorithm. System dependencies and integrations (with their corresponding information flows) contribute to complexity risks. Functions, roles, menus, features, and screens (and their interdependence) also contribute to the complexity factor. The second dimension of design that requires characterization involves Control over the records a system manages. The control element of system design covers both the logical and physical security risks associated with storing and transmitting (network, internet) electronic records.

Context of operation. The context of operation includes a description of the context or environment of operation. In this domain, issues regarding system/data security and confidentiality are addressed. Business continuity and system support requirements are defined. The purpose of this domain is to define the procedural controls necessary to operate the system in compliance with its business criticality, regulatory impact, and technical complexity. The accuracy, integrity, attributability, and security of system data are not simply a function of its use and design. Systems are interactive and dynamic; they undergo use, abuse, and change. How these Conditions of operation are designed and implemented will directly affect the system's ability to perform its intended use. Operations such as data backup and recovery, procedural controls over use, change control, and the management of system issues are all contributing factors to the environment of a system. System characterization must address the conditions of operation as a contributing risk element. Finally, the sixth element of computer characterization that can affect risk of use includes the Confidence factors. Confidence (in the human and statistical sense) in a computer system can be derived from a variety of sources, including the maturity of the product, information on the vendor (through audits, for example), and the product itself (i.e., the level of documentation available). All of these are mitigating factors in the determination of risk. Validation effort will be (inversely) proportional to the level of confidence in the system. Although confidence can be subjective (and often misguided), it is important to document these factors in the overall definition of risks associated with the use of a computer system.

Once the six Cs (criticality, context, complexity, control, conditions, confidence) have been documented, the application of a risk methodology can be achieved (20). I don't claim this approach to be uniquely novel, or an untimely meditation. In fact, the documentation of intended use or criticality has been a central activity of most validation planning since time immemorial. More often than not, however (from my experience), this thought process, which is central to defining a validation strategy, rarely finds its way to paper. The rationale for testing is either undocumented (project decisions long since forgotten) or based on some shaky foundation (i.e., tautological). To evaluate risk before a system has been adequately scoped and characterized is tantamount to placing the cart (of risk) before the horse (of system). And yet this is not an uncommon scene, where we encounter fully-developed risk assessments without a clear definition of the system scope or characteristics.
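To make the idea of a risk-informed characterization concrete, the following is a minimal sketch (in Python) of one way the six Cs might be captured as a structured record before any risk methodology is applied. It is an illustration only, not a prescribed format: the class names, the three-point ordinal scale, and the example element names are assumptions introduced here for the sketch.

    from dataclasses import dataclass, field
    from enum import IntEnum


    class Rating(IntEnum):
        """Ordinal rating for one element of a characterization domain (assumed scale)."""
        LOW = 1
        MEDIUM = 2
        HIGH = 3


    @dataclass
    class SystemCharacterization:
        """Illustrative record of the six Cs for a single computerized system."""
        system_name: str
        criticality: dict[str, Rating] = field(default_factory=dict)  # impact on product/information quality
        context: dict[str, Rating] = field(default_factory=dict)      # predicate records, business processes
        complexity: dict[str, Rating] = field(default_factory=dict)   # integrations, roles, features
        control: dict[str, Rating] = field(default_factory=dict)      # logical/physical security of records
        conditions: dict[str, Rating] = field(default_factory=dict)   # backup, change control, issue management
        confidence: dict[str, Rating] = field(default_factory=dict)   # vendor/product maturity (mitigating)

        def domain_summary(self) -> dict[str, Rating]:
            """Summarize each domain by the highest-rated element it contains."""
            domains = {
                "criticality": self.criticality,
                "context": self.context,
                "complexity": self.complexity,
                "control": self.control,
                "conditions": self.conditions,
                "confidence": self.confidence,
            }
            return {name: max(elements.values(), default=Rating.LOW)
                    for name, elements in domains.items()}


    # Example: characterize first, then hand the summary to whatever risk methodology follows.
    mes = SystemCharacterization(
        system_name="MES (packaging line)",
        criticality={"batch record content": Rating.HIGH},
        complexity={"ERP/MES ETL integration": Rating.MEDIUM, "roles and features": Rating.HIGH},
        control={"e-record access controls": Rating.MEDIUM},
        conditions={"change control maturity": Rating.MEDIUM},
        confidence={"vendor audit outcome": Rating.LOW},
    )
    print(mes.domain_summary())

The point of the sketch mirrors the text: the record is built before, and independently of, any scoring, so that a subsequent risk assessment works from a defined and characterized system rather than from a name in a plan.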

RISK MAY CAUSE FAILURE, BUT SUCCESS CANNOT COME WITHOUT IT (21)
"Risk management is a complex subject because each stakeholder places a different value on the probability of harm occurring and on the detriment that might be suffered on exposure to a hazard." (22)

Operating under the influence of the three constellations defined above (risk, quality, and lifecycle) provides some interesting challenges. If the call to a risk-based approach is to be anything more than an empty signifier in a marching order, we must better understand how and where we can apply this approach. In my capacity as an auditor, I have reviewed many sophisticated (both from a process and a mathematical perspective) risk assessment methodologies that come to the trivial and uninformative conclusion that, for example, an MES is a critical, high-risk, GMP system, indeed. The ultimate irony of this exercise is that it simply leads, more often than not, to a classification without consequence. The system gets assigned a category (usually a 1), a check box is filled, a paragraph is entered in a validation plan, and voilà, instant risk-based validation.

The purpose of characterization, however, is not simply to catalogue a system within a pantheon of applications (the naturalist impulse). Characterization must provide insight and justification for the control strategies (technical, management, operational) selected to ensure that system records are secure, accurate, have integrity, and are attributable. System characterization informs the validation strategy and the risk assessment. As such, the characterization documentation must be risk informed. A risk-based approach to system characterization must identify the particular potential vulnerability of the system under investigation. For each of the six domains defined previously (criticality, context, complexity, control, conditions, and confidence), the analysis will identify those elements that affect the risks posed by the system to product quality, data accuracy, security, etc. Risk informed means that each element (e.g., the number of functions and/or users) in the characterization of a domain (e.g., complexity) can be assessed regarding its relative risk factor (and consequently assigned a value on an ordinal scale or a rank on a relative scale).

Typical risk strategies begin with the application of functional requirements (a set of functions and features) to plot the probability, consequence, and detectability factors associated with a function/requirement. This approach provides a clear trace from requirements, through risk assessment, to testing strategy. The problem, however, is that it focuses predominantly on intended use (criticality), to the exclusion of other risk factors (such as application design, data flow, and conditions of operation) identified previously as key risk contributors. The general problem with these approaches is that they begin at the end, as if to reverse engineer a desired outcome or an apodictic truth (the self-evident). The process, however, must begin at the beginning. It begins with scoping, proceeds through characterization, and concludes in a strategy that is risk informed. I am not advocating here a particular methodology, whether Hazard Analysis and Critical Control Point (HACCP), HAZOP, FMEA, or Fault Tree Analysis (FTA), for documenting hazards, faults, or effects. In fact, I would warn against the fetishism of method. Too often the method overshadows the process and takes on a life of its own. Whether an FTA is preferable to an FMEA is less important than knowing what the object of investigation is (recall the ontological question). The method is only as good as the staging or prework.

The final product can take many forms. My preferred approach is a document (stand-alone or part of a validation plan) outlining the individual risk factors for each domain as a narrative description, with the corresponding risk mitigation strategy (controls, test strategy, etc.) that will be implemented. I am not a proponent of quantification, and am more easily swayed by clear exposition or rationale. My personal bias, however, should not sway others from embarking on a model that would rank system risks along the domains defined previously. This relative ranking could trigger (pre-defined) strategies such as "do nothing because risk is acceptable," "implement procedural controls," "monitor and report," or "demonstrate mitigation of risk through a formal test protocol." One word of caution, however, is that the ranking should be tied to a strategy of control, otherwise it is without consequence. A second word of caution is that, against the scientific precept (metaphysics), data do not speak for themselves. There is always an interpretive overlay that makes sense of the facts (as we call them). If we are not to be seduced by our own ventriloquism, we must take care not to misinterpret the risk score (quantifier) as the solution to the problem, as the end state in the analysis; as if somehow the number (e.g., 42) could be a response to a question that was never (and needn't be) asked. And that question is, of course, What is Validation?
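As a companion to the characterization sketch above, the short fragment below illustrates that final caution in code form: a relative ranking is only meaningful if it resolves to one of the pre-defined strategies. The strategy names and thresholds here are hypothetical stand-ins for whatever a validation plan actually pre-defines (a plan might equally insert a monitor-and-report tier between the two upper levels).

    from enum import IntEnum


    class Strategy(IntEnum):
        """Pre-defined responses a domain ranking could trigger (illustrative only)."""
        ACCEPT_RISK = 1           # do nothing because risk is acceptable
        PROCEDURAL_CONTROLS = 2   # implement procedural controls
        MONITOR_AND_REPORT = 3    # monitor and report
        FORMAL_TEST_PROTOCOL = 4  # demonstrate mitigation through a formal test protocol


    def strategy_for(domain_rank: int) -> Strategy:
        """Map a relative domain rank (1 = low, 3 = high) to a control strategy.

        The thresholds are assumptions for the example; the only point is that
        every rank resolves to a documented action, never to a bare score.
        """
        if domain_rank <= 1:
            return Strategy.ACCEPT_RISK
        if domain_rank == 2:
            return Strategy.PROCEDURAL_CONTROLS
        return Strategy.FORMAL_TEST_PROTOCOL


    # Example: feed summarized six-C domain ranks (see the earlier sketch) through the mapping.
    domain_ranks = {"criticality": 3, "complexity": 3, "control": 2,
                    "conditions": 2, "context": 1, "confidence": 1}
    plan = {domain: strategy_for(rank).name for domain, rank in domain_ranks.items()}
    print(plan)  # e.g., {'criticality': 'FORMAL_TEST_PROTOCOL', 'control': 'PROCEDURAL_CONTROLS', ...}

Whatever scale or method is used, the output worth keeping is the strategy, not the score; the number by itself answers a question that was never asked.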

CONCLUSION
In this article, I have tried to return to the primal question "What is?", using the computer system as the object in question. I have taken up this topic because validation in the new-clear age finds itself at the confluence of three great forces: the revival of quality systems discourse, a probabilistic approach structured around an evaluation of risk, and an integrated perspective under the framework of the lifecycle. This constellation, one could argue, constitutes a new paradigm for the new-clear age of validation (although its history, comedy or tragedy, has yet to be written).

Without a clear articulation of the fundamentals (e.g., definition, scoping, and characterization), the act of validation runs the risk of being lost in sound bytes such as "streamlining validation," "Lean Validation," or "risk-based validation": expressions without consequence. Without a clear understanding of the basic tenets and first principles of validation, we will never reach the heights that these narratives promise. If we are truly to benefit from the great forces that today shape our universe, we must not forget our origins, even if they are only myths.

ENDNOTES
1. FDA, Pharmaceutical cGMPs for the 21st Century: A Risk-Based Approach, Final Report, Fall 2004, September 2004.
2. FDA, 21 CFR 11, Electronic Records; Electronic Signatures, Final Rule (20 March 1997).
3. Although I am no longer a card-carrying member of the profession, having retired my thermocouples some years ago, I would like to retain the form of the "we" throughout this paper.
4. By recording dates I run the risk of getting embroiled in (false) historiographic debates over origins and first encounters. I am only interested here in the confluence of forces that are driving present terms and future directions. I am not interested, here and today, in cataloguing the first sighting of a risk-informed validation that may have occurred in 1984.
5. FDA, General Principles of Software Validation; Final Guidance for Industry and FDA Staff, 2002.
6. My argument here, and the claims that follow, is not that FMEA or Hazard and Operability Analysis (HAZOP) cannot be successfully leveraged to better understand system risks and points of failure (vulnerability), nor that a QbD approach will not help us better focus our validation efforts, but rather that a failure to attend to the first principles of validation (outlined below) will ineluctably lead to the diminution (or dilution) of their impact.
7. An essay that best encapsulates this misunderstanding is (ironically) the 2005 ISPE white paper on Risk-Based Qualification for the 21st Century.
8. Many equipment vendors (and software vendors) today provide their own set of test scripts and validation protocols, which for a small fee can be executed by the company for instant validation (gratification). Pre-packaged protocols/scripts sold with equipment or software can be useful as a smoke test to confirm that the installation was successful, but rarely do they provide an adequate basis for validation. The reason is simple: these packages cannot provide adequate challenges without running the risk of significant deviations. This is especially true for applications that are highly configurable, with multiple, complex, and divergent final states. The adaptive response is to provide a vanilla package, which has already been pre-tested at the factory to guarantee success.
9. ICH, ICH Q9: Quality Risk Management, November 2003.
10. ICH, ICH Q10: Pharmaceutical Quality System, September 2005.
11. ISPE, GAMP Guide for Validation of Automated Systems, Volume 4, 2001. ISPE, GAMP Volume 5: A Risk-Based Approach to Compliant GxP Computerized Systems, 2008.
12. Janus, the Roman god of gates and doorways.
13. One could easily discount my examples of validation tropes as mere fantasy or exaggeration. Unfortunately, on more than one occasion, I have reviewed validation packages that can only be described in these terms.
14. The term "empty signifier" is used here to represent expressions (or acronyms) that are no longer grounded in the history and traditions of a discipline, but circulate freely as banners and calls to arms. Because they are not grounded (weighed down) with the gravity of practice, they can be exchanged without consequence (FAT = OQ, UAT = PQ, Design = Test). The terms are interchangeable, not as a function of an economy of signs (a formal exchange value), but as a function of their propinquity or strange attraction.
15. FDA, Guideline on General Principles of Process Validation, May 1987.
16. There is some debate as to whether the relative pronoun "that" was intended here, in which case the auxiliary clause "provides a high degree of assurance" is intended to be restrictive of the evidence provided. Not all documented evidence counts as validation, only that which provides an assurance. I have modeled the sentence structure accordingly. It is the evidence that provides an assurance. However, one could argue that by extension, applying the transitive property of equality, validation does also provide such assurance, in which case the choice of "that" or "which" is irrelevant. As a point of curiosity, it is not uncommon for authors to misquote this passage in the literature on validation.
17. There are two well-known philosophical traps associated with the question "What is?" The first is known as the normative fallacy, which involves confusing "What ought to be" (principles, ideas, theories, Platonic Forms) with "What is." This is the fallacy of the rationalist or idealist who takes first principles (theory, what ought to be) as the truth of the real (what I encounter). The second is known as the naturalist fallacy, which takes "What is" for "What ought to be." This is the fallacy of the empiricist who takes individual occurrences and events as general categories. Validation is not fundamentally an ontological exercise; it is exegetical. And yet it cannot escape this first act (and its trappings).
18. Reference to Talking Heads - True Lives.
19. FDA, Guidance for Industry: Part 11, Electronic Records; Electronic Signatures, Scope and Application, 2003. In the Part 11 Scope and Application document (2003), FDA has provided another compelling reason to link use (regulated activity, predicate requirements) with validation, by suggesting that validation might be optional (or reduced) if the business process (and the corresponding record risks) can be shown to be minimal.
20. I am not married to the 6 Cs; it could just as well be 4 Rs or 3 Ps. The point of this exegesis is that one cannot validate what one has not defined. And by extension, one cannot validate well what has not been well characterized. In my career, I have audited many validation protocols that failed to describe the system in a manner that would enlighten that validation effort. Defining a LIMS as a "Laboratory Information System" is a truism; at best uninformative, at worst a platitude.
21. Actual fortune cookie wisdom.
22. ISO 14971, Medical Devices: Application of Risk Management to Medical Devices, 2000.

ARTICLE ACRONYM LISTING
CAPA Corrective Action and Preventive Action
CCP Critical Control Points
Cd Coefficient of Drag
CFRs Code of Federal Regulations
CMM Capability Maturity Model
CofAs Certificates of Analysis
COTS Commercial-off-the-Shelf
DQ Design Qualification
ERES Electronic Records Electronic Signature
ERP Enterprise Resource Planning
ETL Extract Transform Load
FAT Factory Acceptance Testing
FDA US Food and Drug Administration
FMEA Failure Mode and Effects Analysis
FTA Fault Tree Analysis
GAMP Good Automated Manufacturing Practice
GEP Good Engineering Practices
GMPs Good Manufacturing Practices
HACCP Hazard Analysis and Critical Control Point
HAZOP Hazard and Operability Analysis
IEEE Institute of Electrical and Electronics Engineers
I/O Input/Output
IQ Installation Qualification
Lean CMM Lean Capability Maturity Model
LIMS Laboratory Information Management System
MES Manufacturing Execution System
OQ Operation Qualification
PEP Poor Engineering Practices
P&ID Process and Instrument Diagram
PQ Performance Qualification
QbD Quality by Design
SAN Storage Area Networks
SAT Site Acceptance Testing
SDLC Software Development LifeCycle
SEI Software Engineering Institute
SOPs Standard Operating Procedures
SSPs Sound Scientific Principles
TOP Turn Over Package
TQM Total Quality Management
UAT User Acceptance Testing
V&V Validation & Verification

Originally published in the Summer 2008 issue of Journal of Validation Technology

