
ELM 113 Summary

Introduction to Software Validation

Planning: Determine what needs to be validated, establish a framework, and create a validation
plan
Determine what needs to be validated:
o List all software and identify what needs to be validated
Based on regulatory requirements
When there is a risk or when it can impact quality
Validation effort should be directly proportional to risk
Risk Assessment
o Failure Modes and Effects Analysis (FMEA): a study of how a system could fail,
through analysis of the effects of each failure mode.
o Hazard Analysis: analysis of how systems can contribute to risk.
o Fault Tree Analysis (FTA): starts from a fault or failure and works backward to
identify what happened to cause it.
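Since validation effort should be proportional to risk, FMEA results are often used to rank failure modes. A minimal sketch, assuming the common Risk Priority Number convention (RPN = severity × occurrence × detection, each rated 1-10); the failure modes and ratings below are hypothetical:

```python
# Hypothetical FMEA sketch: rank failure modes by Risk Priority Number.
# RPN = severity * occurrence * detection (each rated 1-10).
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Batch record not saved", 9, 3, 4),
    ("Audit trail entry missing", 8, 2, 6),
    ("Report rounding error", 4, 5, 3),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher means validate/mitigate first."""
    return severity * occurrence * detection

# Highest-risk failure modes first.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for desc, s, o, d in ranked:
    print(f"{desc}: RPN={rpn(s, o, d)}")
```

The ranking gives a defensible, documented basis for focusing validation effort on the highest-risk items.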
Validation Master Plan (VMP)
o How validation is performed, how issues are managed, how to assess validation impact
for changes, and how to maintain systems in a validated state.
o Also includes
Scope of validation
What is validated
Elements to be validated and how to maintain the list
Interrelationships between systems
Validation responsibilities; periodic reviews
Re-validation approaches
Rationale for not validating certain aspects of the system
Approach to validating systems in production that have not been validated
Approach to validating a system that is commissioned and live but not formally
validated
General timelines, milestones, deliverables, and roles and responsibilities
Training and minimum level of training/qualification
Minimum qualifications necessary to support the validation
How errors are handled
Review boards
Change control
How deviations are handled
How variances are handled
Typographical errors
o Does not need to be too detailed. A general statement can be
made that typographical errors are found and corrected
Procedural errors
o In a Test Step
The tester should make the changes using mark-ups and
annotations with an explanation, and the reviewer should
review the changes and record an annotation.
Should be individually summarized
o In a Test Result
Changes in expected results should be pre-approved
before execution
By a reviewer; or
A QA representative
Suspension of Testing
o Situations in which testing will be suspended should be included in
the VMP
Example: When a fatal flaw is exposed that could affect
the integrity or validity of results
Suspend the testing, fix the problem, then
resume testing
In some cases, resuming may not be
appropriate; restart testing
Validation Plan
o Like the VMP, but specifies the required resources and timeline in more detail
o Lists all activities involved in validation; includes a detailed schedule
o The scope has the following considerations:
Servers
Clients
Stand-alone applications
o Validation on every system depends on the risk the software poses to the organization
or to people
o All servers must be validated
o Must identify the required equipment and if calibration of that equipment is necessary
o Must specify if specialized training or testers with specialized skill is needed
o If the software is to be installed on multiple machines, the Validation Plan should
indicate the approach to take.
o For customized or highly configurable software, an audit of the vendor may be included.
To show that the vendor has appropriate quality systems in place
To show that the vendor manages and tracks problems
To show that the vendor has a software development lifecycle
Software Development Life Cycle (SDLC)
o Divided into distinct phases
o Should be in accordance with a proven system development lifecycle methodology
o It is important that:
There is objective evidence that a process was followed
There are defined outputs that ensure effort is controlled
o At a minimum, a set of requirements must exist in order to know what to verify,
and a support system must be established to ensure the system can be maintained
V Development Model:
o [Diagram: the V model pairs each development phase (requirements, design,
implementation) with a corresponding verification phase (unit, integration,
and acceptance testing)]
Specifying requirements
o You can only verify what you specify
Vague specification = Vague Test
o Keep all requirements specifications up to date
But if requirements are continually changing, you run the risk of delays and
synchronization issues
o Baseline your requirements specifications and update them as needed
o Keep a well-structured specification documentation
o Use shall to specify requirements.
There should be only one shall per requirement.
If there is more than one shall, consider breaking up the requirement.
o Do not write requirements as statements of fact
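The one-shall-per-requirement rule can be checked automatically. A minimal sketch, assuming a simple case-insensitive word match; the requirement texts below are invented for illustration:

```python
import re

def count_shalls(requirement: str) -> int:
    """Count occurrences of the word 'shall' (case-insensitive)."""
    return len(re.findall(r"\bshall\b", requirement, flags=re.IGNORECASE))

# Hypothetical requirement statements.
requirements = [
    "REQ-001: The system shall log every user login.",
    "REQ-002: The system shall encrypt data and shall archive it nightly.",
]

for req in requirements:
    n = count_shalls(req)
    if n == 0:
        print(f"No 'shall' - may be a statement of fact: {req}")
    elif n > 1:
        print(f"{n} shalls - consider splitting: {req}")
```

A requirement flagged with multiple shalls (like REQ-002 above) is a candidate for being split into separately verifiable requirements.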
Requirements Traceability
o Show all requirements have been fulfilled via verification testing
Trace requirements to the test
Small Systems: Spreadsheets
Large Systems: Trace Management Tool
Generates Trace reports quickly
Culminates in a Trace Matrix or Traceability Matrix
Maps each requirement to the test cases that verified it
Becomes part of the validation evidence
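For a small system, the spreadsheet-style trace matrix amounts to a simple mapping from requirements to covering test cases. A minimal sketch, with hypothetical requirement and test-case IDs, that flags requirements no test traces to:

```python
# Hypothetical trace data: which test cases verify which requirements.
test_coverage = {
    "TC-01": ["REQ-001", "REQ-002"],
    "TC-02": ["REQ-002"],
}
requirements = ["REQ-001", "REQ-002", "REQ-003"]

# Invert into a trace matrix: requirement -> covering test cases.
trace_matrix = {req: [] for req in requirements}
for tc, reqs in test_coverage.items():
    for req in reqs:
        trace_matrix[req].append(tc)

# Any requirement with no covering test is a validation gap.
gaps = [req for req, tcs in trace_matrix.items() if not tcs]
print("Untraced requirements:", gaps)
```

The resulting matrix (and its list of gaps) is exactly the kind of trace report that becomes part of the validation evidence.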
Review Requirements
o Should include:
System Architects
To confirm the requirements are consistent with, and can be
supported by, the architecture
Developers
To ensure the requirements can be developed
Testers
To confirm the requirements can be verified
Quality Assurance
To ensure the requirements are complete
End users and system owners
