
6

Computer Validation: A Compliance Focus

Timothy Horgan and Timothy Carey


Wyeth BioPharma, Andover, Massachusetts, U.S.A.

1 INTRODUCTION
Today it seems that everything has automation hiding in it somewhere! The
focus of this chapter on computer validation compliance, however, is what
we will term the application-based computer-related system (CRS), as
opposed to computer-controlled process equipment. Examples of a CRS
may include laboratory information management systems (LIMS), supervisory
control and data acquisition (SCADA) systems, document control systems,
calibration data management systems, and any of the myriad other
systems that mate a portable software application to commercially available
computer hardware. This is different, for example, from a modern water-for-injection
still that employs computer software and hardware but is inextricably
linked to specialized mechanical process equipment. The validation
testing for such equipment is typically focused on the mechanical performance
of the equipment rather than on the performance of the software
itself. Many of the concepts discussed in this chapter (validation plans,
specifications, etc.), however, have analogous counterparts in equipment
validation, and the goal is the same: verification that a system consistently
performs its intended function throughout its usable life.

Copyright © 2004 Marcel Dekker, Inc.



2 GAMP GUIDE
The most prominent and widely recognized guide to CRS validation practices
is the Good Automated Manufacturing Practices (GAMP) guide [1].
While the concepts surrounding software testing and quality assurance have
been discussed for years (in fact, a seminal book in the field was published
in 1979) [2], the GAMP guide is a crucial reference because it is written from
the specific perspective of the regulated pharmaceutical industry. The
GAMP guide was first published in Europe in 1994, written by a small
consortium of European professionals in response to European regulatory
agency concerns. It did not take long for industry professionals worldwide
to appreciate this publication and recognize the industry's need for it.
The GAMP guide has since become an international collaboration among
pharmaceutical industry validation and compliance professionals, and has
gained acceptance worldwide as the pharmaceutical industry guideline on
the validation of software and automated systems. While of course one size
never fits all, the GAMP guide is an indispensable resource for discussion
of validation strategies. While this chapter will not extensively review
information already available in the GAMP guide, the following are some
highlights of what it contains:

Categorization of types of systems and guidelines on the extent of validation required for each
Supplier and vendor guidelines on software development expectations within the pharmaceutical industry
A helpful glossary
Numerous valuable appendices with concrete recommendations on system development, implementation phase management, and ongoing system operation

Copies of the GAMP guide can be obtained through industry professional
associations or directly from the GAMP Forum organization
(www.gamp.org).

3 SOFTWARE DEVELOPER AUDITS


The International Organization for Standardization (ISO) definition of audit is
"systematic, independent, and documented process for obtaining audit evidence
and evaluating it objectively to determine the extent to which agreed criteria
are fulfilled." The bottom-line goal of the software developer audit process
is to allow you to assess the developer's quality assurance (QA) system.


3.1 Why Audit?


The FDA regulations clearly require evaluation of suppliers, as evidenced by
the following current regulations:
From the device regulations, 21 Code of Federal Regulations (CFR)
820.50, Quality System Regulation, Subpart E, Purchasing Controls:
"Each manufacturer shall establish and maintain procedures
to ensure that all purchased or otherwise received product and services
conform to specified requirements. (a) Evaluation of suppliers,
contractors, and consultants. Each manufacturer shall establish and
maintain the requirements, including quality requirements, that
must be met by suppliers, contractors, and consultants."
21 CFR Part 11, Sec. 11.10(i): "Determination that persons who develop,
maintain, or use electronic record/electronic signature systems have
the education, training, and experience to perform their assigned
tasks."

In business terms, auditing of software developers will allow you to
assess the vendor's technical competence, vendor reaction to your
company's user requirements specification (URS), vendor QA system adequacy,
supplier experience with GXP systems, and the quality level of vendor-prepared
validation and qualification protocols. In short, vendor auditing is a regulatory
expectation, and auditing provides a means of assessing the supplier's
ability to deliver a validatable system that will achieve the requirements of
your company's URS.

3.2 The Preaudit


Assess the need for an audit. What is the criticality of the software product in
business terms? Evaluate the risk to the pharmaceutical product, production
process, and quality data associated with the software.
Prior to a site visit, assess the vendor remotely through product literature
and submission of a preaudit questionnaire. When a vendor site audit
becomes necessary, plan the audit's scope and focus and identify the audit
team. Audit team members should include user group lead, information
services (IT/IS/MIS), compliance, purchasing, and validation personnel.
Schedule the audit with the vendor to ensure that key development,
quality, and management personnel will be onsite and available to the audit
team. Verify how much on-site time the vendor will allow. Make sure that the
goals of your audit can be completed in the allotted time. Schedule a closing
meeting with key developer personnel for the end of the audit. Your company
should develop a vendor audit checklist. The checklist should be your plan
for proceeding through a thorough audit.
Elements should include
General company information
Standard operating procedures (SOPs)
Customer support
Quality management/standards
Software/system development methodology
Testing methods/verification and validation
Technical personnel
Change control, configuration, and distribution management
Security features
Documentation
21 CFR Part 11 compliance assessment
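A checklist like the one above can also be managed as simple structured data, so that observations and required vendor follow-ups are tracked per topic. The sketch below is purely illustrative; the topic names come from the list above, while the Finding type and its fields are our own assumptions, not part of any standard audit tool.

```python
from dataclasses import dataclass, field

# Topics taken from the audit checklist above.
TOPICS = [
    "General company information",
    "Standard operating procedures (SOPs)",
    "Customer support",
    "Quality management/standards",
    "Software/system development methodology",
    "Testing methods/verification and validation",
    "Technical personnel",
    "Change control, configuration, and distribution management",
    "Security features",
    "Documentation",
    "21 CFR Part 11 compliance assessment",
]

@dataclass
class Finding:
    topic: str            # which checklist topic the observation falls under
    observation: str      # what the audit team saw
    response_needed: bool = True  # vendor must supply a corrective action plan

@dataclass
class AuditChecklist:
    findings: list = field(default_factory=list)

    def record(self, topic, observation, response_needed=True):
        # Guard against observations filed under topics the plan never defined.
        assert topic in TOPICS, f"unknown checklist topic: {topic}"
        self.findings.append(Finding(topic, observation, response_needed))

    def open_items(self):
        # Findings still awaiting a vendor corrective action plan.
        return [f for f in self.findings if f.response_needed]
```

Keeping the topic list fixed up front mirrors the advice above: the checklist is the plan for the audit, so observations that do not fit a planned topic deserve scrutiny rather than silent filing.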

3.3 The Audit


Don't allow the opening meeting to turn into an extended sales show. Inspection
and interview should constitute the bulk of your work. Ask open-ended
questions; don't set up for simple yes or no answers. Open-ended
responses will often lead you to unforeseen concerns.
Where documentation of a process is found to be substandard, describe
to the vendor how to comply with your standards. This is a free GXP consulting
service for the developer, which is usually eager to receive some feedback.
Plan an interim meeting with your team to check focus, issues, and progress.
Remember that you, the pharmaceutical manufacturer, bear the ultimate
responsibility for regulatory compliance, including the compliance
level of the software you implement.

3.4 Postaudit Activities


Produce an audit report. Typical report sections will include
Purpose: State the company and division that was audited, when,
where, and by whom. List audit team members by name, title, and
department. List key representatives of the developer company.
Overview: Describe focus issues. Summarize findings. Refer to the audit
checklist for more detailed notes. Note that audit observations
require a response/action plan from the software vendor.
Review summary.
History of company and product: Note the level of development staffing.
Software development (SW Dev) and quality assurance manual: Describe
its purpose. For example, is it intended to facilitate validation by
including protocols? What programming standards and development
life-cycle models are cited? The software quality assurance
(SQA) program manual should be version/revision controlled and
address an overview of the vendor's quality program, the vendor's
programming standards, version control (SDLC), and maintenance
procedures. It should also describe development documentation,
such as the data sheet used by developers when executing tests, system
requirements that detail all features and functionality of the
software, and final system specifications.
SW Dev/SW unit testing: Note the version, operational software (VB 6.0,
MS Access 8.0, Crystal Reports 7.0), and operating environment/platform
(NT or Linux server). Plans to version the product: how soon and
in what manner? What is the future development direction of this product?
Is an SQL Server version in development? Is an Oracle-based
product coming? How closely linked are the content of the functional
specs and the unit testing? What steps are defined in their
development cycle (design, develop, test, implement)? Is the documentation
consistent with their SOPs and purported development
models? Describe the testing review policy and procedures. Who is
responsible for what? What does review failure or success mean?
What happens in each case? Is there evidence of review failures?
Ensure that the review is meaningful. Describe the testing approach
and routine. Is documentation signed? Are comments dated and initialed?
Are deviations and failures pursued to conclusion per SOP?
How is the product protected by an adequate backup, recovery, and
disaster recovery plan? Where and how is the product secured?
Change control: What policies are in place? Are they adequate? Are they
observed? Does the vendor rely on the development tools to control
revisions and new functionality changes?
Employee training: Are the SOPs clearly written? How are they
reviewed and approved? Are they maintained in the work area or
available to employees? Are they numbered and versioned? Is training
documented and assessed? How are training records stored and
filed? Have they been audited? Is there evidence of GXP training?
There should be resumes on file at a minimum.
Customer support: How many persons are available per week? How many
calls are handled per day or per week? What is the average number of
minutes per call? Is there a formal problem/bug log? How is it followed
up? Is a previous product version supported? What is the cost
and coverage of the support service package? Are other corporate
services provided, such as database conversion or migration? Is a
statistical method or standard used to assess successful conversion
or migration? [See ANSI/ASQC Z1.4-1993, "Sampling by Attributes";
military standard 105E has been canceled.]
Security: Is there a server backup policy or routine? Is the backup log
being used accurate and reviewed? Is there offsite storage? Are there
source code security and storage, source code escrow arrangements,
and facility access control? Is there physical and logical control of
the computing environment? How is the development server networked?
Is it an open or closed system? Is it firewalled? What is
the password policy and control? Has the backup and recovery or
disaster recovery plan ever been tested? Was it documented?
21 CFR Part 11: Do key personnel understand the rule clearly? Run
through a detailed list of the requirements of Part 11 and attempt to
determine
What Part 11 requirements does the vendor concede that the product
does not meet?
In your own judgment, what Part 11 requirements does the software fail
to meet? Beyond the specific requirements of Part 11 is your business
and operational context. How configurable is the record review and
approval signature functionality? Assess the risk of the noncompliance
level. Risk is best assessed in terms of risk to the patient, proximity
to the drug product production process, quality data, and the
dispositioning process. The risk of implementing a less than fully compliant
system is also relative to the risk of continuing to use the even
less compliant system being replaced.

3.5 Possible Audit Observations


Communicate and document your findings to the vendor. Findings such as
the following may be identified:
Internal quality activities, including personnel training files, need to be
up to date and documented on a regular basis.
Internal audits should be performed and documented to ensure SOPs
are observed.
Software is not fully 21 CFR Part 11-compliant. Document your plan
for bringing the product into compliance.
Documentation for software testing (release to production testing)
does not clearly indicate the version being tested.
There is no formal revision change control tracking method.
Formally communicate your summary report to the vendor. Ensure
that a corrective action plan from the vendor will be provided. Update the
report in accordance with the vendor response. Use this audit summary
report as an integral part of the validation plan, test protocols, and Part 11
remediation and assessment plan.

4 THE VALIDATION MASTER PLAN


The validation master plan (VMP) is the roadmap and gatekeeper of the CRS
validation process. It needs to answer several questions:
What specifically will be validated?
How are we going to validate?
How will we know when the system is ready?
It should identify what validation protocols are required and everything
else that is needed before the system can be considered validated and
ready for use.
While a VMP is typically drafted under the auspices of the validation/
quality group, the end users and engineering groups involved with project
implementation should be involved in review and approval of this document.
Everyone needs to understand and agree to the objectives that must be satisfied
before the CRS is put into GMP-related use. The following are points
and topics to consider when drafting a VMP.

4.1 Scope of Computer-Related System


The scope of the CRS validation must be defined. For a stand-alone application
on a stand-alone computer system, this may be straightforward. If there
are any interfaces with other systems, however, the scope becomes more
challenging. A clear definition of the VMP scope will help prevent misunderstandings
and "scope creep" that can cause significant schedule delays.
Consider such questions as
Will this CRS include a data backup system that needs to be validated
or will it utilize an existing validated backup system?
Is this networked CRS tying into an existing network infrastructure or
does it include its own network?
Does this CRS provide data to an existing system, and will the existing
system require any revalidation?
What group or groups does this CRS serve and who is the ultimate
"owner" of the system?
As with most project plans, it is important for the VMP to include a
definition of personnel roles and responsibilities. Numerous documents may
need to come together to validate a system (e.g., specifications, SOPs, protocols),
and team members must agree upon who will be responsible for specific
deliverables, as well as approve the final documentation.


4.2 Documentation Needs


All documentation that must be produced to support the validation process
should be listed in the VMP. At the time of drafting a VMP it may not be
known, for example, exactly how many SOPs will be written for the system,
but the types and categories of SOPs needed should be delineated. Early on,
the need for operation, administration, and maintenance SOPs should be
apparent. Additionally, be sure to consider the following as they apply to
your project:
Specifications
SOPs
Vendor documentation
Vendor audit reports
Engineering peer reviews
Module testing
Protocols
Protocol reports
In addition to listing the required testing documentation, describe the
philosophy behind the testing documents. Is the CRS a simple stand-alone
system requiring only a single installation qualification (IQ) and operational
qualification (OQ)? If so, briefly state the objective of these documents.
Is the CRS a complex, multifunctional system requiring several layers of
testing, from discrete software unit/module tests up to a fully integrated
system performance qualification (PQ)? If so, describe these layers of testing
and the requirements and objectives of each. In particular, note who will be
performing each layer of testing (the design engineers may be required to
perform some unit/module testing before the validation/quality engineers
continue with higher-level testing) and the type of documentation required
at each level (possibly vendor-designed engineering review forms at the earliest
stage, leading to validation protocols at later stages). Consider creating
diagrams to help explain the testing methodology. These can be invaluable
for quickly conveying the basics of the testing methodology to new or peripheral
project team members. For example, Fig. 1 illustrates a complex, multifaceted
system development and validation process, but communicates the
basic methodology very efficiently.

4.3 Training
Ensure training requirements are addressed within the VMP. Define the
training that must be performed and documented before the system can be
put officially into use. Determine how training will be documented. Refer to
organizationwide training policies that are applicable. Be sure to require
verification that CRS user security administration (if any) accurately reflects
the training documentation. For example, untrained users may have been
allowed access to the CRS for testing purposes. These users must be inactivated
before the system is initiated into GMP use.

FIGURE 1 Complex multifaceted system development and validation process.
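The cross-check between training documentation and CRS account administration can be scripted in a few lines. The sketch below is illustrative only; the function name and the sample account lists are our own assumptions, not features of any particular CRS.

```python
def accounts_to_inactivate(active_accounts, trained_users):
    """Return CRS accounts that are enabled but lack documented training.

    Any account in this set (for example, one enabled only for validation
    testing) must be inactivated before the system enters GMP use.
    """
    # Set difference: enabled accounts minus users with training records.
    return sorted(set(active_accounts) - set(trained_users))

# Hypothetical example data
active = ["asmith", "bjones", "testuser1"]
trained = ["asmith", "bjones"]
print(accounts_to_inactivate(active, trained))  # ['testuser1']
```

Running such a comparison as part of the pre-acceptance review gives documented evidence that the security administration matched the training records at the moment the system was released.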

4.4 System Acceptance


In order to determine whether all VMP requirements have been met, the CRS
VMP should require what will be referred to as a master validation summary
report (MVS). The MVS should discuss each deliverable required by the
VMP for its successful execution. Reference the location of each deliverable
and provide the detail necessary for retrieval at a later date (e.g., SOP and
protocol numbers). Identify any conditions surrounding the use of the system.
Were some features of the system found to be unsatisfactory for use?
Clearly state what aspects of the CRS are not approved for use until they are
re-engineered and tested and what formal controls are in place to enforce
this (SOPs, security programming, etc.). Design the VMP so that approval
of the MVS document is the end point that releases the CRS for GMP use
by appropriately trained users. This end point is called system acceptance
and signals the transition to the validation maintenance phase of the system
life cycle.
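Because the MVS must account for every VMP deliverable and give a retrievable location for each, the completeness check lends itself to a simple tabulation. A minimal sketch follows; the deliverable names and document numbers are hypothetical.

```python
# Deliverables the VMP requires, mapped to their document numbers once
# complete (None until then). Names and numbers are hypothetical.
deliverables = {
    "User requirements specification": "URS-001",
    "Installation qualification protocol": "IQ-014",
    "Operational qualification protocol": "OQ-014",
    "System administration SOP": None,   # not yet approved
    "Training records verification": "TRN-027",
}

def mvs_blockers(deliverables):
    """System acceptance requires every VMP deliverable to be complete
    and retrievable; return the items still blocking MVS approval."""
    return [name for name, location in deliverables.items() if location is None]

print(mvs_blockers(deliverables))  # ['System administration SOP']
```

An empty blocker list corresponds to the end point described above: the MVS can be approved and the CRS released for GMP use.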

5 SYSTEM SPECIFICATIONS
System specifications are essential to the CRS validation process. This does
not mean, however, that the process of creating specifications need be an
onerous one. In fact, the specification creation and approval process can be
one of the most valuable aspects of the system life cycle, as it spurs everyone
involved to understand and agree upon what the system is ultimately
expected to do. Regardless of how well thought out a system may be in the
preliminary design stages, differing perceptions are always brought to light
once specifications are put in writing.
In contrast to the VMP, system specifications are typically created by
the end user(s) and system engineering organizations. Since the system
specifications will be the basis for much of the validation testing, validation and
quality representatives should be brought into the specification development
process whenever possible. If these representatives are left out of the specification
process, the result will often be specifications that are intelligible only
to the design engineers and principal end users, because much of the wording
in the specifications will be based on perceptions that have not been written
into the documents. When validation and quality representatives
are excluded from developing specifications, extensive revisions are likely
to be needed in order to make the specifications suitable for validation.
Specifications typically required for a custom-designed application are
known as the URS, the functional requirements specification (FRS), and the
detailed design specification (DDS). A description of each specification and
its function follows.
User requirements specification: The URS is an overview of the system
functionality from the perspective of the end users' needs. This document
is often the first document generated to kick off a new CRS
design and implementation project. It can be drafted early on by a
key individual or group and used to help clarify the business needs to
top-level management. Once the system concept has the necessary
management support, it can be used to introduce the project to a
wider audience and solicit additional user input as to the desired features
of the system. If an outside firm will do system engineering, the
URS can be used as the defining document for the engineering services
bid process. If the engineering will be performed internally, the
engineering group can use the URS to project the estimated resources
needed to support the system development and implementation process.
The requirements in the URS will typically be used as the basis
for PQ testing. A typical statement in the URS might be "The system
must allow simultaneous use of different system features by multiple
material handlers."
Functional requirements specification: The FRS is a detailed description
of required system functionality from the end users' perspective.
This document should discuss all functionality required by the end
user, since it will be used by the engineering team to establish the system
design. While the URS may be written entirely with end user
input, the FRS is typically a collaborative effort involving both end
user representatives and engineering team representatives. The FRS
must address the users' desired functionality, but must also take into
account what functionality can feasibly be designed into the system
given the constraints of the schedule, budget, and development platform.
The requirements in the FRS will typically be used as the basis
for OQ testing. A typical statement in the FRS might be: "The user
must be required to enter a control code for the raw material lot that
is being dispensed; the system must verify that this control code is
valid and that the lot disposition status allows dispensing of this
material."
Detailed design specification: The DDS is a detailed description of how
the requirements identified in the FRS will be met, from the perspective
of the system design engineer. This document must be descriptive
enough to facilitate a consistent programming effort across the
entire (potentially large) team of engineers. This document is typically
drafted by engineers who understand the programming platform
that will be used to create the system. While this document
may be very technical, end user review is still necessary to help
ensure that the requirements of the FRS have been properly interpreted
by the engineering team. Design information contained in the
DDS may include such things as
User interface screens
Database names and field definitions
Security scheme implementation
Specific features of the development platform to be used for specific
functions of the application
The specifications in the DDS will be used as the basis for IQ and some
OQ testing. As mentioned above, it is important that these specifications
are reviewed by validation and compliance team members
throughout the process, in addition to the applicable end users and
design engineers.
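The mapping from specification levels to test phases described above (URS requirements verified in PQ, FRS requirements in OQ, DDS specifications in IQ and some OQ) is commonly captured in a traceability matrix, so that requirements without a linked test case surface as validation gaps. The sketch below is a minimal illustration; the requirement and protocol identifiers are hypothetical.

```python
# Which qualification phase tests requirements from each specification,
# per the scheme described above.
SPEC_TO_PHASE = {"URS": "PQ", "FRS": "OQ", "DDS": "IQ/OQ"}

# Hypothetical requirements mapped to the protocol test cases covering them.
coverage = {
    "URS-3.1": ["PQ-7"],
    "FRS-4.2": ["OQ-12", "OQ-13"],
    "DDS-9.9": [],          # not yet covered by any test case
}

def untested(coverage):
    """Return requirements with no linked test case; each one is a gap
    that must be closed before the system can be considered validated."""
    return [req for req, cases in coverage.items() if not cases]

print(untested(coverage))  # ['DDS-9.9']
```

Even a table this simple, maintained alongside the protocols, makes it easy to demonstrate during an inspection that every approved requirement was actually tested.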

6 SPECIFICATION NOMENCLATURE
The specification names used here are one example, but there are many ways
to refer to the same documents. The URS might instead be called the business
requirements specification (BRS). The FRS might be a functional specification
(FS) or a system requirements specification (SRS). The DDS might
be a design specification (DS) or an engineering specification (ES). The
intended purpose of the specification must be delineated in the document to
avoid confusion.

7 COMBINING SPECIFICATIONS
What if the project is small and the specifications are not very complex? Can
these documents be combined? Absolutely. On simpler projects, it is not
unusual for the goals of the URS and FRS to be incorporated into one user-focused
document with a separate engineering-focused design document. It
is also not uncommon for the user to generate the high-level URS and submit
it to design engineers familiar with the process for creation of a combined
FRS/DDS document.

8 OFF-THE-SHELF/COTS APPLICATION
What if you are implementing a commercially available application (commercial
off-the-shelf, or COTS, application) rather than designing a custom
application? Do you still need these documents? Yes, but probably not all of
them. The FDA has made it clear [3] that a URS is expected, even for COTS
applications. Without a URS, there would be no definition of required system
functionality on which to base initial validation activities. The COTS URS
will typically end up as a blend of the URS and FRS, as described previously.
This blend of requirement types is appropriate for the COTS situation, in
which you are specifying what is necessary for your business operations but
do not need to write specifications sufficient to actually design the application
from scratch. After selection of the application, additional specification
documents should be written focused on the choices that will be made in
your specific implementation of this application. A configuration specification
is recommended to define the configuration choices made in implementation
of the software. A security specification is recommended to define
the security implementation scheme. An installation specification is also
recommended to define the associated hardware and software requirements
necessary to run the application in the chosen environment.

9 TEST EXECUTION HARDWARE


One interesting variation of CRS validation when compared to equipment
validation is that software is not inherently linked to a unique piece of computer
equipment, yet it does require some form of suitable computer equipment
for operation. One can streamline the CRS validation process by
testing the software on computer hardware other than the final hardware
on which the software will eventually be installed for use. This means that
validation testing can start on a pilot hardware system while the final
hardware is, for instance, still being procured and installed.

10 INSTALLATION, OPERATION, AND PERFORMANCE QUALIFICATION
When a pilot system is used for validation, it is preferable to perform an IQ on
the system to verify that it actually satisfies the hardware requirements of the
software. If an IQ is not performed on this system, at a minimum the details
of the hardware (manufacturer, model, serial numbers, configuration) should
be recorded as part of the OQ. There absolutely needs to be a record of the
test bed hardware in case the validity of testing on the pilot hardware is called
into question at a later date. Full OQ testing can then be performed on the
pilot system. Execution of the OQ on a pilot system does not mean, however,
that use of the final hardware system becomes a "plug-and-play" affair. There
still needs to be full confidence that the system performs in a satisfactory
manner on the final hardware.
The final integrated hardware/software system must achieve a full IQ
to verify that the predetermined hardware and software configuration specifications
are satisfied. Be sure to verify that the software version installed
on the final system is the same as the one used for OQ testing on the pilot system.
If not, there must be formal change control documentation in place that
verifies that the validated status of the software has not been affected. There
must also be some operational testing done on the final system. If the validation
plan has identified the need for IQ, OQ, and PQ, it would be reasonable
to perform the OQ on the pilot system and the PQ on the final system. If only IQ
and OQ have been planned for the system, a second OQ should be prepared
that repeats some portions of the OQ that were performed on the pilot
system.
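The version check between the pilot and final systems lends itself to a simple scripted comparison at IQ time. The sketch below is illustrative; the inventory component names and version strings are hypothetical.

```python
def version_discrepancies(pilot, final):
    """Compare the software inventory recorded during pilot-system OQ
    against the final installation. Any mismatch requires formal change
    control before the pilot OQ results can be claimed for the final system.
    Returns (component, pilot_version, final_version) tuples."""
    issues = []
    for component, pilot_ver in pilot.items():
        final_ver = final.get(component)  # None if absent on the final system
        if final_ver != pilot_ver:
            issues.append((component, pilot_ver, final_ver))
    return issues

# Hypothetical inventories recorded during OQ (pilot) and IQ (final)
pilot_inventory = {"application": "2.1.0", "database": "8.0"}
final_inventory = {"application": "2.1.1", "database": "8.0"}
print(version_discrepancies(pilot_inventory, final_inventory))
# [('application', '2.1.0', '2.1.1')]
```

An empty result supports the claim that the pilot-system OQ remains valid for the final installation; any discrepancy points directly at the change control record that must exist.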
If the system being implemented is replacing or upgrading an existing
critical system, use of a pilot system for validation testing can pay off with
greatly reduced downtime for the critical system, as less testing needs to be
performed on the critical system before it is put back into GMP use. If the
system being implemented is entirely new, validation testing on a pilot system
can pay off by shortening the validation schedule, as testing and final
hardware procurement and installation activities are allowed to run in parallel
rather than consecutively.

10.1 21 CFR Part 11


The FDA release and enactment of the Part 11 final rule in 1997 was met with
overwhelming confusion and reticence on the part of the pharmaceutical
industry. There was almost a sense that, given enough reluctance and procrastination,
the rule would be recalled. There was good reason for procrastination.
The expense of complying with the regulation was and continues
to be great. In some cases the technology still does not exist to comply with
the "letter" of Part 11 without turning your pharmaceutical manufacturing
organization into a software development company, but procrastination is
no longer an option. Part 11 must be taken into account when specifying and
implementing all new electronic systems. In addition, strides toward compliance
must be made for legacy computer systems.
The first place to turn when coming to terms with Part 11 compliance is
the regulation itself. While the regulation covered only about two pages
in the Federal Register, it was accompanied by a 30-page preamble upon its
issue. While the preamble does not bind the FDA, the reasoning and clarification
offered there are the best available written insight into the goals and intent
behind the creation of Part 11. It is a good starting point.
The most important step in appropriately implementing Part 11 is
understanding that the FDA requires the electronic records used in GMP
activities to be as reliable and trustworthy as traditional paper records. This
concept has been expressed at numerous public speaking engagements by
FDA officials.
The Part 11 rule can be broken down into three major concepts:
Audit trails and related controls necessary to maintain record integrity (11.10)
Requirements for valid electronic signatures (11.50-11.200)
System security (11.30 and 11.300)

Copyright © 2004 Marcel Dekker, Inc.


Computer Validation 237

While there is some overlap among these, the regulatory requirements


are easier to evaluate when taken in these separate parts. Break up your eva-
luation of any new or existing system into these three areas. Create a generic
list of questions or preferred features based on these three areas, keyed to
speci¢c sections in the regulation, then apply these questions or research
these features for the systems in question. Remember that not all electronic
records require signatures, so the electronic signature controls may not even
be relevant to the system under review.
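The three-part breakdown above lends itself to a simple, reusable checklist structure. The following is a minimal sketch; the question texts, section keys, and the `assess` helper are invented for illustration and are not an official or complete reading of the rule.

```python
# Hypothetical Part 11 assessment checklist keyed to sections of the
# regulation, grouped by the three major concepts described in the text.
PART11_CHECKLIST = {
    "11.10": [
        "Is there a secure, computer-generated, time-stamped audit trail?",
        "Are record changes preserved rather than overwritten?",
    ],
    "11.30/11.300": [
        "Is system access limited to authorized individuals?",
        "Are user IDs unique and authority checks in place?",
    ],
    "11.50-11.200": [
        "Do signed records show the signer, date/time, and meaning?",
        "Are electronic signatures linked to their records?",
    ],
}

def assess(system_name, answers, uses_signatures=True):
    """Return checklist questions that are unanswered or answered 'no'."""
    gaps = []
    for section, questions in PART11_CHECKLIST.items():
        # Signature controls may not apply if the records are unsigned.
        if section == "11.50-11.200" and not uses_signatures:
            continue
        for q in questions:
            if not answers.get(q, False):
                gaps.append((system_name, section, q))
    return gaps

# Example: a system whose records carry no electronic signatures.
gaps = assess(
    "LIMS",
    {"Is system access limited to authorized individuals?": True},
    uses_signatures=False,
)
```

Applying the same keyed question list to every system in the review makes the results directly comparable across the inventory.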
When it comes to assessing the compliance level of existing (‘‘legacy’’)
electronic systems, don’t shy away from a critical review. If you operate in
an established manufacturing environment, you will need to upgrade and/
or replace some systems. A critical review is necessary to appropriately
prioritize the systems that need attention.
Plan and undertake a sitewide or companywide review of all legacy
electronic systems. Evaluate them based on your list of compliance questions
or features keyed to the Part 11 regulation. If you work at a large site or company,
it will be more efficient in the long run to execute this large-scale evaluation
rather than enacting it in smaller portions. You will likely find some
commonality among many noncompliant systems across the organization.
That commonality can lead to common solutions. For example, it is far more
efficient to implement a single compliance solution for 100 spreadsheets used in
GMP operations than to implement separate solutions for 20 spreadsheets
in each of five workgroups. As another example, you may find 10 systems that
require customized solutions. You may be able to gain resource efficiency,
however, by building those 10 solutions on common core software functionality
(e.g., the same database platform).
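The commonality argument above can be sketched as a simple grouping exercise; the inventory entries and gap labels here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical legacy-system inventory: (system name, principal Part 11 gap).
inventory = [
    ("QC spreadsheet 1", "no audit trail"),
    ("QC spreadsheet 2", "no audit trail"),
    ("Calibration DB", "no access control"),
    ("Stability tracker", "no audit trail"),
]

# Group systems by their shared compliance gap so one remediation
# project can cover many systems at once.
by_gap = defaultdict(list)
for system, gap in inventory:
    by_gap[gap].append(system)

# Gaps shared by more than one system are candidates for a common solution.
common_solutions = {gap: systems for gap, systems in by_gap.items()
                    if len(systems) > 1}
```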
Once you have identified the legacy systems that need to be brought
into compliance, documented plans must be put in place to remediate them.
Thoughtful prioritization is critical to the remediation process. Most firms
do not have the resources to undertake all Part 11 remediation projects at
once. Analyze and rank the records managed by these systems according to
their risk to quality systems and product integrity. The highest-risk systems
should be improved first. Similarly, do not apply resources to prioritized projects
that offer little benefit. For example, if a particular noncompliant
system will be phased out over the next few years, do not launch a 1- to
2-year effort to develop and validate a computerized replacement for that
system. Instead, investigate the possibility of accelerating the phase-out.
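Risk-based ordering of the remediation queue might be sketched as follows; the system names, scores, and scoring scheme are hypothetical, and a real assessment would follow the firm's documented risk procedure.

```python
# Hypothetical inventory with risk scores (higher = greater risk to
# quality systems and product integrity) and phase-out status.
systems = [
    {"name": "Batch record system", "risk": 9, "retiring": False},
    {"name": "Training tracker", "risk": 3, "retiring": False},
    {"name": "Old label printer DB", "risk": 7, "retiring": True},
]

# Systems slated for retirement are excluded: accelerate the phase-out
# instead of investing in remediation that offers little benefit.
queue = sorted(
    (s for s in systems if not s["retiring"]),
    key=lambda s: s["risk"],
    reverse=True,  # highest-risk systems are improved first
)
```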
When evaluating system upgrades or replacements, do not necessarily
expect complete Part 11 compliance on the part of the new system. You may
not find it. Compare solutions to each other and choose from among the ones
that offer the highest levels of compliance. Ensure that each of the three
major concepts noted above has been appropriately addressed to some
degree. Do not use the lack of a fully compliant solution as the reason for not
implementing any solution. The near-term goal for legacy systems is
improvement, not perfection.

10.2 Change Control


There should be written procedures establishing systems to manage and
control changes that may impact the development, validation, or implementation
of computer systems, or affect the maintenance of their validated state.
Such procedures and controls should apply to all GXP operations and all the
systems that support GXP operations.
Change control procedures should be
Observable
Adequate to maintain a CRS’s validated state
Capable of maintaining the validity of CRS documentation
Able to ensure product quality and patient safety
The purpose of change management and control procedures is to document,
evaluate, and manage proposed changes in such a way as to maintain
the system's validated state.
Change management is typically a cross-functional activity involving
the vendor, QA, and development and engineering personnel. The develo-
per’s (or your own) QA unit should approve and occasionally audit confor-
mance to the change management process.
Configuration control may be defined as an element of configuration
management, consisting of the evaluation, coordination, approval or disapproval,
and implementation of changes to configuration items after formal
establishment of their configuration identification (IEEE Std 610.12-1990).
Changes are proposed, documented, requested, evaluated, approved,
and tested prior to implementation. The quality unit should be responsible
for managing both the entire process and all corresponding documentation
associated with the change. Change control typically begins at an SDLC
milestone defined in a project validation plan. The CRS should at least be
fully designed, documented, validatable, and implementable.
The CRS's VMP (or corporate policy) should state at what point in the
validation process the system will come under the firm's change control
policy or system. For example, the subject system might be considered to be
under change control upon the acceptance of the OQ summary report by the
quality unit. Consider when the system in question will actually begin to be
used to support GXP production.
Multiple change management processes may exist within a company:
documentation, equipment, CRS, and project- or system-specific processes.

In compliance terms, effective change management and change control
are a clear regulatory expectation.
21 CFR 211.68, Subpart D, Automatic, mechanical, and electronic equipment:
‘‘(b) Appropriate controls shall be exercised over computer or
related systems to assure that changes in master production and control
records or other records are instituted only by authorized personnel.’’
The ICH's Q7A Good Manufacturing Practice Guidance for Active
Pharmaceutical Ingredients, V. Process Equipment (5), D. Computerized
Systems (5.4): ‘‘Changes to computerized systems should be
made according to a change procedure and should be formally authorized,
documented, and tested. Records should be kept of all changes,
including modifications and enhancements made to the hardware, software,
and any other critical component of the system. These records
should demonstrate that the system is maintained in a validated state.’’
Change control: ‘‘A formal change control system should be established to
evaluate all changes that could affect the production and control of the
intermediate or API. Written procedures should provide for the identification,
documentation, appropriate review, and approval of changes in raw
materials, specifications, analytical methods, facilities, support systems,
equipment (including computer hardware), processing steps, labeling
and packaging materials, and computer software.’’
Beyond the compliance imperatives, effective change management is
also good business practice. Benefits include maintaining knowledge of
complex CRSs within your department or the organization. Concise and
detailed system documentation protects against system failures. Change
management supports traceability of the system's evolution. It also helps
track the costs associated with a system's life cycle.
The essential compliance characteristics associated with change control
include evaluation of the change, qualification testing, documentation
update, and approval and implementation.

10.3 The What, Why, When, Who, and How of Change Control
What? Identify all of the CRSs that are covered under your change
control and management SOP. Classify and document which systems are
GXP and which are non-GXP. Make certain to include a process for implementing
emergency changes. Document the change, the evaluation and assignment
of a change level, the testing evidence, and a regulatory audit trail. An example
of a typical change occurs when an individual leaves the company. The
first step should be to disable the account. Ensure that the system's users
understand that the user's rights are disabled. Additionally, all system history
related to that former user should be maintained. One might also consider whether
there are training implications associated with changes. Are all users aware of the
change? Will SOPs need updating? Pay particular attention to changes to
systems used in product manufacture or that generate, analyze, archive, and
report data to be used in submissions to the FDA.
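The user-departure example above can be sketched in code. The `UserAccount` class and its fields are hypothetical, but the design point comes straight from the text: the account is disabled, never deleted, so the former user's system history is retained as an audit trail.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class UserAccount:
    """Hypothetical account record whose history survives every change."""
    user_id: str
    enabled: bool = True
    history: list = field(default_factory=list)  # retained audit trail

    def disable(self, reason, approved_by):
        # Disable rather than delete: all history tied to the former
        # user must be maintained after the change.
        self.enabled = False
        self.history.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "action": "account disabled",
            "reason": reason,
            "approved_by": approved_by,
        })

acct = UserAccount("jdoe")
acct.disable(reason="left the company", approved_by="QA")
```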

Why? Consider change management across development, pilot, and production
environments. The development environment is where system changes, such
as software upgrades, are first examined. The pilot system is essentially a
qualification and validation test environment. Identical pilot and
production environments allow you to protect your production data, environment,
and business process from the risks associated with the manipulation
of the system that occurs in OQ/validation testing. It is essential to
protect against any risk to production environments. Make certain the test
environment is adequately segregated from the production environment.
Identify what data will be used and exposed to testing. Distinct yet
identical development, pilot, and production environments will allow for
OQ in the pilot rather than in the production environment.
When? A formalized change management process is required
throughout the entire software and system development life cycle. The VMP
should state when the corporate change management policy will initially
apply to a new system, usually at the end of validation, when the final summary
report is signed off or a validation certification is issued. Some systems
come under the corporate change control policy when PQ begins. Change
management assures the maintenance of the application system in a
validated state. Change control steps should be logical, flexible, and applicable
to contingencies. During posttesting, determine what documents should
be routed and approved as a package. Remember that there may be a need
to update specifications and validation summary reports as changes occur
during posttesting. Institute a periodic review and evaluation, perhaps
annually. It is also necessary to plan for system retirements; develop a formal
plan when system retirement is needed. Do not forget to be prepared for
unplanned and emergency changes.
Who? The system owner or user initiates changes and is responsible
for adherence to change control policy and procedures; however, evaluation
and review should be cross-functional. The quality unit should always be
responsible for managing the change management system, even if there are
several processes for different systems (documentation change, facilities
change control, etc.). Ensure the system's user is in a position to take full
responsibility for the system's functions, and ensure the system is used according
to its corresponding documentation and specifications. Users should define the
system's operational features in such a way that it can only be used as
described in the system's SOPs, and can request changes to specifications or
SOPs as needed.
The information technology (IT) group should provide such network
services as backup, recovery, continuity planning, and disaster recovery.
How? Change control documentation should be readily traceable for
every step in the process: for example, during initiation, evaluation, testing,
and implementation. It is particularly critical when the review process for
batch release is involved. Require that regression testing be part of all change
management. When a change is of a magnitude that requires testing, specifically
test the new or changed functionality as well as associated and related
system functionality. Consider how changes may impact data integrity and
critical user interfaces. Ensure that a risk assessment also accompanies
change management. Assess the impact on the overall system, the business,
and the consumer. Integrate your help desk application and system error logs
into the change control process. These systems are often change initiators.
Systematically characterize changes as major or minor (or multiple levels),
based on whether the change is a low-level one, such as a database maintenance
activity, or one in which one or more GXP systems will be affected.
Validation is a process of verifying documented system features and
conducting qualification testing. Interestingly, after a CRS is validated, its
validation status can be effectively maintained through comprehensive
change control.
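The major/minor characterization described above might be sketched as follows; the category names and the classification rule are illustrative, not a regulatory standard.

```python
def classify_change(description, gxp_systems_affected):
    """Classify a proposed change and state the required test scope.

    Rule of thumb from the text: a low-level activity such as database
    maintenance is minor; a change affecting one or more GXP systems is
    major and triggers regression testing of the changed functionality
    and related systems.
    """
    if gxp_systems_affected:
        return {
            "description": description,
            "level": "major",
            "regression_test": True,
            "test_scope": ["new/changed functionality", "related systems"],
        }
    return {
        "description": description,
        "level": "minor",
        "regression_test": False,
        "test_scope": [],
    }

maint = classify_change("database index rebuild", gxp_systems_affected=[])
upgrade = classify_change("LIMS report module upgrade",
                          gxp_systems_affected=["LIMS"])
```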

11 REGULATORY REALITY CHECK

FD-483 observation 1: SOP X, ‘‘GMP Computer Systems.’’ The procedure
states that a written security policy will be established, an
access control roster maintained, and virus protection installed. There is
no written security policy, however, and there is no virus protection
installed for the Y system.
FD-483 observation 2: The Z workstation is considered GMP equipment
and as such generates electronic records that are not backed up
or stored for retrieval. The OQ document states that ‘‘since reports
are printed after each run and attached to the original laboratory
data document, no data is stored long term and data security is not
an issue. . . . Data will not be stored on the system long term since
analysts will print out and attach copies of reports to their original
laboratory data documents. Therefore backup and archiving of data
is not necessary.’’


In the first observation noted here, the firm was clearly cited for not
adhering to its own computer systems SOP. In such cases, it is helpful
to evaluate whether the firm wrote an unnecessarily stringent SOP
and developed a habit of not following it, or whether the SOP is reasonable
and was ignored. This citation may offer some of both. All
computer systems that handle GMP electronic records should have
a written security policy and written security procedures, as made
clear by 21 CFR Part 11, Section 11.10: ‘‘Such procedures and controls
shall include the following: (d) Limiting system access to
authorized individuals.’’ It is quite possible, however, that virus protection
is not necessary on every GMP computer system at the firm,
and that this was an excessive requirement that the firm put on itself
and subsequently did not follow. The firm needs to write a procedure
for security controls on this system and consider whether or not
virus protection is an appropriate SOP requirement for all of the
computer systems in the scope of this policy.
The second observation involves either lack of attention to or misinterpretation
of the electronic records requirements. The FDA states
that 21 CFR Part 11 applies to ‘‘records in electronic form that are
created, modified, maintained, archived, retrieved, or transmitted,
under any records requirements set forth in agency regulations’’
(Section 11.1(b)). Since the firm feels the need to print out and retain
these records, the records are presumably ‘‘created under requirements
set forth in agency regulations’’; therefore the electronic
records on the computer system are bound by 21 CFR Part 11
whether or not the firm feels the need to retain them. Ignoring the
state of the electronic records and relying on the paper printout is in
violation of at least one other clearly stated requirement of 21 CFR
Part 11, Section 11.10: ‘‘Such procedures and controls shall include
the following: (c) Protection of records to enable their accurate and
ready retrieval throughout the records retention period.’’ This firm
needs to implement a secure backup system for these electronic
records.
FDA warning letter citation: ‘‘Your firm failed to establish and maintain
procedures . . . in order to ensure that specified design requirements
are met. For example, the software designed by your firm was
developed without design controls.’’
This citation speaks to a core concept of computer systems validation:
development procedures and system specifications. Note that the firm
was not cited for lack of validation testing of the software but for lack
of design controls. Effective design controls would have included written
design procedures. Adherence to these procedures would have
produced well-written system specifications. The firm may have had
validation testing documentation for the software. In the absence of
robust system specifications, however, there is little evidence that a
quality process was followed in the development of the software.

12 WORDS OF WISDOM

Involve the user in development and validation.


Many organizations take a ‘‘change control committee’’ approach; however,
a comprehensive change control system based on a key change
control form or cover sheet and related documentation may prove
more efficient.
As with pharmaceuticals themselves, remember that you can't test quality
into the finished software product; it must be produced utilizing
quality processes throughout. Require evidence of quality assurance
and development processes within your firm and during vendor
audits.
Involve technical staff in software vendor quality audits.
A COTS application may have a large number of exciting features that
management and users will be eager to implement. Beware. Plan a
phased implementation of the system modules and features.
Temper COTS enthusiasm with the degree of customization
required. It is rarely the case that COTS can be used without some
customization.
Expect that the functionality of ‘‘off-the-shelf’’ software will not be a
perfect fit for your business process. Chances are a firm will have to
negotiate between the ability to configure and modify the functionality
of the software and changing the business process.
When it comes to Part 11 compliance, don’t let the perfect be the enemy
of the good! Have a long-term view and focus on product quality and
product risk.

APPENDIX I: COMPLIANCE GUIDEPOSTS FOR A VALIDATION TEMPLATE

The following is a brief outline of the critical compliance parameters in a
software validation template.

Purpose
To plan and describe the validation activities for the deployment of the very
useful system (VUS) at a particular manufacturing site.


Scope
Describe the scope of the VUS validation effort. What are the boundaries?
Are interfaces to other systems included in this validation effort? (This
is very important to identify and define.)

System Description
Provide a clearly worded description of the VUS and its purpose in the organization.
Also describe the makeup of the VUS (multiple servers, networks,
software packages, etc.). This does not need to be technically in depth, but
should provide a basic framework for understanding the system concepts.
Consider including an explanatory diagram.

Responsibilities
List all parties who have responsibilities for actions or deliverables described
in the VUS validation plan. Describe their responsibilities in general terms
(e.g., ‘‘will review and approve all system specifications’’).

Developer Acceptance
If an outside developer is being utilized (whether developing custom software
or providing a packaged COTS product), describe the steps taken to
qualify the software developer. If an audit was performed, briefly state the
results and reference the audit report.

Standard Operating Procedures


List the SOPs that must be developed in order to use the VUS in an appropriately
compliant fashion. Include SOPs on operation, maintenance, administration,
security, backup, and disaster recovery. Some of these may be
combined into one document. If the exact document names and numbers are
known, list them; otherwise use generic names.

Specifications
List the VUS specifications that must be developed in order to properly control
system development and implementation. See Sec. 5 for a discussion of
specification types.

Qualification Protocols and Reports


List the qualification protocols that must be developed in order to qualify the
system for use. Note whether each protocol will have a corresponding report
or whether multiple protocol results will be discussed in one report. If the
exact document names and numbers are known, list them; otherwise use generic
names.

Training
Describe the training requirements for the VUS. Reference any appropriate
SOPs.

Change Control
Describe or reference the change control system that will be used to maintain
the validated state of the VUS after the initial qualification.

Validation Document Management


Describe or reference the documentation system that will be used to control
validation documentation.

Validation Methodology
Describe in clear, concise terms the steps that will be taken to qualify the
VUS for use in the manufacturing environment.

System Acceptance
Describe the process by which VUS acceptance for use will be achieved. If
the VUS will be phased in and accepted on a modular basis, describe the
acceptance procedure for each phase.

Traceability Matrix
The IEEE defines a traceability matrix as ‘‘a matrix that records the
relationship between two or more products of the development process; for
example, a matrix that records the relationship between the requirements
and the design of a given software component’’ (IEEE Std 610.12-1990).
As commonly used in CRS validation, a traceability matrix verifies the
relationship between system specifications and testing protocols. The goal
of matrixing is to establish the adequacy of protocols. More specifically, all
specified system characteristics and functionality should correspond to verification
and qualification testing in the protocols.
This example uses a small portion of a traceability matrix for validation
of a database application, here named ‘‘your CRS.’’

Good practice would require citing exactly which specifications
are being matrixed to the test protocols. In our example, four specification
documents were matrixed to the contents of IQ and OQ validation
protocols.

Reference specification requirement title                    Document number

Functional requirement specification for your CRS            Record document number
Installation specification or detailed design
  specification for your CRS                                 Record document number
Security specification for your CRS                          Record document number

The matrix pairs each specification column with the protocol columns. Its
column headings and cell guidance are as follows.

Specifications:
  Functional requirements specification section number: cite the specification
    section number(s) and title, or not applicable (N/Ap).
  Installation specification (for COTS) or detailed design specification
    (for non-COTS) section number: cite the specification section number(s)
    and title, or N/Ap.
  Security specification section number: cite the specification section
    number(s) and title, or N/Ap. There may be multiple specific citations.

Protocols:
  IQ section number: cite the protocol section number and title, and consider
    stating the test objective. The objective of this column is to verify
    that the specific system feature (such as system hardware) will be
    verified as installed per the installation specification or design
    specification.
  OQ section number: cite the protocol section number and title, and consider
    stating the test objective. The objective of this column is to verify
    that the specific system requirement (such as report printing) will be
    verified as functional per the functional requirements specification.

Comments/Conclusion:
  Does the protocol contain verification or qualification testing to provide
    documented evidence that each specified system feature will be adequately
    tested?

A completed example row:

  Functional requirements specification: 8.1—Computer Hardware Requirements
    (8.1.1)
  Installation specification: 7.1.2—Computer Server—Minimum Hardware
    Requirements; 7.1.3—Application Clients—Minimum Hardware Requirements
  Security specification: N/Ap
  IQ: 9.0—Hardware Components Installation Verification. The objective of
    this test is to verify that the ‘‘your CRS’’ database system hardware
    has been installed as per the installation specification.
  OQ: N/Ap
  Comments/Conclusion: OK
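The coverage check that a traceability matrix performs can be sketched programmatically; the section identifiers below follow the example matrix but the data structure itself is invented for illustration.

```python
# Hypothetical specification sections that require protocol coverage.
spec_sections = ["FRS 8.1", "FRS 8.2", "INST 7.1.2"]

# Matrix entries: specification section -> protocol sections that test it.
# "FRS 8.2" has no entry, a gap the matrix review should expose.
matrix = {
    "FRS 8.1": {"IQ": "9.0", "OQ": None},
    "INST 7.1.2": {"IQ": "9.0", "OQ": None},
}

# A specification is uncovered if no IQ or OQ section is cited for it.
uncovered = [
    section for section in spec_sections
    if not any((matrix.get(section) or {}).values())
]
```

Any entry in `uncovered` signals that a specified system feature lacks documented verification or qualification testing.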

BIBLIOGRAPHY
Food and Drug Administration. Guidance for Industry. 21 CFR Part 11; Electronic
Records; Electronic Signatures; Glossary of Terms. Draft guidance. Washington,
D.C., Aug. 2001. (Later withdrawn by the FDA.)
Myers, G. J. The Art of Software Testing. New York: Wiley, 1979.
ICH. Harmonized tripartite guideline. Good Manufacturing Practice Guide for Active
Pharmaceutical Ingredients. ICH Steering Committee, Nov. 10, 2000.
IEEE. Standard Glossary of Software Engineering Terminology. IEEE Std 610.12-1990.
IEEE, Sep. 28, 1990.
ISPE/GAMP Consortium. GAMP 4 Guide: Validation of Automated Systems.
Tampa, FL: ISPE, 2001.
PDA Committee on Validation of Computer-Related Systems. Validation of
computer-related systems. Technical report no. 18. PDA J Pharm Sci Tech 49
(1, supplement), Jan.–Feb. 1995.

INTERNET REFERENCES
Freedom of Information Act warning letters are available on the FDA Website
(currently linked at http://www.fda.gov/foi/warning.htm).
Informative FDA documents and communications related to 21 CFR Part 11
compliance are posted to public dockets on the FDA Website (currently linked at
http://www.fda.gov/ohrms/dockets/dockets/dockets.htm). The relevant dockets are
00D-1538 through 00D-1543.

REFERENCES
1. ISPE/GAMP Consortium. GAMP 4 Guide: Validation of Automated Systems.
Tampa, FL: ISPE, 2001.
2. G. J. Myers. The Art of Software Testing. New York: Wiley, 1979.
3. Food and Drug Administration. Guidance for Industry. 21 CFR Part 11; Electronic
Records; Electronic Signatures: Validation. Draft guidance, Sept. 2001. (Later
withdrawn by the FDA.)

