Samuel Torres
Lilly del Caribe, Inc.
1
10/18/2010
Key terminology
and Concepts
Computer System
Application software
Platform
System Software
Hardware
Computer System
Software: (ANSI) Programs, procedures, rules, and any associated
documentation pertaining to the operation of a system.
Notes:
Configuration is the process of modifying an application program by changing
its configuration parameters without changing or deleting its program code or
adding additional custom code.
Validation
validation. (1) (FDA) Establishing documented evidence which provides a high degree
of assurance that a specific process will consistently produce a product meeting its
predetermined specifications and quality attributes. Contrast with data validation.
validation, software. (NBS) Determination of the correctness of the final program or
software produced from a development project with respect to the user needs and
requirements. Validation is usually accomplished by verifying each stage of the
software development life cycle. See: verification, software.
Verification: (1) (ISO) Confirmation, through the provision of objective evidence,
that specified requirements have been fulfilled. (2) (ASTM) A systematic approach
to verify that manufacturing systems, acting singly or in combination, are fit for
intended use, have been properly installed, and are operating correctly. This is an
umbrella term that encompasses all types of approaches to assuring systems are fit
for use, such as qualification, commissioning and qualification, verification,
system validation, or others.
The specific terminology used to describe life cycle activities and deliverables
varies from company to company and from system type to system type.
Whatever terminology is used for verification activity, the overriding
requirement is that the regulated company can demonstrate that the system
is compliant and fit for intended use.
The FDA's analysis of 3,140 medical device recalls conducted between 1992
and 1998 reveals that 242 of them (7.7%) were attributable to software
failures. Software validation and other related good software engineering
practices are a principal means of avoiding such defects and the resultant
recalls.
Appropriate controls for removal or reduction of the identified risks should be identified
based on the assessment. A range of options is available to provide the required
control depending on the identified risk. These include, but are not limited to:
-modification of process design
-modification of system design
-application of external procedures
-increasing the detail or formality of specifications
-increasing the extent or rigor of verification activities (TESTING)
The above considerations are context sensitive. For example, the risks associated with a solid oral
dosage manufacturing area are very different from those in a sterile facility, even when the same
computerized systems are used.
Similarly, the risks associated with an adverse event reporting system are very different from those
in a training records database. The former can have a direct effect on patient safety, whereas the
latter system is very unlikely to affect patient safety.
The acceptable level of risk, sometimes known as risk tolerance, should be considered.
Validation Benefits
Validation provides tangible benefits to the business and its
customers by:
assuring the system works as intended
reducing long-term costs by reducing rework and defects
providing a knowledge base that can support the product
long after the people involved in creating it have moved on
Practical Considerations
VALIDATION OF OFF-THE-SHELF
SOFTWARE AND AUTOMATED
EQUIPMENT
Most of the automated equipment and systems used by drug or device
manufacturers are supplied by third-party vendors and are purchased off-
the-shelf (OTS).
The vendor audit should demonstrate that the vendor's procedures for, and
the results of, the verification and validation activities performed on the OTS
software are appropriate and sufficient for the requirements of the drug or
medical device to be produced using that software. The vendor's life cycle
documentation can be useful in establishing that the software has been
validated.
However, such documentation is frequently not available from commercial
equipment vendors, or the vendor may refuse to share their proprietary
information. In these cases, the drug or device manufacturer will need to
perform sufficient system level black box testing to establish that the
software meets their user needs and intended uses.
For many applications black box testing alone is not sufficient. Depending
upon the risk of the product or device produced, the role of the OTS
software in the process, the ability to audit the vendor, and the sufficiency
of vendor-supplied information, the use of OTS software or equipment may
or may not be appropriate, especially if there are suitable alternatives
available.
Security Requirements
The security of our computer systems and data has become more
important than ever before, given the widespread use of the internet
to access our systems and the increased incidence of viruses and
attacks by outside hackers. Physical and logical security controls
ensure that data residing in a computer system are protected from
unauthorized access and from inadvertent or unauthorized alteration
or destruction.
The Security Plan and the Security Administration SOP are the two
primary security-related CSV deliverables.
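As a minimal illustration of a logical security control, the sketch below shows a role-based permission check; the roles, permissions, and function names are invented for illustration and do not come from the slides.

```python
# Hypothetical role-based access control check (illustrative only).
# Each role is granted an explicit set of actions; anything not
# granted is denied, so unknown roles or actions fail closed.
PERMISSIONS = {
    "analyst": {"read"},
    "supervisor": {"read", "write"},
    "admin": {"read", "write", "delete"},
}

def is_authorized(role: str, action: str) -> bool:
    """Return True only if the role explicitly grants the action."""
    return action in PERMISSIONS.get(role, set())

# An analyst may read records but cannot alter or destroy them.
print(is_authorized("analyst", "read"))    # True
print(is_authorized("analyst", "write"))   # False
```

Denying by default (rather than maintaining a list of forbidden actions) is what protects data from inadvertent as well as unauthorized alteration.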
Supplier Management
Supplier Management assures that the supplier's
quality practices are adequate to deliver and support
a reliable software product or service. The scope
and extent of supplier management depends on the
risk associated with the system/service and the
extent to which the regulated company depends
directly on supplier activities.
Supplier management is a joint responsibility of
IT/Automation, Business, and Quality, and involves
activities to conduct the initial and ongoing
evaluation of the supplier's quality practices as well
as managing the ongoing relationship with the
supplier.
Functional Specifications are normally written by the supplier and describe the detailed
functions of the system, i.e., what the system will do to meet the requirements. The regulated
company should review and approve Functional Specifications where produced for a custom
application or configured product. In this situation, they are often considered to be a contractual
document.
Configuration Specifications are used to define the required configuration of one or more
software packages that comprise the system. The regulated company should review and approve
Configuration Specifications.
Design Specifications for custom systems should contain sufficient detail to enable the system
to be built and maintained. In some cases, the design requirements can be included in the
Functional Specification. SMEs should be involved in reviewing and approving design
specifications.
Software Development
The review aims to ensure that the code is fit to enter testing (module,
integration or system tests), and that the code can be effectively and
efficiently maintained during the period of use of the application.
The review should be carried out in accordance with a documented
procedure, and performed by at least one independent person with
sufficient knowledge and expertise, in conjunction with the author of the
code.
The extent of source code review should be driven by the criticality of the
requirements/design that the source code is implementing. For example:
All source code implementing critical requirements (e.g., code that performs
critical calculations or makes GxP decisions) should be reviewed. Reviewing only
a representative sample of source code implementing noncritical functionality
may be appropriate.
Traceability Matrix
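A traceability matrix links each requirement to the tests that verify it. The sketch below, with invented URS and TC identifiers, shows the idea along with a simple completeness check that flags requirements left without a verifying test:

```python
# Hypothetical traceability matrix: each user requirement maps to the
# test cases that verify it (all IDs are invented for illustration).
trace_matrix = {
    "URS-001": ["TC-010", "TC-011"],
    "URS-002": ["TC-020"],
    "URS-003": [],  # gap: a requirement with no verifying test
}

def untraced(matrix):
    """Return requirement IDs that no test case covers."""
    return [req for req, tests in matrix.items() if not tests]

print(untraced(trace_matrix))  # ['URS-003']
```

Running such a check before test execution is one way to demonstrate that every requirement is covered by the verification activities.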
Testing/Verification Levels
(Approach for a Configured Product, GAMP Category 4)
Testing/Verification Levels
(Approach for a Custom Application, GAMP Category 5)
Testing/Verification Levels
Installation Testing (GAMP):
Many companies call this Installation Qualification or IQ. The purpose is to verify and document that system
components are combined and installed in accordance with specifications, supplier documentation and local
and global requirements. Installation testing provides a verified configuration baseline for subsequent
verification and validation activities and also verifies any installation methods, tools or scripts used.
Requirements (System) Testing:
testing, system. (IEEE) The process of testing an integrated hardware and software system to verify that
the system meets its specified requirements. Such testing may be conducted in both the development
environment and the target environment.
Design Based Functional Testing:
testing, design based functional. (NBS) The application of test data derived through functional analysis
extended to include design functions as well as requirement functions.
Configuration Testing:
Configuration testing. (GAMP) For each Configuration Specification, an associated Configuration Test
Specification should be produced. The tests should verify that the package has been configured in
accordance with the specification. The tests could take the form of inspections or checks of supplier
documentation.
Integration Testing:
testing, integration. (IEEE) An orderly progression of testing in which software elements, hardware
elements, or both are combined and tested, to evaluate their interactions, until the entire system has been
integrated.
Module (Unit) Testing:
testing, unit. (1) (NIST) Testing of a module for typographic, syntactic, and logical errors, for correct
implementation of its design, and for satisfaction of its requirements. (2) (IEEE) Testing conducted to verify
the implementation of the design for one software element; e.g., a unit or module; or a collection of software
elements. Syn: component testing.
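The module (unit) level above can be illustrated with a minimal sketch; the dose calculation function and its requirements are hypothetical, invented only to show the shape of a unit test:

```python
# Hypothetical module under test (invented example): a per-kilogram
# dose calculation with a guard against invalid input.
def dose_per_kg(total_mg: float, weight_kg: float) -> float:
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return total_mg / weight_kg

# Unit tests: correct implementation of the design (the arithmetic)
# and satisfaction of its requirements (including the error path).
assert dose_per_kg(100.0, 50.0) == 2.0

raised = False
try:
    dose_per_kg(100.0, 0.0)
except ValueError:
    raised = True
assert raised, "expected ValueError for non-positive weight"
```

Note that even a unit this small needs both a computation check and an error-path check, foreshadowing the normal-case/invalid-case distinction discussed later.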
Normal Case testing (Positive Case or Capability testing) challenges the system's
ability to do what it should do, including triggering significant alerts and error
messages, according to specifications.
Testing with usual inputs is necessary. However, testing a software product only
with expected, valid inputs does not thoroughly test that software product. By
itself, normal case testing cannot provide sufficient confidence in the
dependability of the software product.
Invalid Case testing (Negative Case or Resistance testing) challenges the system's
ability not to do what it should not do, according to specifications.
Performance testing challenges the system's ability to do what it should as fast
and as effectively as it should, according to specifications.
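A minimal sketch of normal versus invalid case testing, assuming a hypothetical input-validation routine with an invented accepted range of -20 °C to 120 °C:

```python
# Hypothetical input-validation routine (invented for illustration):
# the specification accepts temperatures only within a fixed range.
def accept_temperature(celsius: float) -> bool:
    return -20.0 <= celsius <= 120.0

# Normal (positive) cases: the system does what it should,
# including at the specified boundaries.
assert accept_temperature(25.0)
assert accept_temperature(-20.0)   # lower boundary
assert accept_temperature(120.0)   # upper boundary

# Invalid (negative) cases: the system does NOT do what it should not.
assert not accept_temperature(150.0)
assert not accept_temperature(-40.0)
```

The positive cases alone would pass even if the function accepted every input, which is exactly why normal case testing by itself cannot provide sufficient confidence.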
The level of structural testing can be evaluated using metrics that are designed to show
what percentage of the software structure has been evaluated during structural testing.
These metrics are typically referred to as coverage and are a measure of completeness
with respect to test selection criteria. Common structural coverage metrics include:
Statement Coverage
Decision (Branch) Coverage
Condition Coverage
Multi-Condition Coverage
Loop Coverage
Path Coverage
Data Flow Coverage
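The difference between the first two metrics above can be seen on a one-branch function; the function below is an invented example:

```python
# Invented example: clamp negative values to zero.
def clamp(x: float) -> float:
    if x < 0:
        x = 0.0
    return x

# The single test clamp(-1.0) executes every statement (100% statement
# coverage) but exercises only the True branch of the `if`; decision
# (branch) coverage additionally requires a test where x >= 0.
assert clamp(-1.0) == 0.0   # True branch
assert clamp(2.0) == 2.0    # False branch
```

The stronger metrics in the list (condition, multi-condition, path coverage) extend the same idea to compound boolean expressions and to combinations of branches.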
Regression testing challenges the system's ability to still do what it should after
being modified according to specified requirements, and verifies that portions of
the software not involved in the change were not adversely affected.
Acceptance Testing
testing, acceptance. (IEEE) Testing conducted to determine whether or not
a system satisfies its acceptance criteria and to enable the customer to
determine whether or not to accept the system.
Acceptance may be carried out in two stages. Factory Acceptance and Site
Acceptance.
- Factory Acceptance Tests (sometimes abbreviated to FAT) are performed at the
supplier site before delivery, to show that the system is working well enough to be
installed and tested on-site.
- Site Acceptance Tests (sometimes abbreviated to SAT, and sometimes called
System Acceptance Testing) show that the system is working in its operational
environment and that it interfaces correctly with other systems and peripherals.
This approach is often used for automated equipment and process control systems.
Other Validation
Activities
Validation Report
The Validation Report is the gate between developing
the system and moving the system into the Production
(maintenance) phase. The Validation Report must be
completed in order for the system to be in Production.
Production Support
(Maintenance phase)
These are the CSV activities performed once a system is in
production. All CSV processes must be defined during the
system development phase so that a clear process is in
place and related roles are resourced. Upon system
production, these processes are then executed as needed.
The following are the post-production CSV deliverables that
will be presented:
Business Continuity Plan (BCP) and Disaster Recovery
Change Control
Business Procedures and Training
Periodic Review
System Retirement
Configuration management and change management are closely related. When changes are
proposed, both activities need to be considered in parallel, particularly when evaluating impact
of changes.
Both change and configuration management processes should be applied to the full system
scope including hardware and software components and to associated documentation and
records, particularly those with GxP impact.
What to do if my software/system is no longer needed?
(System Retirement)
The primary purpose of System Retirement activities is to ensure that the
data residing in the system are retained, secure, and readily retrievable
throughout the record retention period. The data may need to be retrieved,
for an inspection or some other reason, years after the system is
retired. This possibility underscores the need for both sound retirement
processes and related documentation.
Inadequate system retirement processes can result in data not being
available or readable. Depending on the situation, this can be a significant
issue, especially if this data is requested by a regulator or if product safety is
in question.
Common challenges related to the system retirement process:
Determination of record retention for data, software, and
documentation
Data migration and archival
Clear ownership for continued stewardship of the data after the system
is retired
Statistical packages
Spreadsheets
Network monitoring tools
Scheduling tools
Version control tools
GAMP Category 3: Non-Configured software
Defining characteristics: Run-time parameters may be entered and stored, but the
software cannot be configured to suit the business process.
Typical examples: Firmware-based applications; COTS software; Instruments (see the
GAMP Good Practice Guide: Validation of Laboratory Computerized Systems for
further guidance).
Typical approach: Abbreviated life cycle approach; URS; Risk-based approach to
supplier assessment; Record version number, verify correct installation; Risk-based
tests against requirements as dictated by use (for simple systems regular calibration
may substitute for testing).
Case A:
The system is considered a low complexity system and is based on mature
technology. The system/software has been commercially available (for the last ten
years) and its fitness for use has been demonstrated by a broad spectrum of
commercial users.
A supplier assessment was performed and it was determined that the supplier has
adequate Quality Management Systems (QMS) and that the basic functionality of
the system has been adequately tested.
Case B:
The system is considered a low complexity system but it is based on new technology.
The system/software has been commercially available for the last year.
The supplier refused to share their internal testing documentation and little information
about vendor quality practices was obtained.
Example (continued)
Results of Risk Assessments
The following sample functions take into consideration only software-related
hazards.
Function | Specified Hazard | Consequence (Harm) | Impact (severity) | Probability | Detectability(3) | Risk Priority
Alarm Management | Alarm fails to activate (low temperature) | Product not sterilized | High Impact | Case A: Low(1); Case B: High(2) | Case A: High; Case B: Low | Case A: Low; Case B: High
Alarm Management | Alarm fails to activate (high temperature) | Product is damaged | High Impact | Case A: Low(1); Case B: High(2) | Case A: High; Case B: Low | Case A: Low; Case B: High
Temperature control | Control failure (low temperature) | Product not sterilized | High Impact | Case A: Low(1); Case B: High(2) | Case A: High; Case B: Low | Case A: Low; Case B: High
Temperature control | Control failure (high temperature) | Product is damaged | High Impact | Case A: Low(1); Case B: High(2) | Case A: High; Case B: Low | Case A: Low; Case B: High
1: For Case A, a low probability of occurrence is considered
based on the system's low complexity, mature technology,
the product having been on the market for 10 years, and the
basic functionality of the system having been adequately
tested by the supplier.
2: For Case B, considerations are based on the system's low
complexity and new technology. In addition, the experience
with the product/software is limited (i.e., only one year on
the market) and the likelihood of the regulated company
uncovering a significant issue with the software for the
first time is high. The supplier refused to share their
internal testing documentation and little information about
vendor quality practices was obtained. A worst-case
scenario is being assumed.
3: Detectability is based on reliance on the system-generated
report.
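The Case A/Case B pattern in the table can be sketched as a two-step lookup in the spirit of GAMP 5: severity combined with probability gives a risk class, and the risk class combined with detectability gives the priority. The mapping functions below are a simplified illustration, not the official GAMP tables:

```python
# Simplified sketch of a two-step risk priority lookup (illustrative
# values, not the official GAMP 5 tables).
def risk_class(severity: str, probability: str) -> str:
    """Severity x probability -> risk class."""
    if severity == "High" and probability == "High":
        return "High"
    if severity == "Low" and probability == "Low":
        return "Low"
    return "Medium"

def risk_priority(risk_cls: str, detectability: str) -> str:
    """Risk class x detectability -> priority.
    High detectability lowers the priority; low detectability raises it."""
    order = ["Low", "Medium", "High"]
    shift = {"High": -1, "Medium": 0, "Low": +1}[detectability]
    idx = min(max(order.index(risk_cls) + shift, 0), len(order) - 1)
    return order[idx]

# Case A: High severity, Low probability, High detectability -> Low priority
print(risk_priority(risk_class("High", "Low"), "High"))
# Case B: High severity, High probability, Low detectability -> High priority
print(risk_priority(risk_class("High", "High"), "Low"))
```

With these mappings the four table rows reproduce the Case A: Low and Case B: High priorities shown above.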
Example (continued)
Recommended Testing Activities
(based on GAMP5 Categories and Risk Priority)
Typical testing activities for a Category 4 (Configured Product) should focus on:
Questions?
Interactive Exercise
Bonus Material
Example of a Test Plan
List of Relevant Regulations