This section briefly discusses various types of validation done in the TS and reinforces the concepts introduced in Part 1 (terminology). The types are not meant to be all-inclusive.
Reminder: Don't get too caught up in the terminology. Regardless of the terms used, all in-house validation involves proving and documenting that something consistently works the way it is supposed to work under the conditions present in a given TS.
Some user acceptance or validation approaches consist of preparing a "mirror copy" of the final draft of the SOP, with columns beside the steps in the procedure where technologists can sign off that the step in question works. Indeed, more than one technologist should "test" or "validate" the procedure.
SOP Improvement Form: a tool for collecting input. This form can be used to collect feedback. We use a "user acceptance phase" in which we send out procedures/SOPs to each site for feedback. Draft SOPs are distributed to each site with an "SOP Test Phase Improvement Form" (refer to page 100). Alternatively, the SOP itself can be modified to allow for comments or initials.
As discussed earlier, process validation is part of process control and provides evidence that a process consistently produces a result meeting predetermined specifications. Process validation is the umbrella under which all validations fall, the mother of all validations, as it were.
For example, when we validate a new instrument for testing ABO, Rh, and antibody screening or validate a new container for shipping donor blood, we are validating parts of a process, the process for pretransfusion testing and the process for transporting blood products to remote locations, respectively.
For further discussion see Quinley's article, Process validation: Will it ever be "no big deal"?2
Validating equipment provides assurance that equipment will consistently operate as intended. If the equipment is an automated or semi-automated instrument, the process may be referred to as qualification. The instrument will undergo
- Installation qualification
- Possibly an operational qualification equivalent to a full method evaluation, especially if it is a relatively new methodology with only a few published external validations
- Performance qualification to test its routine performance with a typical workflow in the user's laboratory
Validation of a laboratory information system (LIS) is a special type of validation with its own guidelines.3-6 Software validation provides objective evidence that all software specifications conform to user needs and intended uses. The complexity of LIS validations, and the issue of who should be responsible for them, are discussed in a recent thread in the CBBS e-Network forum.7
Some of the processes involved in LIS validation include:
- Computer system validation requirements (risk analysis)
- System design requirements and specifications (user and functional)
- System validation master planning
- Policies and SOPs
- Business continuity planning
- Backup and recovery
- Documenting and testing the system (test plans, test protocols, scripts, execution, and reports)
- Computer validation training
- Validation reports and acceptance
- System implementation and project management
- Archive and record retention
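To make "documenting and testing the system" concrete, a test script in an LIS validation can be as simple as a set of test cases pairing defined inputs with the results the system specification requires. The sketch below is purely illustrative: the function, test IDs, and cases are hypothetical, and a real validation would query the actual LIS rather than reimplement its rules.

```python
# Minimal sketch of an LIS validation test script: each case pairs an input
# with the result the specification says the system must produce.
# check_abo_interpretation() is a hypothetical stand-in for the system under test.

def check_abo_interpretation(anti_a, anti_b, a1_cells, b_cells):
    """Toy stand-in for the LIS: interpret ABO forward/reverse typing
    from reactivity (True = reactive) with the listed reagents/cells."""
    forward = {(True, False): "A", (False, True): "B",
               (True, True): "AB", (False, False): "O"}[(anti_a, anti_b)]
    reverse = {(False, True): "A", (True, False): "B",
               (False, False): "AB", (True, True): "O"}[(a1_cells, b_cells)]
    return forward if forward == reverse else "discrepant"

# Test protocol: (test id, inputs, expected result)
test_cases = [
    ("TC-01", (True, False, False, True), "A"),
    ("TC-02", (False, True, True, False), "B"),
    ("TC-03", (False, False, True, True), "O"),
    ("TC-04", (True, False, True, True), "discrepant"),  # forward A, reverse O
]

for test_id, inputs, expected in test_cases:
    observed = check_abo_interpretation(*inputs)
    status = "PASS" if observed == expected else "FAIL"
    print(f"{test_id}: expected={expected} observed={observed} {status}")
```

The point of the exercise is the documented pairing of expected and observed results, with pass/fail status, which then feeds directly into the validation report and acceptance steps above.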
Steps in the validation process, showing user and supplier responsibilities, are illustrated in a figure in the ISBT guidelines.3
Readers are encouraged to read the ISBT guidelines3 for an overview of LIS validation.
Method validation is the process of proving that an analytical method is acceptable for its intended purpose. Most literature on method validation relates to quantitative testing, especially in the chemistry laboratory.8-9
FDA industry guidance for bioanalytical methods specifies:
The fundamental parameters for this validation include (1) accuracy, (2) precision, (3) selectivity, (4) sensitivity, (5) reproducibility, and (6) stability. Validation involves documenting, through the use of specific laboratory investigations, that the performance characteristics of the method are suitable and reliable for the intended analytical applications. The acceptability of analytical data corresponds directly to the criteria used to validate the method.10
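Two of these parameters, accuracy and precision, are commonly expressed as percent bias from an assigned target and percent coefficient of variation (CV) of replicate measurements. A minimal sketch of the arithmetic, using invented replicate values:

```python
# Accuracy (% bias from a known target) and precision (% CV of replicates)
# for a quantitative method. The target and replicate values are invented
# for illustration only.
from statistics import mean, stdev

target = 100.0                                  # assigned value of the reference material
replicates = [98.5, 101.2, 99.8, 100.4, 99.1]   # hypothetical replicate results

m = mean(replicates)
bias_pct = (m - target) / target * 100          # accuracy, as % bias
cv_pct = stdev(replicates) / m * 100            # precision, as % CV (sample SD)

print(f"mean={m:.2f}  bias={bias_pct:+.2f}%  CV={cv_pct:.2f}%")
```

In practice the laboratory sets acceptance limits for bias and CV in the validation plan before running the replicates, so that "suitable and reliable" is a documented, predetermined judgment.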
For qualitative methods used in the TS, i.e., methods that usually have only two possible results (positive and negative), typical validation parameters such as precision do not apply. Nonetheless, because of the lack of literature on validating qualitative methods, some TS attempt to adapt concepts used in quantitative validations to blood bank serologic methods.
For validating qualitative methods, sensitivity, specificity, and predictive value can be assessed. Often these statistics are used in comparison studies where a new method is compared to the existing one. Predictive value theory has been used in the blood bank serologic literature11-13 and is a key component of assessing the usefulness of a diagnostic test.14,15
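These statistics all come from a simple 2x2 table cross-classifying the new method's results against the existing (reference) method. A minimal sketch of the calculations, with invented counts:

```python
# Sensitivity, specificity, and predictive values from a 2x2 comparison of a
# new qualitative method against the existing (reference) method.
# The counts below are invented for illustration.

tp, fp, fn, tn = 48, 2, 3, 147   # true pos, false pos, false neg, true neg

sensitivity = tp / (tp + fn)     # proportion of reference-positives detected
specificity = tn / (tn + fp)     # proportion of reference-negatives detected
ppv = tp / (tp + fp)             # probability a positive result is truly positive
npv = tn / (tn + fn)             # probability a negative result is truly negative

print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
      f"PPV={ppv:.3f} NPV={npv:.3f}")
```

Note that, unlike sensitivity and specificity, the predictive values depend on the prevalence of positives in the comparison sample, so they should be interpreted against the population the method will actually be used on.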
As noted, sometimes method validations are called method evaluations,4 especially when they assess a new method's performance in detail. Alternatively, some laboratories may refer to full validations (as opposed to limited or partial validations) to indicate how extensively a method is assessed.
As with equipment, if the method is performed by an automated or semi-automated instrument, the validation process may be referred to as installation qualification, operational qualification, and performance qualification.
1. Do new methods always have to be validated extensively using worst-case conditions?
2. Which statistical tools are well suited for validating qualitative methods?
3. Do the ISBT guidelines suggest who should be responsible for the operational and performance qualifications of an LIS?
- Part 1: Introduction to terminology
- Part 2: Types of validation (you are here)
- Part 3: Determining when and how extensively to validate a method
- Part 4: Regulations and standards
- Part 5: Validation examples (tools and resources)
1. Hamilton-Niagara Quality Essentials for Safe Transfusion (QUEST). Sharing our Strategies (SOS) Manual.
4. BCSH. Guidelines for blood bank computing (2000)
5. FDA. General principles of software validation (Jan. 2002)
7. CBBS e-Network forum. Who should validate computer software used in blood banks/transfusion services?
8. Green JM. A practical guide to analytical method validation. Anal Chem 1996;68:305A-309A.
9. Westgard JO. Basic method validation, ed 2.
10. FDA. Guidance for industry. Bioanalytical method validation. May, 2001.
11. Judd WJ, Barnes BA, Steiner EA, Oberman HA, Averill DB, Butch SH. The evaluation of a positive direct antiglobulin test (autocontrol) in pretransfusion testing revisited. Transfusion 1986 May-Jun;26(3):220-4.
12. Judd WJ, Steiner EA, Oberman HA, Nance SJ. Can the reading for serologic reactivity following 37 degrees C incubation be omitted? Transfusion 1992 May;32(4):304-8.
13. Meyer EA, Shulman IA. The sensitivity and specificity of the immediate-spin crossmatch. Transfusion 1989 Feb;29(2):99-102.
14. Jaeschke R, Guyatt G, Sackett DL. Users' guides to the medical literature. III. How to use an article about a diagnostic test. A. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA 1994 Feb 2;271(5):389-91.
15. Centre for Evidence Based Medicine. Critical appraisal sheet for diagnosis.