
Implementing 21 CFR Part 11 in analytical laboratories: Part 5, the importance of instrument control and data acquisition

The time for compliance with 21 CFR Part 11 is now. Bringing different laboratory instruments into compliance takes planning. The key strengths and weaknesses of different levels of control and feedback for analytical instruments and data transfer systems are highlighted in this fifth installment of our series.

The first four parts of this series gave an overview of the requirements of the FDA rule (21 CFR Part 11) on electronic signatures and records (1). We focused on data security, data integrity, long-term archiving, and ready retrieval of data (2-5). We demonstrated how access to the system and critical functions could be limited to authorized personnel. We also demonstrated how the integrity of data can be assured at the time of data analysis and evaluation and how creation, modification, and deletion of records are logged in a computer-generated audit trail. And we showed the best method for archiving data and accurately retrieving it after several years.

In those first four articles, we focused on compliance of data generated by the system. Frequently, the question comes up whether computers that just control analytical instruments - those that do not acquire data - must comply with Part 11. The answer is simply, "Yes, if FDA has ever looked at or asked for paper printouts of the parameters." Without proper documentation of the instrument control parameters, it is difficult to prove that a given result was generated according to the appropriate procedure or protocol. If a computer was used in the procedure, and if the control parameters are stored on a durable storage device (typically the computer's hard disk or a storage card for the instrument itself), then Part 11 applies.

Levels of Instrument Control

Analytical laboratories typically operate with a diverse base of instruments, often from a variety of manufacturers for historical or strategic reasons. Because most modern instruments are computer controlled, the instrument control parameters have to be treated as the rest of the metadata are treated. (See the "Learning 21 CFR Part 11 Language" box for words in italics.)

Level 1. Instrument control can be implemented at differing levels of sophistication and complexity (Table 1). Often, instrument parameters are set manually using the instrument's own panel and keyboard, with the signal recorded by an analog-digital converter (level 1). This is frequently the approach chosen to integrate an instrument into a system from a different manufacturer. In such cases, it is often impossible to obtain a printout of the instrument set points used during an analysis. Analysts are forced to document instrument parameters manually. Furthermore, analog-digital converters do not always support binary coded decimal (BCD) or bar code input from an autosampler, which could be used to positively correlate an injection with a given sample using the sample name or vial number. In agreement with other authors, we view BCD communication with an autosampler as essential to ensuring sample continuity (6).

Level 2. Many systems implement a rudimentary level of instrument control obtained through reverse engineering: recreating the design of the communication protocols for another manufacturer's instrument by analyzing the final product (level 2). This method supports the basic parameters of an instrument (such as solvent composition, flow, oven temperature, or detector wavelength). If the control codes are not officially disclosed by the particular instrument manufacturer, it may be more difficult to obtain an officially supported solution, and additional effort should be planned for performing qualification and other validation tasks on such a system. Because the manufacturer of the original instrument may be neither aware of nor responsible for the implementation of the communication, instrument firmware updates may result in nonfunctional communication with the data system. Error handling and logging are typically weak at this level. When selecting a system that will control instruments from other manufacturers, it is therefore important to verify that the control codes were officially obtained from the manufacturer of the instrument and not from reverse engineering.

Level 3. In most cases, manufacturers achieve full instrument control for their own systems (level 3). That makes it easier to create a complete set of raw and metadata and the proper documentation. At this level, the error reporting and handling are quite sophisticated, which makes it easier to verify that analyses were completed without technical failures and to diagnose errors when they occur.

Going one better. Some manufacturers have implemented an additional level of instrument capability that can be controlled from within the data system. At this level, the data system controls functions that execute detailed and sophisticated instrument diagnostics and other service functions. Also under this instrument control are provisions for preventive maintenance and early maintenance feedback (EMF), a technique first used in the aeronautics industry (to alert technical personnel to perform maintenance jobs proactively) and subsequently implemented by companies such as Agilent Technologies (Palo Alto, CA).

Systems implemented at this level provide sophisticated support for tracking instrument or module serial numbers and firmware revisions. Such information is handy for inventory tracking, and it also supports some of the function checks required by Part 11.
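The sketch below shows one way such configuration records could be captured in software alongside each run. It is illustrative only, written in Python; the class and field names are assumptions, not any vendor's API.

```python
# Illustrative only: record module serial numbers and firmware revisions with a run.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ModuleRecord:
    name: str                 # e.g. "pump" or "detector"
    serial_number: str
    firmware_revision: str


@dataclass
class RunConfiguration:
    run_id: str
    modules: list[ModuleRecord]
    # Timestamp the snapshot so it can be tied to the run's audit trail.
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


run = RunConfiguration(
    run_id="RUN-0001",
    modules=[
        ModuleRecord("pump", "DE00000001", "A.01.00"),
        ModuleRecord("detector", "DE00000002", "A.01.00"),
    ],
)
print(run)
```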

Level 4. In level 4 instrument control, all communications (including commands and data transfer) are performed using a handshake. A handshake requires the recipient of a data record to actively acknowledge to the sender that the record has been received. For example, the controller sends a command like "START" to the device; the device interprets the command and acknowledges "OK, START." If the device is unable to execute the command, it sends a negative receipt like "NOT OK, NO START." This approach prevents situations in which the controller thinks it sent the instructions to the device, but the device never executes them.
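A minimal Python sketch of this acknowledgment pattern follows; the Device class and command strings mirror the example above but are otherwise hypothetical.

```python
# Illustrative sketch of a handshake: every command must be positively acknowledged.

class HandshakeError(Exception):
    """Raised when a device does not positively acknowledge a command."""


class Device:
    """Stand-in for an instrument that returns an explicit receipt for each command."""

    def __init__(self, ready: bool = True):
        self.ready = ready

    def receive(self, command: str) -> str:
        if command == "START" and self.ready:
            return "OK, START"
        return "NOT OK, NO START"


def send_command(device: Device, command: str) -> None:
    """Send a command and insist on a positive receipt before continuing."""
    receipt = device.receive(command)
    if not receipt.startswith("OK"):
        # Without this check the controller would assume the run had started
        # even though the device never executed the instruction.
        raise HandshakeError(f"Device rejected '{command}': {receipt}")


send_command(Device(ready=True), "START")     # succeeds silently
# send_command(Device(ready=False), "START")  # would raise HandshakeError
```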

Protocols for Data Integrity

Understanding the strengths and weaknesses of some widely used instrument communication protocols will help ensure the data integrity and traceability required by 21 CFR Part 11. One example can be drawn from the legacy world: the general-purpose interface bus (GPIB), the well known and widely used "IEEE 488" standard. Contrast the GPIB with state-of-the-art networking protocols like the well known and ubiquitous TCP/IP protocol used for intranet or Internet communication. (Tables 2 and 3 provide a detailed list of the strengths and weaknesses of these two communication systems and recommendations for avoiding some weaknesses.) We are not providing a detailed technical description of the technology itself; many publications cover those aspects accurately and in technical detail (7-9).

Instrument Communication Using GPIB

GPIB is a parallel communication interface that can connect up to 15 devices on a common bus. All communications using GPIB, including commands and data, use a hardware handshake for every byte. All devices connected to the bus participate in that handshake. As a consequence, every device on the bus can influence the ongoing communication or cause severe communication problems like bus "hangups" or data corruption. The cause can be a firmware error or a hardware failure in one of the participating devices (such as the printer), but powering a seemingly "idle" GPIB device on or off during ongoing communication also can cause such problems. Even though the electrical specifications of GPIB do not prohibit the actions that lead to those scenarios, the combination of chip-set implementation, firmware, and application software often leads to that failure.

LAN Communication Using TCP/IP

Local area network (LAN) communication using the transmission control protocol/Internet protocol (TCP/IP), often somewhat casually described as the "language of the Internet," enables devices to exchange information over a network. The central idea of TCP/IP is the breaking of information into pieces, or packets. The packets are specifically structured to allow error detection and correction by using redundancy mechanisms like checksums. These redundancy mechanisms are the major difference between more advanced systems and the majority of GPIB implementations.

Checksums are, in principle, a running total of all transmitted bytes attached to the packet; the recipient back-calculates the total and compares it with the original checksum provided by the sender. If a mismatch is detected, a retransmission is requested. This technique ensures error-free data transport and is excellent for implementing the "device checks" and "system checks" mandated by 21 CFR Part 11. Communication in a TCP/IP environment is, by definition, highly dynamic: ongoing communication between other participants is unaffected by the addition or removal of "idle" devices in the network. In contrast to most GPIB implementations, TCP/IP supports (without risking data loss) the safety procedures of analytical laboratories that require turning off instruments not currently in use.
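A minimal sketch of the checksum idea, in Python, is shown below. It uses a simple additive checksum for illustration (the actual TCP/IP checksum is a 16-bit one's-complement sum), and the packet structure is hypothetical; the point is the back-calculation, comparison, and retransmission request.

```python
# Illustrative sketch of checksum-verified packet transfer.

def checksum(payload: bytes) -> int:
    # Running total of all transmitted bytes, truncated to 16 bits.
    return sum(payload) & 0xFFFF


def make_packet(payload: bytes) -> dict:
    # The sender attaches the checksum to the packet.
    return {"payload": payload, "checksum": checksum(payload)}


def receive_packet(packet: dict) -> bytes:
    # The recipient back-calculates and compares; a mismatch means the data
    # were corrupted in transit and a retransmission must be requested.
    if checksum(packet["payload"]) != packet["checksum"]:
        raise ValueError("checksum mismatch - request retransmission")
    return packet["payload"]


packet = make_packet(b"injection 42, vial 7")
assert receive_packet(packet) == b"injection 42, vial 7"

corrupted = dict(packet, payload=b"injection 42, vial 8")  # one byte altered in transit
try:
    receive_packet(corrupted)
except ValueError as err:
    print(err)  # checksum mismatch - request retransmission
```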

Selecting a Good System

When selecting and setting up instrument control and data acquisition systems, the following recommendations should help.

1. Assess the level of instrument control currently used in your laboratory (level 1, 2, 3, or 4).

2. Find out what levels of control are available from the different manufacturers of that hardware.

3. Write a protocol documenting the instrument parameters for instruments not directly controlled by the data system (level 1).

4. For instruments that claim to be controlled, determine whether the communication protocols were obtained with the approval and support of the instrument manufacturer.

5. For instruments whose communication protocols were apparently developed by reverse engineering, plan additional qualification and acceptance tests to obtain a high degree of certainty that control and communications are accurate and reliable.

6. Adapt your company's internal procedures to take advantage of additional diagnostics, maintenance, and tracking functions, and use the measurements obtained from those functions to validate, maintain, and document the system.

7. Define test cases for boundary conditions: Does the system reliably synchronize all the devices required for an analysis? Could a contact closure problem allow it to go unnoticed that a device, such as a detector, did not start? If the instrument has a local user interface, does the system track parameter changes made from the local interface while data are being acquired from the computer? Alternatively, is the local interface "locked" while data are acquired? Does the system quickly detect power failures of a connected device, or are data lost until a "time out" occurs?

These are the kinds of questions to ask when bringing your laboratory instrument control and data acquisition systems into compliance with 21 CFR Part 11.
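As an illustration of how such a boundary condition can be turned into an executable test, the Python sketch below checks that a device that never confirms its start trigger is reported as an error rather than silently ignored; the AcquisitionSystem class and device names are hypothetical.

```python
# Illustrative boundary-condition test: a device that never starts must be detected.

class DeviceNotStartedError(Exception):
    pass


class AcquisitionSystem:
    def __init__(self, devices_that_fail_to_start=()):
        self.failing = set(devices_that_fail_to_start)

    def start_run(self, devices):
        # Each device must confirm its start trigger; a missing confirmation
        # (e.g. a contact-closure problem on the detector) aborts the run.
        for device in devices:
            if device in self.failing:
                raise DeviceNotStartedError(f"{device} did not confirm start")
        return "run started"


def test_detector_that_never_starts_is_reported():
    system = AcquisitionSystem(devices_that_fail_to_start={"detector"})
    try:
        system.start_run(["pump", "autosampler", "detector"])
    except DeviceNotStartedError:
        return  # expected: the failure is reported, not silently ignored
    raise AssertionError("missing detector start was not detected")


test_detector_that_never_starts_is_reported()
```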

Looking Ahead

In the next article, we will discuss biometric devices for system access and electronic signatures. In computer security, biometrics refers to authentication techniques that rely on measurable physical characteristics that can be automatically checked, such as fingerprints, retinal and iris patterns, voice patterns, facial patterns, and hand measurements.

[Sidebar]

Learning 21 CFR Part 11 Language

Bus is a collection of wires through which data travel within a computer. In this context, bus means an interface and communication system for peripheral devices (such as connections, cables, and the communication protocol).

Byte is an abbreviation for binary term. It is a storage unit capable of holding eight bits, the space required for a single character such as a letter or number.

Checksums are a running total of all transmitted bytes that is attached to a packet and used by the message recipient to back-calculate and compare with the original checksum provided by the message sender. If a mismatch is detected, a retransmission is requested. This technique facilitates detection of data transport errors.

Chip-sets are a number of integrated circuits designed to perform one or more related functions. For instance, the integrated circuit components of a specific GPIB interface card.

Device is any machine or component that attaches to a computer, such as disk drives, printers, mice, and modems. Those particular devices fall into the category of peripheral devices because they are separate from the main computer. Display monitors and keyboards are also devices, but because they are integral parts of the computer they are not considered peripheral. Most devices, whether peripheral or not, require a program called a device driver that acts as a translator, converting general commands from an application into specific commands that the device understands.

Durable storage device is typically the computer's hard disk or a storage card for a particular instrument.

Early maintenance feedback (EMF) is a technique that automatically alerts technical personnel to perform maintenance jobs proactively.

Firmware is a combination of hardware and software written in read-only memory.

Handshake requires the recipient of a data record to actively acknowledge to the sender that the record has been received.

IEEE is the Institute of Electrical and Electronics Engineers, which develops standards for computers and the electronics industry.

Legacy systems are hardware and software applications in which a company has already invested considerable time and money. Legacy systems typically perform critical operations in companies for many years even though they may no longer use state-of-the-art technology. Replacing legacy systems can be disruptive and therefore requires careful planning and appropriate migration support from the manufacturer.

Local-area networks (LANs) are networks with computers geographically close together (that is, in the same building), and wide-area networks (WANs) have computers farther apart and connected by telephone lines or radio waves.

Metadata is the complete set of data associated with a measurement, including processing parameters and audit trail logs.

Network: A group of two or more computer systems linked together.

Packet: A piece of a transmitted message that contains both the data and the destination address. In TCP/IP networking, packets are called datagrams. When you send an email message, the message can be broken into several packets; each packet can be transmitted separately and may travel a different route, and all the packets are put back together at the recipient's site.

Reverse engineering: Recreating the design of hardware or software by analyzing the final product and working backward.

TCP/IP: Transmission control protocol/Internet protocol, which enables devices to exchange information over a network.

References

(1) Office of Regulatory Compliance, Code of Federal Regulations, Food and Drugs: Electronic Records; Electronic Signatures, Title 21, Part 11 (U.S. Government Printing Office, Washington, DC), issued March 2000. Available at www.fda.gov/ora/compliance_ref/part11.

(2) L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 1, Overview and Requirements," BioPharm 12(11), 28-34 (1999).

(3) W. Winter and L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 2, Security Aspects for Systems and Applications," BioPharm 13(1), 44-50 (2000).

(4) W. Winter and L. Huber, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 3, Ensuring Data Integrity in Electronic Records," BioPharm 13(3), 45-49 (2000).

(5) L. Huber and W. Winter, "Implementing 21 CFR Part 11 in Analytical Laboratories: Part 4, Data Migration and Long-Term Archiving for Ready Retrieval," BioPharm 13(6), 584 (2000).

(6) R. D. McDowall, "Chromatography Data Systems: Part 1, The Fundamentals," LCGC North America 18(1), 56-67 (2000).

(7) W. Winter, "Dynamic Interprocess Communication between a Spectrophotometer and a Spreadsheet," diploma thesis and presentation for faculty for physical electronics, University of Karlsruhe (31 July 1989).

(8) M. F. Arnett et al., "Understanding Basic Network Concepts," Inside TCP/IP (New Riders Publishing, Indianapolis, 1994) pp. 51-54.

(9) ANSI/IEEE Std. 488.1-1987, Standard Digital Interface for Programmable Instrumentation (The Institute of Electrical and Electronics Engineers, New York, 1987).

[Author Affiliation]

Wolfgang Winter is worldwide product manager for networked data systems, and corresponding author Ludwig Huber is worldwide product marketing manager, HPLC, at Agilent Technologies GmbH, PO Box 1280, D-76337 Waldbronn, Germany, +49 7243 602 209, fax +49 7243 602 501, ludwig_huber@agilent.com, www.agilent.com.
