FDA surprises the pharmaceutical industry with its new interpretation of the Electronic Records, Electronic Signatures rule.
On 20 February 2003, the FDA issued a new draft guidance document on 21 CFR 11 that surprised many in the industry. The draft guidance spells out a narrower scope for the interpretation of 21 CFR 11, the Electronic Records, Electronic Signatures rule, with an emphasis on a risk-based approach to 21 CFR 11 activities. To appreciate the implications of this change, it is worthwhile to explore some of the original basis of 21 CFR 11: good clinical practice.
In 1974, during my sophomore year in college, I found my scientific idealism shaken at its roots when the Patchwork Mouse scandal exploded onto the national science scene (see box). Unfortunately, this was not an isolated incident in medical research. The ensuing decades have provided many instances of scientific misconduct involving the manipulation and fabrication of data. Some took place at the laboratory bench; others were closer to home, in the clinical trial arena. Some occurred during the past few years and involved clinical trials for the development of pharmaceutical products.
The FDA and other regulatory organizations have long recognized the high stakes of pharmaceutical development and the potential for intentional fraud in clinical investigations. Indeed, regulators have significant experience in uncovering and dealing with fraudulent data in clinical trials. Behind their concern is the very real possibility that fraudulent data could lead to unwarranted drug approvals, exposing hundreds of thousands of patients to a drug that may be more toxic or less effective than the approval data suggested. The FDA also has experience with another data issue: poor-quality data resulting from careless data collection and management.
The principles of good clinical practice (GCP), originally codified in the Federal Register and expanded by FDA guidance, industry best practices, and company policies, were created to address many issues of patient safety, explicitly including the mitigation of fraudulent data collection. Certainly, GCP addresses many other issues that may or may not be related to fraud, including informed consent, institutional review board oversight of clinical studies, and reporting requirements, to name a few. The issues most relevant to discussions of 21 CFR 11, however, are those related to data integrity and fraud.
GCP includes a number of requirements that share characteristics with the concept of chain of custody for criminal evidence. When a crime is being investigated, the investigating team must establish a chain of custody for all evidence. A piece of evidence is typically tagged in some nonalterable fashion and handed from person to person along with paperwork and sign-offs that document its transfer. Only with this complete chain of custody can a jury be sure that the evidence was not tampered with or exchanged. Interestingly, the chain of custody is often written about as the paper trail that documents the process, people, and timing of the events involved in the management of an investigation. In fact, the chain of custody is the entire procedure including, but not limited to, the documentation.
The aspects of GCP related to subject data, as it is typically practiced, closely resemble the chain of custody concept. Those aspects are designed to assure the FDA that the data being examined reflect the actual data obtained by the investigator, and that adequate controls are in place to prevent or identify tampering with data by an investigator or sponsor. In parallel with descriptions of chain of custody, many talk about the GCP of subject data as it relates to the documentation rather than the actual process and intent.
For data captured in physical form, on paper, conventional means can be used to investigate any irregularities. To begin with, the data are often signed and dated by the person making the entry. The ink used, the handwriting, and various other factors involved in documentation can be compared with eyewitness observations to establish the veracity of written data. If necessary, ink can be checked for its date of production, and other sophisticated tools can be, and often are, brought to bear. However, the process of data collection on paper does not itself create an automated record of the time and identity of the person who created the data. Properly constructed computer systems and applications will do this, and much more.
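To make this concrete, here is a minimal sketch, in Python, of what such automated attribution might look like. The record structure and field names are purely illustrative, not drawn from any particular clinical system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: the record cannot be altered after creation
class DataEntry:
    """Illustrative record capturing who entered a value, and when."""
    field_name: str
    value: str
    entered_by: str  # authenticated user identity, supplied by the system
    entered_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )  # timestamp applied by the system, not typed in by the user

# The system, not the person making the entry, supplies identity and time.
entry = DataEntry(field_name="systolic_bp", value="128", entered_by="jsmith")
print(entry.entered_by, entry.entered_at.isoformat())
```

The point is that attribution comes from the system itself, the electronic analogue of ink and handwriting.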
The application of chain of custody GCP principles to the electronic world requires consideration. Data in the electronic world consist of a series of ones and zeros recorded on a variety of electronic, magnetic, or optical media. By definition, my ones and zeros are exactly the same as any of your ones and zeros. Therefore, the data don't have the characteristics of ink and handwriting that can link them to an individual. Furthermore, the concept of a signature attributable to a specific individual must be entirely redefined in the electronic world. Anyone who has access to the computer system holding the data can change some of my ones into zeros and vice versa. Finally, the computer itself might accidentally swap some of my ones and zeros, corrupting, damaging, or even deleting my data. It is these very issues that 21 CFR 11 was designed to address. In fact, a widely quoted article (www.businessweek.com/archives/1998/b3574124.arc.htm) brought this very issue to the attention of the public:
The Food and Drug Administration reports that some pharmaceutical companies are discovering errors as they copy drug-testing data that back up claims of long-term product safety and effectiveness. In several recent cases involving data transfers from Unix computers to systems running Microsoft's Windows NT operating system, blood-pressure numbers were randomly off by up to eight digits from those in original records, FDA and company data specialists report [1].
This important observation should give pause to us all; it is a strong vote for the need to validate any data handling process. Those of us who have ever written a search in Google, a simple rule in Outlook, a macro in Excel, or an SQL query of a database know that they don't always work as intended. Each of these computer entries has some similarity to writing software. We have an idea of what we want the rule or query to do. We do the best we can, test, and then rewrite and/or debug.
For a junk mail rule in Outlook, for example, we may check and test it in different ways over the following days to make sure it is not disposing of our important mail. The testing we have done is a very crude form of validation. Any software being used for handling clinical trial data (and, for that matter, any important data) must be validated to ensure that it will reproducibly perform as specified across the entire gamut of possible inputs. Finally, careless or sloppy data management practices can lead to loss of electronic data as surely as these same practices can lead to loss of paper.
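As an illustration of the kind of check that catches the silent transfer corruption described above, here is a minimal sketch that compares cryptographic checksums of a file before and after a copy. The file paths are hypothetical, and a real validation effort would of course go much further:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the original on the source system and the transferred copy.
original = sha256_of("/data/source/bp_readings.csv")
copied = sha256_of("/data/transferred/bp_readings.csv")

if original != copied:
    raise RuntimeError("Transfer corrupted the data: checksums differ")
```

A check this simple would have flagged the Unix-to-NT discrepancies before anyone relied on the copied numbers.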
21 CFR 11 was designed to address these and other issues in electronic records. Although electronic records can be far more organized and searchable than paper records, there was definite concern at the FDA that the integrity of electronic records was at risk unless care was taken in the design and implementation of the systems that hold them. In the briefest summary, 21 CFR 11 set out the criteria under which the agency considers electronic records, electronic signatures, and handwritten signatures executed to electronic records to be trustworthy, reliable, and generally equivalent to paper records and handwritten signatures executed on paper. Particular aspects covered included system validation, audit trails, record retention and copying, system access controls, and requirements for electronic signatures.
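To illustrate the audit trail concept in particular, here is a minimal sketch of an append-only log that records who changed what, when, and from what prior value. The structure and field names are hypothetical, not taken from the rule's text:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # each entry is immutable once written
class AuditEvent:
    """One entry in an append-only audit trail."""
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    changed_by: str
    changed_at: datetime
    reason: str  # many systems also capture a reason for the change

audit_trail: list[AuditEvent] = []  # append-only by convention; never edited in place

def record_change(record_id: str, field_name: str, old: str, new: str,
                  user: str, reason: str) -> None:
    """Log a change alongside the change itself, with a system-supplied timestamp."""
    audit_trail.append(AuditEvent(record_id, field_name, old, new, user,
                                  datetime.now(timezone.utc), reason))

record_change("SUBJ-001", "diastolic_bp", "82", "88", "jsmith",
              "corrected against source document")
```

Because nothing is ever overwritten, the trail preserves the full history of a record, the electronic counterpart of the chain of custody paperwork described earlier.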
Although the final rule was only three pages long in the Federal Register, a 29-page preamble clarified, and in some cases even amplified, the literal reading of the rule. Shortly after the final rule was published, FDA issued a Guidance for Computerized Systems Used in Clinical Trials (www.fda.gov/ora/compliance_ref/bimo/ffinalcct.pdf). The guidance document wove together a good deal of FDA thinking on the interpretation of 21 CFR 11 with the application of software validation principles to clinical systems. Both the rule and the guidance are still alive, well, and in force.
Since the issuance of the rule and the guidance, there has been much discussion in the industry about how best to comply with them, much of it focused on the cost and effort required for compliance, especially for legacy systems and record archiving. The effort required for a company to comply has often been compared to the effort required for Y2K remediation. Initially, FDA signaled that it intended to enforce the rule vigorously, but only when violations of the predicate rules (GCP, GMP, or GLP) were involved. The agency published a series of draft guidances intended to clarify and specify aspects of compliance with the rule. Some aspects of the guidance documents were quite explicit in terms of solutions, and were aimed at a rigorous interpretation of 21 CFR 11.
In addition, there is an inherent difficulty in regulating technology while encouraging innovation. Regulations take time to develop and adopt, and they are often left unchanged for years. Technology, on the other hand, progresses rapidly and often veers off from the expected direction. A good example of this is the emphasis on biometrics in the final rule. Although biometric authentication was not mandated, most readers interpreted 21 CFR 11 to show a strong preference for such authentication. That seemed like the logical direction in the spring of 1997, when the rule was published. Biometrics, however, have been slow to gain broad use, and slower still in the pharmaceutical industry. It may be some time before biometrics are the standard for authentication in clinical trial systems (or maybe not; the one thing clearly predictable about technology is that it is unpredictable).
It was in this context that the newest draft guidance was released on 20 February 2003. The latest guidance essentially repealed all previous 21 CFR 11 draft guidances, but not the rule itself or the original Guidance for Computerized Systems Used in Clinical Trials. Aside from some minor confusion (can a draft or final guidance countermand a regulation?), the new draft guidance made the agency's intention quite clear: the FDA will reexamine 21 CFR 11.
During the reexamination period, the agency intends to enforce the rule with a narrower scope and with discretion. It also expressed its intent not to take regulatory action to enforce compliance with the validation, audit trail, record retention, and record copying requirements of 21 CFR 11. Additionally, FDA announced its intention to use enforcement discretion with legacy systems that were in use prior to 21 August 1997, the date 21 CFR 11 went into effect. Although these items are subject to FDA enforcement discretion, the FDA still intends to enforce signature rules, authentication, education and training requirements, and system controls and procedures. Finally, and perhaps most importantly, there is no repeal of the predicate rules: clinical trial data for submission must always meet the integrity and quality standards required by GCP.
If validation, audit trails, record retention requirements, and other aspects of 21 CFR 11 are not covered explicitly by the predicate rules, the agency recommends using a justified and documented risk assessment and a determination of the potential of the system to affect product quality and safety and record integrity. This approach means that industry, and for that matter FDA field inspectors, can use a risk assessment process to determine whether validation, explicit record retention methodology, and audit trails apply to a particular system. With luck, everyone will agree.
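One simple way to structure such an assessment is a severity-times-likelihood score, a common risk assessment heuristic. The sketch below is purely illustrative; the systems, ratings, and threshold are hypothetical, and nothing in the guidance prescribes this particular scheme:

```python
# Illustrative risk scoring: severity and likelihood each rated 1 (low) to 3 (high).
# The systems, ratings, and threshold below are hypothetical.
systems = {
    # name: (severity of impact on quality/safety/record integrity, likelihood of failure)
    "adverse event database": (3, 2),
    "site payment tracker": (1, 2),
    "randomization system": (3, 1),
}

THRESHOLD = 3  # hypothetical cutoff above which full validation and audit trails apply

for name, (severity, likelihood) in systems.items():
    score = severity * likelihood
    action = "validate, audit trail" if score >= THRESHOLD else "document rationale"
    print(f"{name}: score {score} -> {action}")
```

Whatever scheme a sponsor adopts, the guidance's key words are "justified and documented": the assessment itself must be written down and defensible to an inspector.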
This new draft guidance presents definite risks for sponsors. As with many things in life, the risks lie at either side of the bell curve of response. On one side, the ultraconservative side, the risk is that pharmaceutical companies will freeze all technology purchases, deployment, and other integration until the dust settles. The dust here may take years to settle, and in the end may be no more specific than it is today. To delay the deployment of technology in this fast-paced time could be to give away a competitive advantage.
On the other hand, the risk on the liberal side would be to declare 21 CFR 11 repealed and to stop validation and other best practices that make sense well beyond the Electronic Records rule. In fact, these activities are not specific to 21 CFR 11, make eminent sense, and were expected in the deployment of technology in the pharmaceutical industry well before 21 CFR 11 was a regulation. This is an important point to keep in mind.
Discretion is the better part of valor in this case. For the most part, 21 CFR 11 makes good, solid sense as a means to guard against data damage and loss. The best approach would be to continue with business as usual while integrating a risk-based assessment of current projects and systems, using the draft guidance. Sponsors should plan to reduce, but not eliminate, 21 CFR 11 activities in areas that are clearly carved out in the guidance. It might not be necessary, for example, to plan a 21 CFR 11 compliance upgrade for a legacy system that has been in place without modification since early 1997. If the system has been modified since then, it is not yet clear whether it would still be considered a legacy system. In any case, you might decide that the upgrade makes sense despite the exclusion under this guidance. It is also possible to use a risk assessment to prioritize these projects. In the end, in any questionable instance, it is always best to err on the side of compliance.
References
1. Marcia Stepanek, "Data Storage: From Digits to Dust," Business Week, April 1998.
Sidebar: The Case of the Patchwork Mouse
by Jeremiah Kerber
Several widely publicized cases of research misconduct have been major factors in the development of courses in research ethics. One of the first such cases was that of the Patchwork Mouse. In 1974, William Summerlin, a scientist working at the Sloan-Kettering Institute for Cancer Research in New York, painted white mice with patches of black, misrepresenting the animals as having accepted skin grafts from black mice.
Summerlin had initially reported that tissue kept in organ culture for a period of time could then be transplanted without rejection into another animal. When these findings weren't replicated by others at Sloan-Kettering, Peter Medawar, another scientist, suggested that Summerlin had performed this experiment on a heterozygous mouse, one with chromosomes from both white and black parents.
In other words, if the black mouse had one white parent and one black parent, a successful skin graft from a white mouse would come as no surprise. On the other hand, if Summerlin could produce a white mouse with a successful skin graft from a black mouse, he would have a significant result; because white mice are homozygous, they can only come from two white parents.
When challenged, Summerlin was forced to either produce a white mouse with a black patch or admit his failure. For this reason, he used a felt pen to mark black patches on the backs of two white mice. Summerlin met with his superior, Robert Good, who was considering publication of a retraction. Although Good did not view the mice, the patches were later discovered in the animal facility. Summerlin was suspended soon after.
Eventually, Summerlin argued that he perpetrated the fraud because of a lapse in judgment during a period in which he suffered acute mental exhaustion, brought on by extreme professional and personal stress. An investigating committee stated that, for whatever reason, Dr. Summerlin's behavior represented irresponsible conduct that was incompatible with discharge of his responsibilities in the scientific community. The committee recommended that Dr. Summerlin be offered a medical leave of absence, to alleviate his situation, which may have been exacerbated by pressure of the many obligations which he voluntarily undertook.
References
1. Joseph Hixson, The Patchwork Mouse (Anchor Press, New York, 1976).