Results from a Multicenter Clinical Trial Using a Quality by Design Methodology, Risk-Based Monitoring and Real-Time Direct Data Entry
Abstract
In August 2012, a clinical trial was initiated under US and Canadian Investigational New Drug Applications (INDs). The protocol was operationally designed for the clinical sites to perform direct data entry (DDE) of subject data at the time of the office visit, and for the clinical research associates (CRAs) to execute risk-based (adaptive) monitoring (RBM). For DDE, the trial used Target e*CRF for electronic data capture (EDC) of case report forms (CRFs), Target e*Clinical Trial Record (e*CTR) as the subject’s eSource record, and Target Document as the electronic Trial Master File (eTMF). After meeting with the U.S. Food and Drug Administration (FDA) and Health Canada (HC) to review the protocol, the use of RBM and the proposed eSource methodology, a multicenter clinical trial was initiated in the US and Canada. The study was performed at 18 clinical sites, which screened 656 subjects in order to treat 180. All of the clinical sites were required to use DDE to enter the trial data at the time of the office visit. The CRAs were trained on how to conduct risk-based on-site and central monitoring, which was clearly defined in the clinical monitoring plan. Results from the study indicated that DDE at the time of the office visit and RBM allowed for acceptable levels of protocol compliance and data quality. As a result of the daily and weekly central monitoring activities: there was close to 100% compliance with all protocol requirements; the need for protocol amendments was identified and implemented rapidly when just a few subjects were enrolled; modifications to EDC edit and logic checks were completed early in the study, which minimized issuing the same query multiple times; and the CRAs and site personnel were retrained based on findings made during the weekly quality by design (QbD) review meetings. This paper supports the rationale for RBM integrated with eSource methodologies, and for the pharmaceutical industry to move in the direction of the paperless clinical trial. Once RBM and DDE are adopted, there will be a major reduction in the monitoring resources and costs needed to manage a clinical trial, with no loss of quality.
In order to support the transformation of how the pharmaceutical industry manages the performance of clinical trials, in 2013 the Food and Drug Administration (FDA) issued its Final Guidance for Industry: Oversight of Clinical Investigations - A Risk-Based Approach to Monitoring 1 and a Guidance for Industry: Electronic Source Data in Clinical Investigations. 2 These guidances are consistent with the European Medicines Agency (EMA) Reflection Paper on Risk Based Quality Management in Clinical Trials 3 and Expectations for Electronic Source Data and Data Transcribed to Electronic Data Collection Tools in Clinical Trials. 4
In 2011, a critical publication on the varied practices of monitoring clinical trials 5 was issued by the Clinical Trials Transformation Initiative (CTTI), a public-private partnership formally established in 2008 by the FDA and Duke University to identify practices that, through broad adoption, will increase the quality and efficiency of clinical trials. This publication has served as an impetus for the pharmaceutical and device industries, together with regulators, to address the monitoring of clinical trials, which, as currently practiced, is inefficient, costly and thus unsustainable. 6,7 In 2010, a comprehensive paper on the pros and cons of risk-based monitoring (RBM) was published, 8 with a follow-up paper urging the industry that it is time to change to RBM. 9
Quality by Design (QbD) is a concept first outlined by Joseph M. Juran and is based on the premise that quality should be part of the project planning process. 10 According to Juran, most quality crises and problems arise from the way quality is originally planned. While QbD methodologies have been used to advance product and process quality in every industry, they have most recently been adopted by the US FDA for drug manufacturing. 11 In clinical research, the protocol identifies the quality requirements, and detailed plans and activities complement what is in the protocol. Key factors in QbD methodologies include a well-designed protocol, proper execution of the protocol, steps to assure protocol compliance, corrective and preventive action methodologies, and clear and concise communication strategies.
FDA has recently published a paper on Quality by Design (QbD) methodologies describing how clinical research is changing and how FDA and other regulatory authorities are fostering these changes. 12 The eClinical Forum 13 and TransCelerate 14 have recently published thoughtful papers on how the pharmaceutical and device industries could address RBM. More recently, an approach to quality assurance in the 21st century was published in the Monitor, describing a QbD methodology for clinical research. 15
Results from a Phase II study using RBM and DDE, in which the clinical site entered each subject’s data into an electronic data capture (EDC) system at the time of the office visit, demonstrated a major reduction in on-site monitoring compared with comparable studies using paper source records, that EDC edit checks could be modified early in the course of the clinical trial, and that protocol compliance issues could be identified in real time and rapidly corrected. 16 The use of DDE and near real-time monitoring also led to rapid detection of safety issues. The clinical site reported major cost savings, estimating that, in terms of data entry alone, it saved 70 hours of labor by not having to transcribe data from paper source records into the EDC system. 17
The current paper reports the results of a clinical trial initiated in both the US and Canada, involving 18 clinical sites and 180 treated subjects, in which all of the clinical sites performed DDE and all of the CRAs performed RBM, with the bulk of the monitoring activities occurring centrally from the home office.
In addition to a well-designed protocol, the following QbD elements were operationally incorporated into the clinical trial:
Risk-Based Clinical Data Monitoring Plan (CDMoP)
A written strategy was developed to address the review of site-specific source data/documents, the schedule for on-site monitoring, the frequency of central monitoring and the issuance of central monitoring reports. The CDMoP specified roles and responsibilities as well as the specific monitoring requirements to ensure that the clinical sites complied with the study protocol and regulatory requirements.
The CDMoP also indicated that monitors were to record all monitoring reports in the EDC system, and that all sponsor and study documents were to be maintained in the Electronic Trial Master File (eTMF).
Within the CDMoP, a risk mitigation strategy identified a total of 23 risks to subject safety and/or trial outcome. Each risk was assigned a low-to-high probability score (1-3) and severity score (1-3). Each risk was then assigned an overall score, the product of the two, as well as a risk mitigation strategy. For example, Subject Dropouts was one risk to the trial outcome, since any dropout was potentially to be considered a treatment failure. Subject Dropouts was therefore assigned a severity score of 3 and a probability score of 2, for a total score of 6. The risk mitigation strategy was “Training and Evaluating and Resolving Reasons for Dropouts, Phone Alerts Prompted by the eCRF and Review of Online Management Reports.”
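To make the scoring scheme concrete, the following minimal sketch represents one risk register entry and computes its score as the product of probability and severity, using the Subject Dropouts example above. The class and field names are illustrative; the trial's actual risk register was a planning document within the CDMoP, not software.

```python
# Illustrative sketch of the CDMoP risk-scoring scheme described above.
# Class and field names are hypothetical, not part of the actual trial tooling.
from dataclasses import dataclass


@dataclass
class Risk:
    name: str
    probability: int  # 1 (low) to 3 (high)
    severity: int     # 1 (low) to 3 (high)
    mitigation: str

    @property
    def score(self) -> int:
        # Overall risk score is the product of probability and severity.
        return self.probability * self.severity


# The Subject Dropouts example from the text: probability 2 x severity 3 = 6.
subject_dropouts = Risk(
    name="Subject Dropouts",
    probability=2,
    severity=3,
    mitigation="Training and Evaluating and Resolving Reasons for Dropouts, "
               "Phone Alerts Prompted by the eCRF and Review of Online Management Reports",
)

print(f"{subject_dropouts.name}: score {subject_dropouts.score}")  # Subject Dropouts: score 6
```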
DDE
The CDMoP documented that the study would use DDE at the time of the clinic visit. The eClinical Trial Record (eCTR) allowed the clinical study sites to have a contemporaneous electronic copy of the subject’s record. To comply with regulations, access to the eCTR was controlled by the clinical investigator or designee and not the pharmaceutical company sponsoring the study, and these original data were stored in “a trusted, third-party repository” prior to the data being transmitted to the EDC database.
QbD Meetings
Initially, weekly meetings were held with key team members to review all monitoring activities. Integrated online data management reports addressed safety, quality, compliance and study specific issues. As the study progressed, the frequency of these meetings was changed to every two weeks.
On-site and Central Monitoring Activities
As part of the approach to RBM, the CDMoP identified the need to perform both on-site and central monitoring. To avoid the need to duplicate data already within the EDC system, key metrics from the EDC system were displayed within the monitoring reports. The monitoring reports were generated online within the EDC portal and signed electronically by the CRA and the CRA’s supervisor. Lists of observations requiring followup were also maintained within the EDC portal.
Safety Monitoring
A detailed Safety Monitoring Plan was developed. There was nothing unique in this approach to safety monitoring except that an Adobe Acrobat version of an FDA-approved online MedWatch Form 3500A and CIOMS Form 1 could be generated directly from the EDC system for both original and followup reports. Both the investigator and the Medical Monitor could enter online narratives and the Medical Monitor could control the finalization of the original and followup reports needed for regulatory submissions. These reports became an integral part of the EDC system and could be retrieved on demand, based on permissions, anywhere in the world. In addition to the agreed-upon procedures involving serious adverse event (SAE) reporting to both the sponsor and regulatory authorities, email alerts occurred at the time of data entry for any SAE and if any SAE data were modified. In addition, the EDC system summarized all adverse events and it was possible to assess adverse events across sites.
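The alert rule described above can be summarized in a short sketch. The save hook, record fields and notifier below are assumptions for illustration only and are not the EDC vendor's actual API; they simply express the rule that an email alert fires when an SAE is first entered or when SAE data are modified.

```python
# Hypothetical sketch of the SAE alert rule described above: notify the project
# team whenever an SAE is first entered or an existing SAE record is modified.
# The hook name, record fields and notifier are assumptions, not the vendor's API.
from typing import Callable, Optional


def on_adverse_event_saved(record: dict, previous: Optional[dict],
                           notify: Callable[[str], None]) -> None:
    """Called (hypothetically) by the EDC system each time an adverse event form is saved."""
    if record.get("serious") != "Yes":
        return  # only serious adverse events trigger alerts
    if previous is None:
        notify(f"New SAE entered for subject {record['subject_id']}: {record['term']}")
    elif record != previous:
        notify(f"SAE data modified for subject {record['subject_id']}: {record['term']}")


# Example with a stand-in notifier; a real system would send email to the project team.
on_adverse_event_saved(
    {"subject_id": "001-017", "serious": "Yes", "term": "Syncope"},
    previous=None,
    notify=print,
)
```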
QbD Meetings
As part of the QbD methodology, initially, weekly meetings of approximately one hour occurred with the clinical team (n=3), the sponsor (n=2) and an outside expert who performed quality oversight (n=1). Over the course of eight months, this represented a total of 20 meetings and 80 hours (two weeks) of human resources. This effort was roughly equivalent to three on-site monitoring visits.
Time to Data Entry from the Visit Date
One of the key advantages never consistently achieved with EDC was rapid access to the clinical trial data from the time of the office visit. With DDE, the site was “forced” to enter the data at the time of the office visit. However, as this involved a change in behavior at the clinical site, there was no guarantee that the sites would comply. Therefore, the time to data entry from the day of the clinic visit was assessed. Not all data could be entered directly at the time of the office visit, since sites maintained certain source records outside the EDC system; as a result, some of the data associated with these source records were entered after the clinic visit. For example, unreported medical histories and medications were identified during “chart review” at the time of the monitoring visit.
In spite of DDE being a “disruptive innovation,” 92% of data were entered on the day of the office visit, 95% within five days and 98% within eight days (see Figure 1). Some of the outliers were due to findings made during monitoring visits and to delays in data entry when sites waited for additional information to complete a form.
Figure 1: Time to Data Entry from the Day of the Office Visit
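As a simple illustration of how the Figure 1 metric can be derived, the sketch below computes, for each form, the lag in days between the office visit and data entry and then the cumulative percentage entered within a given number of days. The dates are fabricated for demonstration; the percentages quoted above come from the actual study data.

```python
# Illustrative derivation of the "time to data entry" metric (Figure 1):
# lag in days from the office visit to data entry, then the cumulative
# percentage of forms entered within N days. Dates are fabricated examples.
from datetime import date

entries = [  # (visit_date, entry_date) per form
    (date(2012, 9, 4), date(2012, 9, 4)),
    (date(2012, 9, 4), date(2012, 9, 4)),
    (date(2012, 9, 11), date(2012, 9, 13)),
    (date(2012, 9, 18), date(2012, 9, 25)),
]

lags = [(entered - visited).days for visited, entered in entries]

for threshold in (0, 5, 8):  # same day, within five days, within eight days
    within = sum(1 for lag in lags if lag <= threshold)
    print(f"Entered within {threshold} day(s): {100 * within / len(lags):.0f}%")
```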
Time to Data Review
The time to data review by the monitors is a key factor to optimize RBM, since without having access to real-time data, the same errors are repeated and any corrective actions are delayed. Key forms included in this analysis were:
A total of 13,124 forms were analyzed from 180 subjects (see Figure 2). Results showed that 50% of the entered forms were reviewed within 13 hours (0.54 days) of data entry, 75% within 27 hours (1.1 days), 95% within 124 hours (5.2 days) and 100% within 335 hours (14 days). It should be noted, however, that occasionally a form was “missed” by the CRA and a small number of forms were “saved” awaiting additional information or conclusions based on consultations with the principal investigator (PI).
Figure 2: Time to Data Review (hours) From Day of Data Entry
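The Figure 2 statistics are simple percentiles of the review lag. The sketch below shows one way such percentiles could be computed; the lag values listed are fabricated for demonstration, whereas the percentiles reported in the text (13, 27, 124 and 335 hours) come from the 13,124 study forms.

```python
# Illustrative computation of the "time to data review" percentiles (Figure 2):
# hours from data entry to CRA review, summarized at selected percentiles.
# The lag values below are fabricated for demonstration only.
import math

review_lags_hours = sorted([2, 5, 8, 13, 20, 27, 40, 70, 124, 335])


def nearest_rank_percentile(sorted_values: list, pct: float):
    """Smallest observed value such that at least pct% of observations are <= it."""
    rank = max(1, math.ceil(pct / 100 * len(sorted_values)))
    return sorted_values[rank - 1]


for pct in (50, 75, 95, 100):
    lag = nearest_rank_percentile(review_lags_hours, pct)
    print(f"{pct}% of forms reviewed within {lag} hours of data entry")
```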
On-site and Central Monitoring Activities
Between August 1, 2012 and May 31, 2013, 31 on-site monitoring visits were performed at the 18 sites. No other on-site monitoring was deemed necessary based on the observations at the initial on-site visit, daily review of online eCRFs, in-house review of the eTMF, and site audits by Quality Assurance. The bulk of the second monitoring visit was combined with the closeout visit, since most of the subjects had completed treatment at the time of the visit. The final closeout activities were performed over the phone.
Since measurements on the last day of the three-month treatment phase of the study included evaluation of the primary endpoint, each site was retrained over the phone, before its first subject arrived for that final visit, on the required activities taking place on the final day of the study. In addition, each site was instructed to inform the CRA when the first subject was to arrive for the Day 90 visit so that the CRA could immediately review all of the data entered on that day. An email alert was also sent to the project team at the time the Day 90 visit date was entered within the EDC system.
A total of 211 central monitoring reports were issued, and once it was clear that both the sites and monitors were adequately trained, the frequency of issuing these reports was changed from every two weeks to every four weeks.
Source Data Verification (SDV)
For this study, there was a total of 27,957 EDC “pages” entered for 29 unique CRFs. As part of the approach to RBM, the CDMoP identified specific data elements collected at the clinical sites either within the electronic medical record (EMR) or on paper charts for SDV.
A total of 5,581 of these paper/electronic source records were reviewed at the site and compared with the clinical trial database. These source records represented about 20% of all entered pages. Results showed that only 13 of the 29 forms had any changes, with a total of 48 changes made to the database as a result of SDV (Table 1). These changes represented a 0.86% “error rate.” The vast majority of the changes (66.6%) occurred in just three forms: medications (13; 27%), medical history (10; 21%) and clinical laboratory results (9; 19%).
Table 1: Summary of Changes Made to the Database Post SDV
In order to evaluate this “0.86% error rate,” Table 2 identifies examples of the types of changes made to the database as a result of SDV. As can be seen, only one modification, Titration Result (278.3 changed to 123.2), could have had any impact on the study. However, as this parameter was defined as Critical to Quality (CTQ), a specific risk to protocol compliance and subject safety, a copy of the record was available to the CRA at the same time as the site received it. In addition, all of the changes identified via the SDV process would have had no impact on subject safety, data integrity or protocol compliance.
Table 2: Itemized Changes to the Database Post SDV
Queries
A total of 1,099 queries were generated from the 27,966 CRFs entered by the clinical sites. This represents an overall form query rate of 3.9%. However, only 403 (36.7%) of the queries resulted in changes to the database. Thus, only 1.4% (403/27,966) of forms had database changes as a result of queries generated by the CRA.
In order to measure the efficiency of CRA review activities, the time from data entry to query generation was assessed. Strikingly, 39% of queries were generated on the same day as the office visit, 58% within one day and 70.6% within five calendar days. This meant that corrective actions could be taken early and rapidly during the clinical trial.
Figure 3: Cumulative Time from Data Entry to Initial Manual Query Generation
Time from Query Generation to Resolution
Queries were generated either in response to an edit check firing at the time of data entry (auto queries), where the reason provided by the clinical site required additional information, or as a result of a de novo request for additional information based on clinical review of the CRFs; for example, an ongoing diagnosis of Type 2 Diabetes was reported but no treatment was documented. While online monitoring was done in ‘real time’ with current snapshots of the data, queries were also generated from online batch edits produced within the EDC system, based on cumulative comparisons of information entered across forms and over time that suggested data inconsistencies.
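A cross-form batch check of the kind described above can be illustrated with a short sketch. The form structure and field names are hypothetical; the example simply mirrors the Type 2 Diabetes case from the text, where an ongoing diagnosis has no corresponding medication documented.

```python
# Hypothetical sketch of a batch edit check that compares data across forms:
# flag any ongoing diagnosis in the medical history with no documented treatment
# in the concomitant medications form. Field and form names are illustrative.

def untreated_diagnosis_queries(medical_history: list, medications: list) -> list:
    """Return query texts for ongoing diagnoses that have no documented treatment."""
    treated_indications = {med["indication"].lower() for med in medications}
    queries = []
    for dx in medical_history:
        if dx["ongoing"] and dx["diagnosis"].lower() not in treated_indications:
            queries.append(
                f"Subject {dx['subject_id']}: ongoing diagnosis of {dx['diagnosis']} "
                "reported but no treatment documented. Please clarify."
            )
    return queries


# Example mirroring the text: Type 2 Diabetes reported as ongoing, no medication entered.
for query in untreated_diagnosis_queries(
    medical_history=[{"subject_id": "003-008", "diagnosis": "Type 2 Diabetes", "ongoing": True}],
    medications=[],
):
    print(query)
```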
As shown above, with central monitoring and DDE it is possible to enter data rapidly and to generate queries promptly from the time of data entry. The next challenge was to ensure that queries were resolved just as rapidly. In the following figure (Figure 4), the time to query resolution was assessed for all generated queries, including those issued manually, those based on edit checks fired at the time of data entry (auto queries) and those resulting from batch queries run at night within the EDC system.
As can be seen, with central monitoring, 22% of queries were resolved on the same day they were generated, 78% within five calendar days, 91% within 10 days and 99% within 30 days.
Figure 4: Cumulative Days from Generation of Manual Queries to Resolution
One of the keys to a successful clinical trial outcome is timely data entry and data review, ideally occurring at the time of the office visit.
Risk has to do with the probability and impact of an event on the outcome of a clinical trial, and risk mitigation strategies are put in place to manage that risk. Clearly, we should not put the same effort into monitoring variables that “do not matter” as we do into the ones that “do matter.” RBM is not about more or fewer monitoring visits or more or less SDV, but rather about targeted, efficient and intelligent monitoring. CRAs need to be retrained in the way they monitor, focusing on the elimination of errors that matter.
DDE can dramatically reduce or even eliminate paper records, and as a result, SDV should also be dramatically reduced. Since SDV typically assesses how well people transcribe from one medium to another, and since such transcription “error rates” are typically below 1%, SDV as currently performed should have no impact on study results. However, as part of the risk assessments performed at the beginning of and during the study, the rationale and scope of SDV should be defined. SDV requirements will most likely be replaced, in part, with source data review (SDR), or what would be better described as Chart Review. Chart Review truly allows for a snapshot of the study subject, where critical information “buried” in the chart can be discovered.
After all, monitoring is all about training and oversight. Think about a typical Phase I PK study. Should we put in the same effort to verify the date of an appendectomy 10 years in the past as we would to verify the time of critical PK draws, storage conditions of the samples and shipping procedures for analysis and methods validation?
The main lessons learned from the study were:
The following are recommendations to consider when doing RBM and DDE:
The present study clearly demonstrates the advantages of RBM and DDE. Beyond potential cost savings, benefits include: