Applied Clinical Trials
Pilot study compares a risk-based monitoring and remote trial management method with traditional on-site source data verification for trial oversight.
Five years ago, the FDA and the European Medicines Agency (EMA) released final guidance to change clinical trial oversight methodology from on-site visits using source data verification (SDV), the gold standard for more than 30 years, to a risk-based monitoring (RBM) approach.1,2
Implementing this guidance created two daunting challenges to reconcile. Among them, the lack of scientific data on trial oversight effectiveness is a critical unmet clinical research need, affecting more than 100,000 research participants per year and their healthcare providers.
This article represents a prospective analysis comparing the effectiveness of using traditional SDV versus one method of RBM (i.e., the MANA Method). We identified the specific RBM method used herein due to inconsistent RBM definitions and RBM implementation methods, and the varying levels of effectiveness for all the different RBM approaches.
Research methods
PaxVax conducted a Phase IV vaccine trial in approximately 500 subjects at nine U.S. sites.
The study was conducted using electronic data capture (EDC). The trial was approved by an institutional review board (IRB) and each subject signed an IRB-approved informed consent prior to participating. Subjects received one dose of the study vaccine. Participants recorded any changes in their health for nine days in a paper diary aid, and sites entered the results into the EDC. Each research site maintained its own informed consents and site regulatory binders.
Site monitors visited the research sites monthly and spent approximately 72 days on-site conducting SDV of the trial data. PaxVax’s senior management team (i.e., medical monitor; senior director, biostatistics; director of data management and statistical programming) reviewed the data monthly to identify trends or data errors that would be followed up by the site monitor.
MANA RBM modified its risk-based monitoring and remote trial management system (i.e., the MANA Method) to initiate an RBM approach for this study, beginning after 5.5 months of trial conduct (approximately 500 subjects already enrolled). Fully implementing the MANA Method would also have required implementing and evaluating additional trial oversight and remote document review, including informed consents and site regulatory documents.
In this pilot study, MANA RBM independently and remotely reviewed the existing trial data available electronically to determine whether errors and trends could be identified faster and more comprehensively than with the traditional SDV method. Analysis of informed consents, regulatory documents, and source documents was not included in this pilot study because those documents were not available electronically.
MANA RBM first conducted a proprietary risk assessment service based on the protocol. It then designed proprietary study-specific reports and data visualizations to evaluate the high-risk data and processes identified during the risk assessment. The basic categories included: efficacy endpoints, safety assessments, investigational product (IP) management, and human subjects’ protection.
Trial data was imported from the EDC platform into JReview, hosted by Integrated Clinical Systems, Inc. MANA RBM designed its proprietary Subject Profile Analyzing Risk (SPAR) tool to provide an integrated visualization of the high-risk data for each subject over time and trained the remote monitors in its use. SPAR configuration is unique for each trial based on the critical issues identified during the risk assessment. Additional proprietary, custom reports were also developed to support protocol-specific analysis of high-risk data and processes and trends.
All review was performed independently of the EDC system and based on MANA RBM data analytics. Results of the review were captured in a separate, proprietary Site Tracker Analyzing Risk (STAR) database, a tool MANA RBM developed to conduct study quality oversight. Subject review was documented in JReview.
MANA RBM conducted the review using its remote quality management approach, as shown in Figure 1. The MANA Method splits the review process into tiers. Remote site monitors focus on subject review and oversight of high-risk data and processes at the subject level. Central review focuses on trend analysis, evaluating data across subjects at a single site and across sites.
The pilot study compared SDV versus the MANA Method in the following areas:
1) Identifying major deviations
2) Queries raised as a result of SDV
3) Identifying trends in data affecting trial conduct and/or results
4) Timing of the subject review
5) Resource use
Results
Risk assessment and development of protocol-specific reports: MANA RBM conducted the risk assessment and implemented the SPAR within two weeks of uploading the data into JReview. Additional custom reports were developed over eight weeks. These reports included customized, cross-database reports and trend analysis of high-risk data and processes.
Subject review: Once the SPAR was available, reviewers began reviewing the data immediately. MANA RBM split the subject review: an experienced monitor reviewed half of the subjects during the first month, and a data reviewer who was new to subject review reviewed the other half during that month. The following month, the reviewers switched subject assignments to allow evaluation of oversight by remote monitors with different training and experience. The lead monitor performed quality control (QC) oversight of each remote monitor to provide immediate feedback on missed items or documentation corrections.
Identification of protocol deviations: MANA RBM’s remote site monitors identified critical deviations using the SPAR and accompanying high-risk reports. The MANA Method identified critical deviations not previously identified by the sponsor’s on-site monitors.
Speed of identification: Using remote methods, the monitoring team could have identified deviations faster and earlier than using SDV and on-site visits. Within six weeks, two rounds of review of all critical subject data were completed and all deviations for critical data were identified.
Categorization of deviations: The MANA RBM remote monitoring team and the on-site monitors classified some deviations differently as major or minor, which made comparing total deviation counts challenging. Nonetheless, the totals were similar, and there were no major deviations discovered by the sponsor’s on-site monitors that the MANA Method did not also identify remotely.
Source document review: This study was conducted with paper memory aids and transcription by the research sites. Since this was a pilot study, sites were not asked to convert the paper memory aids to certified copies, which would have allowed remote review.
To evaluate whether there were findings that the MANA Method would not have been able to identify without on-site visits or using eDiaries, MANA RBM reviewed the queries related to subject diaries generated from the study. The MANA RBM team identified 300 queries associated with SDV. Table 1 shows the distribution of the queries and illustrates how remote review would have identified all critical findings with the use of eSource or certified copies of the diary aids. Important data is defined as data that would affect subject safety or analysis of efficacy.
MANA RBM remotely reviewed the source review query rates for important data and found that two sites had much higher query rates (i.e., two to 10 times the rates of the other sites), as shown in Table 2 (see below). Had the sponsor known this, it could have determined whether continued on-site SDV was needed and, if so, focused SDV, additional training, or other strategic measures on only those two sites instead of all sites.
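The underlying comparison is simple to reproduce. The sketch below is a hypothetical illustration in Python (invented site names, query counts, and subject counts; the pilot’s actual figures are in Table 2) of how per-site query rates on important data can be compared to flag sites that may warrant focused SDV or retraining. It is not MANA RBM’s proprietary tooling.

```python
# Hypothetical example (site names and counts are illustrative, not the pilot's
# data): compute per-site query rates on important data and flag sites whose
# rate is at least twice the median rate of the other sites.
from statistics import median

queries  = {"Site 1": 6,  "Site 2": 52, "Site 3": 9,  "Site 4": 40}   # queries on important data
subjects = {"Site 1": 60, "Site 2": 55, "Site 3": 62, "Site 4": 57}   # subjects reviewed per site

rates = {site: queries[site] / subjects[site] for site in queries}

for site, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
    others = median(r for s, r in rates.items() if s != site)
    flag = "  <-- candidate for focused SDV or retraining" if rate >= 2 * others else ""
    print(f"{site}: {rate:.2f} queries/subject (other sites' median {others:.2f}){flag}")
```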
MANA Method central review and trending
Central review and trending were conducted in addition to subject review. This review occurred during the second and third months of the pilot study using proprietary reports designed specifically for the high-risk areas identified in the study’s protocol. From this review, MANA RBM central monitors recognized several trends that could have significantly affected this study, as follows:
1) Deviation evaluation identified at least one trend that could have enabled more evaluable subjects. One site had a higher rate of out-of-window visits. While not usually considered a major deviation, the timing of this critical visit represented the collection point for primary efficacy data. The MANA Method would have identified and corrected this error sooner, leading to more evaluable subjects. On-site monitors did not identify this issue. The PaxVax senior clinical research management team identified this site deviation at its monthly review meeting, while the MANA RBM reviewers discovered the issue immediately upon performing central review.
2) A vital signs evaluation identified one site that had issues with collecting vital signs; specifically, collecting manual temperatures. Analytics identified this issue through differences in the mean values and a scattergram of actual values. This indicated a process issue that could have a significant impact on future studies, where immediate measures of temperature elevation after an IV injection could be under-reported. Only the MANA Method remote central monitoring approach identified this issue.
3) Incomplete dosing represented another area where variability existed in performance across sites. Since sites “batch” (i.e., enroll large groups of subjects over a few days) their dosing for vaccine trials, identifying this issue rapidly may have increased the number of subjects that took the complete dose. The senior clinical research management team noted this issue at its monthly meeting. The MANA Method central monitors noted it immediately upon review. On-site SDV did not identify this issue.
4) Variability in reporting of adverse events of special interest occurred across sites. One site routinely ranked lowest or second lowest among the sites across the eight reported adverse events. While it was not clear whether an issue existed, the trend should have been evaluated to understand the processes by which this critical assessment was being conducted. Only the MANA Method central monitoring approach identified this finding. In Table 3, all sites with at least 15 subjects enrolled were ranked across sites for severity of the adverse events of special interest using z-scores (i.e., the number of standard deviations from the mean); a simplified version of this calculation is sketched after this list. One site routinely ranked subjects either lowest or second lowest in severity, while another site routinely ranked subjects at a higher severity.
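The z-score comparison behind Table 3 can be approximated with a few lines of code. The sketch below is a simplified, hypothetical illustration (invented site names, adverse event names, and severity grades, with severity coded numerically, e.g., 1 = mild through 3 = severe); it shows the general technique of standardizing each site’s mean reported severity against the across-site distribution, not MANA RBM’s proprietary reports.

```python
# Simplified illustration of across-site z-scoring of reported severity for
# adverse events of special interest. All site names, event names, severity
# grades, and coding are hypothetical.
from statistics import mean, stdev

# severity[site][event] -> per-subject severity grades (1 = mild ... 3 = severe)
severity = {
    "Site A": {"fever": [1, 1, 2, 1], "headache": [1, 2, 1, 1]},
    "Site B": {"fever": [2, 3, 2, 3], "headache": [2, 3, 2, 2]},
    "Site C": {"fever": [2, 2, 1, 2], "headache": [2, 1, 2, 2]},
}

events = sorted({event for per_site in severity.values() for event in per_site})
for event in events:
    site_means = {site: mean(grades[event]) for site, grades in severity.items()}
    mu, sd = mean(site_means.values()), stdev(site_means.values())
    for site, m in sorted(site_means.items(), key=lambda kv: kv[1]):
        z = (m - mu) / sd if sd else 0.0  # standard deviations from the across-site mean
        print(f"{event:8s} {site}: mean severity {m:.2f}, z = {z:+.2f}")
```

A site that consistently sits at the bottom (or top) of this ranking across most events, as in Table 3, is the kind of signal that warrants follow-up on how severity is being assessed at that site.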
Findings not requiring action
1) The early termination rate was higher at one site than at the others. The reasons for early termination were not different across sites. No action was recommended at that time.
Review timing
The MANA Method enabled remote, comprehensive subject review of the high-risk data and processes to begin within two weeks of starting the project. No minimum amount of data was required to begin review; review could start as soon as a subject’s visit data were entered.
Central trend analysis began approximately two weeks after remote subject review and identified additional data errors that could be corrected quickly. This rapid review could have eliminated errors in several aspects of study conduct.
Resource use
The sponsor assigned eight months of resources to the study.
MANA RBM used the following resources:
Design, build, and validate study-specific reports in JReview: 2 FTEs for two months.
The reviewer (“monitoring”) resources were much smaller than those used in a traditional trial.
Time savings occurred in three areas.
Discussion
Increased quality, lower cost, and faster review times (including earlier detection of problems) represent the holy grail of trial oversight. The dogma was that you could only achieve two of the three. Using the MANA Method for remote trial oversight in this pilot study confirmed this is no longer true.
1. Quality: The MANA Method identified issues not seen using SDV. Its review focused on “errors that matter” that could affect trial outcome, not just traditional SDV point-to-point checking or identifying only data that did not conform to expectations (e.g., out-of-range values). Central (cross-subject/cross-site) and remote subject review identified specific site actions that could be corrected rapidly, increasing the number of evaluable subjects and lowering the overall burden of trial management.
A second quality benefit of this RBM approach was the ability to perform and document QC remotely on each monitor/data reviewer’s performance. This provided enhanced oversight not possible when all or most activities occurred at the research site. In 2013, MANA RBM reported on using remote review to perform a 10% QC review of informed consents on 788 subjects across 12 sites. This review took two days and required no travel.3
PaxVax senior management spent a significant amount of time evaluating trends that the MANA Method identified faster and with fewer resources as part of routine monitoring and trial oversight. Many companies do not have the resources, or do not make the commitment PaxVax made, to oversee a trial at this level. These findings confirmed that the MANA Method provided a cost-effective alternative for allocating senior management resources efficiently.
Using the MANA Method, monitors and data managers understood the critical data and processes and how they should be evaluated, based on the data and document review guidelines. Instead of reviewing a subject’s data in the electronic case report form (eCRF), whether for transcription checking or general eCRF review, the MANA Method allowed more comprehensive oversight of each subject’s data in context (i.e., across multiple data sets) and over time. This approach identified process errors that were not obvious when review focused only on out-of-range values, transcription errors, or missing data.
2. Time: The MANA Method meets the RBM regulatory guidance for rapidly reviewing critical data. The main tool used for subject review, MANA RBM’s Subject Profile Analyzing Risk (SPAR) tool, was built and deployed within two weeks of data upload into JReview, allowing comprehensive subject data review immediately after data entry.
While not possible in this pilot, when the MANA Method is implemented from the beginning of the trial, actual time to subject review and time to identification of major issues could be calculated, delivering oversight in days rather than waiting for an on-site visit.
This illustrates how overall monitoring time can be greatly decreased. Instead of selecting a subset of subjects or a subset of data for SDV, now every subject’s critical data can be reviewed without impacting overall study costs. Rapid, comprehensive review can also occur when new data are added without significantly impacting costs. There is no “critical amount” of data needed to perform subject review. The data from a subject visit is sufficient to start review. These findings align with the data MANA RBM previously published on the speed of using the SPAR to conduct subject review.4
Once the complete set of MANA RBM protocol-specific reports was designed, developed, and validated, the actual review process was significantly shorter and could be performed remotely. This offers tremendous potential savings for studies, such as oncology trials, that currently require on-site visits to review subject data, even for a single subject.
Time savings were not restricted only to monitoring time. Using the MANA Method, site monitoring savings were at least 83%, data management time savings could have exceeded 40 hours per month, and senior management time savings could have exceeded 60 hours per month.
3. Cost: This approach should be, at a minimum, cost-neutral. Cost savings can be significant depending on how the entire study is designed and implemented.
Any cost comparison of methods should include total costs for trial oversight. With better oversight by the monitors, data are corrected faster, saving site time and increasing the number of evaluable subjects. In addition, this pilot demonstrated that internal senior management time can be saved when the MANA Method is used to ensure cleaner data and identify critical issues earlier.
Using an electronic investigator site file (eISF) and certified copies of informed consents and other source documents would have enabled complete remote review because all documents would have been available remotely. Clinical trial associates can perform many of the tasks required to manage the regulatory binders (i.e., ensuring complete and correct documents) and informed consent review, adding to cost savings. While the eISF and remote informed consent review were not used in this pilot, these tools can save additional resources and enable more comprehensive remote review.
Employing ePRO/eDiary in this study would also have yielded significant cost savings, as discussed below. If eDiaries had been used, together with eConsent (or certified copies of paper informed consents and subject diaries) and an eISF, the number of on-site visits could have been significantly decreased.
The importance of eSource and eConsent
eSource and eConsent provide several benefits for RBM and remote trial management. Most companies incorrectly assume that adding these tools requires a change to their EDC; in fact, eSource can be implemented using EDC with direct data entry or with a system designed for use on a tablet. The benefits include:
Using eSource provides significant cost savings. For the 500+ subjects in this study, using an eDiary would have eliminated sites entering 20,000 data points from memory aids (assuming 40 items per subject and five seconds of data entry per item), monitors visiting the sites to review those 20,000 data points (five seconds per item), and an estimated 500 queries (2.5% error rate, 15 minutes per query). This one change could have conservatively saved 179 hours of study staff time (over four weeks of work), not including costly monitor travel time or the increased frequency of visits required to review these critical data.
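For readers who want to trace the arithmetic, the short calculation below reproduces the estimate from the stated inputs (500 subjects, 40 items per subject, five seconds per item for entry and for review, a 2.5% error rate, and 15 minutes per query). It lands at roughly 180 hours, in line with the conservative 179-hour figure, with any small difference down to rounding.

```python
# Back-of-the-envelope reproduction of the eDiary savings estimate using the
# inputs stated in the text; minor differences from the published 179-hour
# figure reflect rounding.
subjects = 500
items_per_subject = 40
data_points = subjects * items_per_subject        # 20,000 diary data points

entry_hours = data_points * 5 / 3600              # site data entry, 5 s per item
review_hours = data_points * 5 / 3600             # on-site monitor review, 5 s per item
queries = data_points * 0.025                     # 2.5% error rate -> ~500 queries
query_hours = queries * 15 / 60                   # 15 minutes per query

total_hours = entry_hours + review_hours + query_hours
print(f"{data_points} data points: entry {entry_hours:.0f} h + review {review_hours:.0f} h "
      f"+ {queries:.0f} queries ({query_hours:.0f} h) = {total_hours:.0f} h "
      f"(~{total_hours / 40:.1f} weeks of work)")
```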
eConsent offers additional benefits as well.
Using certified copies of paper informed consents and paper subject source data, such as diaries, provides an intermediate alternative to eConsent and facilitates rapid remote review.
The importance of central review and trending
While MANA RBM remote site monitors found important deviations using subject review, the central review process was invaluable in identifying the critical findings discussed in this article.
Reviewing trends allowed the MANA RBM team to identify sites having problems with scheduling patient visits, dosing according to the protocol, methods for collecting vital signs, and rating differences. While not necessarily critical findings in isolation, these issues can affect trial outcomes if left to compound over time. Investigating critical data and process findings represents the core of RBM principles.
Oversight should be focused on “errors that matter,” which include processes in addition to analysis data. Trend analysis is critical because trends indicate systemic issues with those data and processes. These types of issues cannot be identified by SDV or even remote eCRF review. Only through using more scientific, data-driven, systematic approaches can important findings be identified, evaluated, and corrected.
Protocol-specific analysis
It is notable, for many reasons, that many RBM models still incorporate SDV as their method for quality oversight, albeit with fewer fields reviewed than under the 100% SDV standard that preceded the release of the FDA, EMA, and ICH guidances. One problem reported in the Kunzi et al. paper is echoed by others: monitors, although instructed to do less SDV, are concerned that they do not have a good grasp of the subjects when doing anything less than 100% SDV and will, therefore, perform 100% SDV regardless of the monitoring plan. This negated any anticipated RBM cost savings and required longer site visits.5
Our data conflict with the perceptions published by Kunzi et al., who reported that 58% of RBM-experienced monitors in Europe thought important protocol violations were missed when using RBM.5 The MANA Method remotely identified all critical deviations discovered by on-site monitors.
In addition, the MANA Method allowed the monitors to know exactly what the important data were and how to efficiently review all critical data in minutes, while providing more effective oversight than traditional SDV.
Sponsor opportunities
These data demonstrate the potential for enhanced trial oversight using remote, systematic, data-driven, analytic methods focused on the data that matter (i.e., data affecting trial analysis, subject safety, IP management, and human subject protection). These approaches use fewer resources at lower cost and can be adopted without increasing study budgets, and in many cases with lower study budgets. More importantly, trial quality is improved and sponsors know immediately about issues that can affect the study, study participants, and regulatory submissions.
Just as sound research methods are the hallmark of pharma, biotech, device, and vaccine discovery efforts, sponsors now have the opportunity and the responsibility to apply sound, quality-based research methods and tools to the clinical research they conduct. As clinical research professionals, it is our responsibility to embrace improved methods for quality oversight and not be complacent and continue to perform trials “as we have always done them.” Regulators, patients, and their physicians are counting on us.
The MANA Method is a proprietary, study-specific RBM approach performed remotely, independent of the EDC system used, and adoptable at any time during trial conduct. It was shown to systematically identify errors in trial conduct, subject safety oversight, and GCP compliance. The MANA Method identified critical errors in trial data and study conduct trends, within and across sites, more effectively than on-site SDV. This pilot study demonstrated that subject review could be started earlier and that overall resource use was lower than with traditional SDV on-site monitoring.
References
Penelope Manasco, MD, is CEO, MANA RBM; Eric Herbel is President, Integrated Clinical Systems, Inc.; Sean Bennett, MD, PhD, is Senior Director, Clinical Development and Medical Affairs, PaxVax, Inc.; Michelle Pallas is Director of Statistical Programming and Data Management, PaxVax; Lisa Bedell, MA, is Senior Director, Biostatistics, PaxVax; Deborah Thompson, MPH, is a consultant for MANA RBM; Kevin Fielman, PhD, is affiliated with MANA RBM; Garrett Manasco is a consultant for MANA RBM; Charlene Kimmel is affiliated with MANA RBM; Everett Lambeth is a consultant with MANA RBM; Lisa Danzig, MD, is Chief Medical Officer, PaxVax.