Applied Clinical Trials
Using a risk-based model to navigate the inherent changes and fluctuations in master protocol studies, and to help maintain data integrity throughout.
With recent reports noting that R&D returns in the biopharmaceutical industry have fallen to their lowest levels in nine years,1 life sciences companies need to evaluate the current R&D model and the way clinical trials are designed and implemented. One possible innovation to enhance R&D efficiency is master protocol studies. In FDA draft guidance issued in October 2018, a master protocol is defined as “a protocol designed with multiple sub-studies, which may have different objectives and involves coordinated efforts to evaluate one or more investigational drugs in one or more disease subtypes within the overall trial structure.”2
Common types of master protocol studies include:
- Umbrella trials, which evaluate multiple investigational drugs in a single disease population.
- Basket trials, which evaluate a single investigational drug across multiple disease populations or subtypes.
- Platform trials, which evaluate multiple investigational drugs against a common control on an ongoing basis, with treatment arms allowed to enter or leave the trial over time.
Master protocol studies can provide potential opportunities to shorten R&D timelines, reduce R&D costs, and improve the probability of success, provided they are designed and implemented properly. The uniqueness of the master protocol design is that it starts from a more open beginning and gradually adjusts toward the direction with the higher probability of success, for instance, by adding new treatment arms, changing the standard-of-care arm, adding or removing disease populations, or changing eligibility criteria. These adjustments are achieved through more frequent interim data reviews and decision-making that relies heavily on data currency and data quality. However, the master protocol design also dramatically increases implementation complexity because of the frequent substantial protocol amendments and/or protocol adaptations. As a result, the questions of what, why, when, where, who, and how become critical to answer.
Master protocol studies usually are large global studies and involve numerous countries and investigational sites that have different requirements and processes for reviewing protocols. The sites also have different processes and systems in place for patient data, labs, contracts, finance, etc. Besides sponsors and contract research organizations (CROs), many vendors are used in the study for clinical supplies, lab sample processing and analysis, and imaging data review. Sponsors, CROs, and vendors also use many different systems for data capture and analysis. Because of the numerous entities involved, these processes and systems may have significant data overlap and lack the necessary integration.
A master protocol study can change constantly. Each change is reflected in the downstream work activities conducted by various parties, including regulatory affairs, ethics committees, institutional review boards (IRBs), sponsors, CROs, vendors, and sites, all of which ultimately impact the patient. In the current R&D model, these activities are performed by different companies and organizations using different processes and systems. As a result, interoperability needs to be assessed, with protocol development and any amendments handled by the sponsor or a sponsor-contracted company or consultant.
To ensure data consistency and an integrated process flow, various process plans, data transfers, and reconciliation agreements are established, which takes significant time and effort to develop and implement at the start-up stage of a trial. In a master protocol study, these development and setup activities need to be conducted again whenever a protocol amendment or adaptation impacts the critical processes and data. That can make the work even more challenging than at initial setup because enrollment is ongoing and/or a high number of patients are still active in the study.
Compared to a master protocol study’s innate flexibility and adaptability, a traditional protocol model can seem inflexible, slow, and unclear. In this context, an orchestrated, cross-functional, risk-based approach to implementing master protocol studies addresses many of the concerns with a traditional study methodology. The principles established within ICH E6 (R2) provide opportunities to effectively address these challenges while ensuring human subject protection and the reliability of trial data are maintained throughout the life cycle of the clinical trial.
The implementation of a master protocol study starts with the development of a sound integrated quality and risk management plan (IQRMP) to de-risk the complexity by answering the questions about what, why, when, where, who, and how in relation to an evolving protocol. The IQRMP defines the actions each functional group and party will take to proactively identify, assess, and manage risk throughout the life of the master protocol and each sub-study. As important as predefined actions are to identified risks, an IQRMP also should have the agility and flexibility to adapt to the speed and frequency of the changes with master protocols since it’s impossible to foresee all the risks when the study is initiated. Therefore, rather than trying to predict a fixed future, the IQRMP serves as a tool to understand the data that will be available at each point to make real-time decisions as the study evolves.
The initial step in the development of an IQRMP is the risk assessment, using an instrument such as the risk assessment categorization tool (RACT). The RACT requires the identification of critical processes and data based upon the protocol and its endpoints and objectives. Those evaluations will evolve as the protocol design advances and connects with functional oversight plans, processes, and systems residing with various parties and stakeholders. This creates an ideal venue to orchestrate cross-functional and cross-party changes using a risk-based approach. The risk assessment process also ensures that ongoing risk reviews are performed throughout the life cycle of the study.
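To make the risk-scoring step concrete, the sketch below shows one way a RACT-style assessment could be tabulated in code. The category names, the 1-to-3 scales, and the multiplicative score are illustrative assumptions rather than the official tool, but they reflect a common risk-based quality management convention of ranking risks by probability, impact, and detectability.

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    """One entry in a RACT-style risk assessment (illustrative structure, not the official tool)."""
    category: str       # e.g., "primary endpoint data capture"
    probability: int    # 1 (low) to 3 (high) likelihood the risk occurs
    impact: int         # 1 (low) to 3 (high) effect on subject safety or data integrity
    detectability: int  # 1 (easy to detect) to 3 (hard to detect)

    @property
    def score(self) -> int:
        # Common convention: multiply the three dimensions to rank risks.
        return self.probability * self.impact * self.detectability

risks = [
    RiskItem("Primary endpoint data capture", probability=2, impact=3, detectability=2),
    RiskItem("IMP dispensing across treatment arms", probability=3, impact=3, detectability=2),
    RiskItem("Lab sample logistics", probability=2, impact=2, detectability=1),
]

# Review risks from highest to lowest score.
for risk in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{risk.category}: score {risk.score}")
```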
These regular reviews offer an opportunity for orchestrated, cross-functional engagement to safeguard important protocol direction decisions with ongoing risk reviews and mitigations. Due to the complexity of master protocol design, it can be valuable to separate out each sub-study from the master protocol while critical processes and data are identified as a component of the risk assessment. Once risks are identified for each sub-study, the data is consolidated to identify similarities and prioritize the risks that have the greatest impact on the overall master protocol, as sketched below. Another approach is to “break” the protocol based on what is certain and what is subject to change, then identify the associated critical data and processes according to these “time zones” to help prioritize, while remaining agile and flexible.
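As a rough illustration of consolidating per-sub-study risk registers, the following sketch (with hypothetical sub-study names, risk categories, and scores) groups risks by category and ranks them by the highest score observed and by how many sub-studies share them, mirroring the idea of prioritizing risks with the greatest impact on the overall master protocol.

```python
from collections import defaultdict

# Hypothetical per-sub-study risk registers: {sub_study: {risk_category: score}}
sub_study_risks = {
    "Sub-study A": {"Endpoint data capture": 18, "Eligibility criteria drift": 12},
    "Sub-study B": {"Endpoint data capture": 12, "IMP supply": 9},
    "Sub-study C": {"Eligibility criteria drift": 18, "IMP supply": 12},
}

# Consolidate: track the worst score per category and which sub-studies share it.
consolidated = defaultdict(lambda: {"max_score": 0, "sub_studies": []})
for sub_study, register in sub_study_risks.items():
    for category, score in register.items():
        entry = consolidated[category]
        entry["max_score"] = max(entry["max_score"], score)
        entry["sub_studies"].append(sub_study)

# Prioritize categories that are both severe and shared across sub-studies.
ranked = sorted(consolidated.items(),
                key=lambda kv: (kv[1]["max_score"], len(kv[1]["sub_studies"])),
                reverse=True)
for category, info in ranked:
    print(f"{category}: max score {info['max_score']}, shared by {len(info['sub_studies'])} sub-studies")
```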
As previously noted, to achieve the purpose of a master protocol study, data must be entered and cleaned in a timely manner. Data currency and quality directly impact the timeliness and quality of these important study direction decisions as well as the trial’s overall success. However, it’s extremely challenging to maintain high data currency and quality in such fast-paced, complex, large global studies. To help address the related challenges, ICH E6 (R2) allows for varied approaches with a focus on centralized monitoring.
Centralized monitoring is valuable for master protocol studies due to the speed of study execution, the frequent study adaptations and amendments, and the scope of changes in study implementation. Potential data integrity and subject safety concerns must be identified and investigated promptly. Centralized monitoring, inclusive of advanced data analytics, allows for large volumes of data to be reviewed more quickly than the traditional 100% onsite source data verification (SDV) monitoring approach. Focused, targeted monitoring activities can promptly identify data anomalies for additional root cause analysis and corrective actions by clinical research associates (CRAs).
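A minimal sketch of the kind of centralized analytic described above might look like the following, which flags sites whose adverse event reporting rate deviates notably from the study-wide average. The site names, rates, and the one-standard-deviation cutoff are assumptions chosen for illustration; a real implementation would draw on aggregated EDC data and more robust statistical methods.

```python
import statistics

# Hypothetical site-level metric from aggregated EDC data:
# adverse events reported per enrolled subject.
ae_rate_per_site = {
    "Site 101": 1.8,
    "Site 102": 2.1,
    "Site 103": 0.2,  # unusually low -> possible under-reporting
    "Site 104": 1.9,
    "Site 105": 2.4,
    "Site 106": 5.6,  # unusually high -> worth investigating
}

mean = statistics.mean(ae_rate_per_site.values())
sd = statistics.stdev(ae_rate_per_site.values())

# Flag sites deviating by more than one standard deviation (an illustrative cutoff)
# for CRA follow-up and root cause analysis.
for site, rate in ae_rate_per_site.items():
    z = (rate - mean) / sd
    if abs(z) > 1.0:
        print(f"{site}: AE rate {rate:.1f} per subject (z = {z:+.2f}) flagged for review")
```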
A risk-based monitoring (RBM) approach also brings enhanced quality control to master protocols via the use of key risk indicators (KRIs). KRIs are alerts, at specific sites or at the study level, triggered by atypical patterns in the data that indicate potential risks related to the conduct of the study and help confirm that the defined risk-mitigation techniques are effective. Establishing these alerts in a master protocol study provides a just-in-time mechanism to identify signals and trigger appropriate action plans. Due to the constantly changing nature of master protocol studies, it’s also important to ensure agility and flexibility are built into the KRIs and that they are reviewed at more frequent intervals than in traditional protocol designs. These RBM approaches help align precious time and resources more intelligently in accordance with the evolving protocol design and ongoing risk assessment.
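The sketch below illustrates how KRI thresholds might be expressed and checked programmatically; the KRI names, thresholds, and site metrics are hypothetical. Keeping thresholds in a simple configuration structure like this is one way to preserve the agility described above, since the values can be revised whenever the master protocol is amended or a sub-study changes.

```python
# Illustrative KRI definitions; names and thresholds are assumptions and would be
# revisited whenever the master protocol is amended or a sub-study is added or removed.
kri_thresholds = {
    "screen_failure_rate": 0.40,             # proportion of screened subjects failing eligibility
    "open_query_age_days": 21,               # average age of unresolved data queries
    "protocol_deviations_per_subject": 0.5,
}

# Hypothetical current metrics per site.
site_metrics = {
    "Site 201": {"screen_failure_rate": 0.55, "open_query_age_days": 12, "protocol_deviations_per_subject": 0.2},
    "Site 202": {"screen_failure_rate": 0.25, "open_query_age_days": 35, "protocol_deviations_per_subject": 0.7},
}

def kri_alerts(metrics, thresholds):
    """Return the names of KRIs that breach their threshold for one site."""
    return [name for name, value in metrics.items() if value > thresholds[name]]

for site, metrics in site_metrics.items():
    breached = kri_alerts(metrics, kri_thresholds)
    if breached:
        print(f"{site}: KRI alert(s) -> {', '.join(breached)}")
```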
In the evolving journey of the master protocol study, from its initial version to its final iteration, there are many interim timepoints at which data is reviewed and protocol direction decisions are made. An orchestrated risk-based model provides opportunities at these timepoints to establish cross-functional engagement and safeguard these important protocol direction decisions. Its structured framework, embedded in the life cycle of the study, helps de-risk the inherent protocol complexity and navigate changes. This model provides clarity to the earlier questions about what, why, when, where, who, and how, and establishes a continuous model that can be reused, in full or in part, as needed throughout the study.
Brian Barnes is Director, Risk-based Monitoring, PPD; Rachael Song is Associate Director, Project Management, PPD.