Regulatory Data: How Technology Can Help ClinOps Survive

Article
Applied Clinical Trials, December 2020
Volume 29, Issue 12

Clinical operations professionals, burdened by a lack of data standardization, turn to technology in hopes of streamlining regulatory processes for the future.

Joe Constance

As clinical trials grow in complexity, clinical operations (ClinOps) professionals are finding it challenging to keep pace and fulfill their oversight responsibilities. Their workloads are increasing like never before, and their resources are not keeping up with a tsunami of data, much of it unstructured. A central stream of data requests comes from their internal colleagues in regulatory, who need the data that regulatory bodies will eventually require, alongside requests from the regulators themselves.

Mike McLaughlin, Associate Director of Clinical Operations at Dermavant Sciences, Inc., says the major ClinOps issue is dealing with large amounts of data and a lack of standardization. “It was much simpler when we dealt with paper files and not electronic data,” he says. “Now we have to sift through and streamline the data and find the data that provides the most value.

“Different companies collect data in different formats. For example, there’s a recommended DIA file structure and format for the TMF, but not all companies file documents using the same file structure,” McLaughlin explains. “It is more difficult to review the TMF if the documents are not structured and formatted correctly. Time-consuming manual labor is needed to correctly structure and format the documents. Software could speed this task.”


Karen Roy, Co-Chair of the DIA-founded Trial Master File (TMF) Reference Model and Chief Strategy Officer at Phlexglobal, a technology and services organization for clinical and regulatory matters, says: “I think the biggest challenge is that the regulators want to see the detail behind the documentation, including the different systems and audit trails that back up the documentation and (ClinOps) activities. It’s no longer just a matter of providing a few documents to review. This has made the TMF much bigger because people don’t know exactly what they should be collecting or presenting to regulators. They collect way more than they used to, and sometimes more than what’s needed. Mandates have made it harder to keep up from a regulatory perspective.”

“There’s pressure on clinical operations to produce data in a consistent, manageable, and compliant structure for their colleagues in regulatory affairs and operations to streamline submissions. But there are different mandates to follow, such as the Electronic Common Technical Document (eCTD) for the FDA. It’s challenging, especially when you have mandates within mandates creating duplicate work,” adds Jim Nichols, Co-Chair of DIA’s Regulatory Information Management Committee and Chief Product Officer at Phlexglobal. Roy and Nichols add that sometimes the mandates are not clear, leaving ClinOps in the lurch.

Poor metadata compliance

Paul Fenton, President and CEO at Montrium, believes there has been poor compliance with registering metadata against documents because the process is very labor-intensive and ultimately, “We don’t fully leverage the metadata as an information source. We can scan documents using AI and identify a broader range of metadata that could be extracted from those documents and used to provide a richer base of information about particular types of documents, a particular trial site, or a particular process,” he states.

Several companies are developing AI solutions to recognize documents and metadata, make them more structured, and perform automated extraction and indexing of information. Fenton expects to see these types of systems, which are much more integrated and process-driven, with AI and machine learning tools built in, within the next two to three years.

In addition, development of a forms management solution, which would be part of the TMF, would allow companies to collect and add data in a much more structured way directly into a database, according to Fenton. “When you initiate a site, there are many forms that must be completed, which provide basic information about that site, like FDA Form 1572. These forms could be filled out electronically online instead of being uploaded as a document, which would make the data easier to integrate and exploit,” he says.
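
As a rough illustration of the difference, the sketch below captures a handful of Form 1572 fields as a structured record that can be stored and queried directly; the field names and storage are simplified assumptions, not any particular vendor’s schema.

    from dataclasses import dataclass, asdict
    import json
    import sqlite3

    @dataclass
    class Form1572Record:
        """Illustrative subset of FDA Form 1572 fields, captured as data."""
        site_id: str
        investigator_name: str
        facility_name: str
        protocol_number: str
        date_signed: str  # ISO 8601 date

    record = Form1572Record("SITE-001", "Dr. Jane Doe",
                            "Example Research Center", "PROTO-42", "2020-11-30")

    # Stored as a row rather than uploaded as a scanned PDF, the same
    # information can be queried, validated, and integrated with other systems.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE form_1572 "
                 "(site_id, investigator_name, facility_name, "
                 "protocol_number, date_signed)")
    conn.execute("INSERT INTO form_1572 VALUES "
                 "(:site_id, :investigator_name, :facility_name, "
                 ":protocol_number, :date_signed)", asdict(record))
    print(json.dumps(asdict(record), indent=2))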

Fenton chairs a sub-group of DIA’s TMF Reference Model that has developed a standard for exchanging TMF information between systems, the eTMF Exchange Mechanism Standard. The group has started to define areas such as standard metadata, a step toward having more standardized information about operational documentation.

“There’s a lot of variability in terms of the information we have to manage,” Fenton continues. “There are probably around 500 different types of documents collected in a TMF, which all have their own format, and formats can change from one company to the next. Standardization will help, and machine learning could help unlock the gold mine that all of that information represents.”

Jim Reilly, Vice President, Vault R&D, Veeva Systems, a provider of cloud solutions, says, “Why can’t we use AI in the form of natural language processing or machine learning to examine content as it comes into a TMF and automatically classify and attribute metadata to the documents so that we don’t have to spend much resource time reviewing material that’s already been received? This would be a tremendous cost and time saver.”
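
A toy version of the auto-classification Reilly describes can be built with off-the-shelf tools. In the sketch below, the artifact labels and training snippets are invented; a real system would be trained on large labeled TMF corpora and route low-confidence predictions to a human reviewer.

    # Toy TMF document classifier; labels and training texts are illustrative.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    training_texts = [
        "statement of investigator form fda 1572 site address",
        "curriculum vitae education board certification publications",
        "protocol amendment inclusion exclusion criteria endpoints",
        "monitoring visit report findings action items site staff",
    ]
    training_labels = ["Form 1572", "CV", "Protocol", "Monitoring Visit Report"]

    classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
    classifier.fit(training_texts, training_labels)

    incoming = "signed statement of investigator 1572 listing the clinical site"
    predicted = classifier.predict([incoming])[0]
    confidence = classifier.predict_proba([incoming]).max()
    print(f"Suggested artifact: {predicted} (confidence {confidence:.2f})")
    # Low-confidence predictions would be queued for human review.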

The CRO/sponsor data flow


Although it requires more effort on the part of sponsors and CROs, McLaughlin thinks it is important that the FDA wants more sponsor oversight and is placing more emphasis on quality. “The current guidance (ICH E6 (R2)) is more specific than in the past. They’re driving home quality and risk assessment to make sure that clinical trials are performed with care.”

In the guidance document, the FDA updates and further outlines responsibilities of institutional review boards (IRBs), investigators, and sponsors based on “evolutions in technology and risk management… to increase efficiency and quality.” The goal is to improve clinical trial design, conduct, oversight, human subject protection, and the reliability of trial results.

“Sponsors must review documentation on a regular basis, ensuring that CROs are following good clinical practice. As sponsors, we’re responsible for making sure that the obligations that you’ve transferred over to the CRO are being fulfilled,” McLaughlin notes.


“It’s incumbent upon sponsors to be accountable for their trial execution even when they outsource. So, it’s important for them to have a mechanism by which they can accurately receive information from their service providers. Technology can have a significant play here,” explains Reilly.

He thinks that technology needs to “up its game” and provide a consistent mechanism that enables CROs and sponsors to share and exchange information, driving better trial oversight. Part of this involves better, more consistent operational data standards, but he also believes there is a need for technologies with open connection points, networked so data can flow between CROs and sponsors and be better understood by the receiver.
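
What such an open connection point might look like from the sponsor’s side is sketched below; the endpoint, parameters, and response fields are hypothetical, not any vendor’s actual API.

    # Hypothetical pull of TMF document metadata from a CRO's open API.
    # The URL, token, and response fields are illustrative only.
    import requests

    CRO_API = "https://cro.example.com/api/v1/tmf/documents"  # hypothetical

    response = requests.get(
        CRO_API,
        params={"study_id": "PROTO-42", "updated_since": "2020-11-01"},
        headers={"Authorization": "Bearer <sponsor-api-token>"},
        timeout=30,
    )
    response.raise_for_status()

    for doc in response.json()["documents"]:
        # With shared metadata standards, the sponsor system can file each
        # record automatically instead of re-classifying it by hand.
        print(doc["artifact"], doc["site_id"], doc["document_date"])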

Quality and AI principles

There is discussion of using artificial intelligence (AI), as in many areas of pharma, to build quality into unstructured and structured documentation that becomes part of the TMF. “Regulators are increasingly supporting the use of Quality by Design (QbD) principles in clinical trials. Risk-based quality control is a key component of QbD,” Roy explains.

For Fenton, addressing risk management and taking the risk-based approach pushed by regulators is important. But clinical trials contain a vast amount of information.

“A clinical site might be higher risk because they’re recruiting a lot of patients, or there may be many safety cases at that site, or delays in providing information that can impact risk. We should be able to flag groups of documents or information that should receive additional scrutiny, QC, or verification because they’re considered an inspection risk. Machine learning could improve the risk models over time,” explains Fenton.
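
A heuristic of the kind Fenton describes might start as simply as the sketch below; the signals, weights, and threshold are invented for illustration, and machine learning would tune them against historical inspection findings.

    # Toy site-risk score combining the signals Fenton mentions: enrollment
    # volume, open safety cases, and average document-submission delay.
    # Weights and threshold are invented; ML could refine them over time.
    def site_risk_score(enrolled: int, safety_cases: int,
                        delay_days: float) -> float:
        return 0.02 * enrolled + 0.5 * safety_cases + 0.1 * delay_days

    sites = {
        "SITE-001": (120, 3, 12.0),
        "SITE-002": (15, 0, 2.5),
        "SITE-003": (80, 6, 20.0),
    }

    RISK_THRESHOLD = 4.0
    for site_id, metrics in sites.items():
        score = site_risk_score(*metrics)
        if score >= RISK_THRESHOLD:
            print(f"{site_id}: score {score:.1f}, flag documents for extra QC")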

While AI can be used to check documentation for quality and to help with risk assessment, there is also a need to collect data from disparate sources more efficiently and to unify the data collection process. “Companies haven’t been the best at building quality into documentation from the beginning and putting processes, such as the ALCOA framework, into their systems,” states Roy. She suggests adding ALCOA and other quality processes to source systems and then using AI to analyze the documents and transfer them to the centralized TMF, which would be more efficient.
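
One reading of building ALCOA into a source system is automated completeness checks at the point of capture, as in the sketch below; the field names and checks are illustrative assumptions covering only a few of the ALCOA attributes.

    # Minimal ALCOA-style checks at the point of capture: Attributable
    # (author), Contemporaneous (timestamp), Original (source reference).
    # Field names are invented for illustration.
    def alcoa_gaps(record: dict) -> list[str]:
        gaps = []
        if not record.get("author"):
            gaps.append("not attributable: missing author")
        if not record.get("captured_at"):
            gaps.append("not contemporaneous: missing capture timestamp")
        if not record.get("source_system"):
            gaps.append("original unverifiable: missing source system")
        return gaps

    record = {"author": "J. Doe", "captured_at": None, "source_system": "EDC"}
    for gap in alcoa_gaps(record):
        print("Quality gap:", gap)  # resolve before the record reaches the TMF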

It’s also important to map out what technology can do. Technology can bring things together and indicate knowledge and data gaps. But it won’t tell you which documents are needed unless it is directed to that need; that is a skilled task requiring the expertise of the ClinOps team or outside experts, adds Roy. And that subject matter expertise can be built into the technology to establish a risk-based approach to checking documentation.

Other AI approaches

McLaughlin is hopeful that advanced software, such as AI and machine learning, will remove some of the burden from ClinOps personnel, for example by helping to structure unstructured data, optimize documentation in the TMF, and keep trial participants compliant with their medications. But he thinks the technology still requires additional commercial exposure before it can fully demonstrate its usefulness in clinical trial settings.


“ClinOps people can do many tasks but they are struggling to keep pace with regulatory requirements,” Nichols explains. “Compliance is taking up more time and effort from trained people to do manual, repetitive tasks. These tasks are tailor-made for AI technologies, such as machine learning.”

To connect the many moving parts of a trial, including its many different study events, Fenton suggests that a standard process model could be established, defining all of the processes that occur in a clinical trial, the events that trigger them, and the data required to describe them. This model would then be used to pull information from different sources.

“Technology can help by leveraging this model. But the model needs to be defined by people who know about clinical processes and by the regulations,” Fenton says. “We can improve the model over time by making connections between different processes and events as well as looking at historical data through machine learning and AI.”
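
Expressed as data, such a process model might begin like the sketch below; the process names, trigger events, and required artifacts are illustrative only.

    # Toy process model: each trial process, the event that triggers it,
    # and the documents expected to describe it. Entries are illustrative.
    from dataclasses import dataclass

    @dataclass
    class TrialProcess:
        name: str
        trigger_event: str
        required_documents: list

    PROCESS_MODEL = [
        TrialProcess("Site initiation", "site contract signed",
                     ["Form 1572", "Site Initiation Visit Report", "CVs"]),
        TrialProcess("Safety reporting", "serious adverse event recorded",
                     ["SAE Report", "Safety Notification to Sites"]),
    ]

    def expected_documents(event: str) -> list:
        """Given an observed event, list the documents the TMF should receive."""
        return [doc for p in PROCESS_MODEL if p.trigger_event == event
                for doc in p.required_documents]

    print(expected_documents("serious adverse event recorded"))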

Another related technology trend that Alok Tayi, PhD, Vice President of Life Sciences at Egnyte, sees is the emergence of natural language processing (NLP) in the clinical trial space. “NLP promises to automate rote tasks and improve data integrity by helping staff extract key data from study reports, CVs, and other documents,” explains Tayi. Recent advancements in NLP enable software to perform cognitive tasks, like document classification, data extraction, and even summarization. Tayi believes that these novel capabilities, integrated into the trial workflow, will allow trials to process new data quickly, check for inaccuracies, and extract insights.
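
The extraction step Tayi describes can be approximated with a pretrained entity recognizer such as spaCy’s, as sketched below; a production pipeline would use models tuned to clinical documents, and the CV text here is invented.

    # Sketch of NLP extraction from a document using spaCy's general-purpose
    # model (requires: python -m spacy download en_core_web_sm).
    import spacy

    nlp = spacy.load("en_core_web_sm")
    cv_text = ("Dr. Jane Doe received her MD from Example University in 2004 "
               "and has served as principal investigator since March 2015.")

    doc = nlp(cv_text)
    for ent in doc.ents:
        # e.g., PERSON -> Jane Doe, DATE -> 2004; candidates to pre-fill
        # metadata fields for human confirmation.
        print(ent.label_, "->", ent.text)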

Reilly is encouraged by how the FDA and other regulators are piloting initiatives that can make the review process less difficult. The agency’s Real-Time Oncology Review Pilot Program aims to explore a more efficient review process in which companies submit top-line results as they become available, speeding the review.

“In many ways they’re pushing along the data directly to the agency rather than compiling it in a big package. Technology can help with modern submission tools and mechanisms where data can be compiled in a normalized sense and pushed into the regulatory system. A more hardened connection between clinical and regulatory can make a real-time submission process more efficient,” Reilly explains.

AI and data privacy


Tayi expects AI algorithms will help the industry address key data privacy issues. AI can help sponsors, sites, and CROs monitor their data for sensitive information and ensure compliance in an automated fashion. For example, technology can ensure that consent documents are properly classified, or that data managers are alerted to personally identifiable information (PII) embedded within the metadata of CT imaging. “But we need clarity on how to train the models to help identify data as well as very specific regulations to which we can tie the data,” Tayi notes. He also sees AI helping structure data and metadata for the TMF.
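
As one illustration of that kind of automated check, the sketch below scans a CT file’s DICOM header for identifying attributes using the open-source pydicom library; the tag list is a small illustrative subset, and the file path is hypothetical.

    # Scan a DICOM CT file's metadata for identifying attributes and alert
    # a data manager. The keyword list is a small illustrative subset.
    from pydicom import dcmread

    PII_KEYWORDS = ["PatientName", "PatientID", "PatientBirthDate",
                    "PatientAddress", "InstitutionName"]

    def find_pii(dicom_path: str) -> dict:
        ds = dcmread(dicom_path, stop_before_pixels=True)  # headers only
        return {kw: str(ds[kw].value) for kw in PII_KEYWORDS
                if kw in ds and ds[kw].value}

    hits = find_pii("ct_series_001.dcm")  # hypothetical file path
    if hits:
        print("Alert data manager: PII found in imaging metadata:", hits)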

In addition, as sponsors and CROs consider running new types of trials, such as virtual clinical trials, data privacy risks are heightened. According to Tayi, institutions will need automation and AI software that ensures that the right people have access to the right data, without the overhead and burden to the team.

Joseph Constance, Contributing Writer
