Ensuring Proper Data Collection


In this video interview with ACT editor Andy Studna, Erin Erginer, director of product at Certara, touches on the burden associated with data collection.

In a recent video interview with Applied Clinical Trials, Erin Erginer, director of product at Certara, discussed the growing complexity and volume of data in clinical trials. Data is now captured from biospecimens and digital health technologies, including advanced genetic and imaging data. Regulatory agencies require standardized data submission for analysis, but they do not dictate collection methods, leaving pharma companies to develop their own standards. This lack of standards leads to inefficiencies, with only 20% of studies meeting deadlines, causing significant delays and costs. Improper data collection can lead to missed endpoints and regulatory issues. Collaboration among pharma companies and external vendors is crucial for improving data standards and trial efficiency.

A transcript of Erginer’s conversation with ACT can be found below.

ACT: Since there are no standards for data collection, how much burden does this put on stakeholders in a trial to properly collect it?

Erginer: I think there's a significant burden on the different stakeholders, especially during study startup. To confirm that you're collecting everything you need at the beginning of the study really requires a lot of input from the different stakeholders, both internally and from the external lab vendors. You need to pull in the clinical research team, who have expertise in the therapeutic area being studied. You need to pull in the different programmers who will be transforming that data throughout the life cycle of the study. You need to pull in the biostatisticians who are going to be doing all of that complicated math to pull out the results. Bringing all those people together at the beginning of the study in a centralized way is definitely difficult. Without industry-wide standards, it really comes down to individual pharma companies aligning and creating their own specialized knowledge groups that can tackle this study by study, and hopefully eventually create standards they can use globally across all of their studies.

It really comes down to the fact that the better control and the better standards you have, end to end in that process, the faster you can get from data collection all the way to the data structures that are used for analysis and reporting. The earlier you can do that in a study, the faster you can get analysis results and make those really critical go/no-go decisions. Having those results early on in the study means you can make changes to the protocol if you need to and adjust based on some of the results you're seeing.

Another part of that, too, is that there are going to be some tradeoffs in the way that's done. There are two approaches that industry organizations are looking at right now. One is creating their own standards that are more closely aligned with the submission standard; that gives you an easier path from the data collection space into the SDTM submission standard. The other option is to standardize based on whatever is coming in from the data provider, which makes it faster to get the data into the trial so you can do analysis. The best approach is really to do a combination of both, or to work flexibly, to make sure you can reduce all the downstream issues that happen when you don't have good control at the beginning.
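To make the first tradeoff Erginer describes more concrete, here is a minimal sketch (not from the interview, and not Certara's implementation) of what "aligning collection with the submission standard" can look like in Python: a hypothetical vendor lab record is normalized into a simplified, SDTM-like LB (laboratory) domain row. The vendor field names, the mapping table, and the reduced set of LB variables are illustrative assumptions, not the full CDISC SDTM specification.

```python
# Illustrative sketch only: normalize a hypothetical vendor lab record into a
# simplified SDTM-like LB (laboratory) domain row. The vendor field names and
# the variable subset are assumptions, not the full CDISC SDTM specification.

# Hypothetical mapping from vendor test names to standardized codes and names.
VENDOR_TEST_MAP = {
    "HGB": ("HGB", "Hemoglobin"),
    "GLUC_FASTING": ("GLUC", "Glucose"),
}

def to_lb_row(vendor_record: dict, study_id: str) -> dict:
    """Map one vendor lab record to a simplified LB-like row."""
    testcd, testname = VENDOR_TEST_MAP[vendor_record["test_name"]]
    return {
        "STUDYID": study_id,
        "DOMAIN": "LB",
        "USUBJID": f"{study_id}-{vendor_record['subject']}",
        "LBTESTCD": testcd,                      # short test code
        "LBTEST": testname,                      # full test name
        "LBORRES": str(vendor_record["value"]),  # result as originally received
        "LBORRESU": vendor_record["unit"],       # original units
        "LBDTC": vendor_record["collected_on"],  # ISO 8601 collection date
    }

if __name__ == "__main__":
    # Hypothetical record in the shape a lab vendor might deliver it.
    raw = {
        "subject": "0042",
        "test_name": "HGB",
        "value": 13.2,
        "unit": "g/dL",
        "collected_on": "2024-03-15",
    }
    print(to_lb_row(raw, study_id="ABC-101"))
```

Under the second approach Erginer mentions, records would instead be kept in the vendor's own shape at collection, and a mapping like this would be deferred to a downstream SDTM conversion step.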
