An expert view on how sponsors can formalize the use of real-world data and the generation of real-world evidence to drive critical insights.
Clinical trials tell us about the safety and effectiveness of drugs and therapies in carefully defined environments with carefully selected participants. But how will those products perform in the full spectrum of medical use in an imperfect world? To answer that question, we’re seeing a lot of interest in real-world data: data that represents much larger populations and broader eligibility criteria, drawn from sources such as health insurance claims and electronic medical records.
Real-world evidence helps identify which patients will benefit the most, based on biological, social, and lifestyle attributes that might not be captured in clinical trials. It provides a clearer picture of a product’s safety, effectiveness, economics, and value in day-to-day use. And it offers a deeper understanding of epidemiology trends and disease management, resulting in better diagnostics and treatment paths.
However, real-world data can be massive, messy, and diverse, and most life sciences organizations aren’t fully prepared to deal with it. Analytics systems are generally a patchwork of products and tools that don’t speak to each other. It’s hard to find data scientists who understand the intricacies and caveats of the data sources. Queries are complex to write and slow to run, a problem often compounded by different groups unknowingly duplicating each other’s work.
As researchers look for more predictive insight from huge streams of data, traditional approaches are no longer sufficient. It’s time for life sciences organizations to formalize the platform and processes they use to create, govern, share, and reuse real-world data to drive critical insights.
The essential foundation
To formalize the management of real-world data and generation of real-world evidence, organizations must have six foundational capabilities:
1. A unified data architecture simplifies IT’s role and ensures that all functional groups, such as epidemiology, commercial, and R&D, are working from the same data.
2. Moving data from a dedicated processing appliance ($20,000 to $30,000 per terabyte) to high-performance distributed computing (Hadoop, about $4,000/terabyte) saves at least $800,000 for every 50 terabytes of data, even at the low end of the appliance range (see the cost arithmetic sketched after this list).
3. While there will likely never be one common data model, tools can simplify and automate the processes needed to transform and standardize data, regardless of the source and target systems (see the standardization sketch after this list).
4. Well-governed data management ensures that data transformations occur the same way each time, that only the right people access the data, and that data processes are maintained in a structured way.
5. Once all data sources are mapped to a common data model, cohorts can be defined in a consistent, repeatable way and their definitions reused across projects.
6. Templates for analytic use cases, such as signal detection, can be created and shared, which helps accelerate adoption, while more sophisticated use cases can be built on top of them. There’s no need to reinvent the wheel.
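To make the cost comparison in item 2 concrete, here is a minimal sketch of the arithmetic, using the per-terabyte figures quoted above and the low end of the appliance range:

```python
# Worked arithmetic for item 2, using the figures quoted in the text.
appliance_cost_per_tb = 20_000   # low end of the $20,000-$30,000/TB appliance range
hadoop_cost_per_tb = 4_000       # quoted distributed-computing figure
data_tb = 50

savings = (appliance_cost_per_tb - hadoop_cost_per_tb) * data_tb
print(f"Savings on {data_tb} TB: ${savings:,}")  # Savings on 50 TB: $800,000
```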
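And as a hypothetical illustration of item 3, the sketch below normalizes records from two differently shaped sources into one common schema. All of the field names (member_id, mrn, and so on) are invented for the example, not drawn from any particular data model:

```python
# Hypothetical example of standardizing two source formats into one
# common schema; every field name here is an illustrative assumption.
def to_common_model(record: dict, source: str) -> dict:
    if source == "claims":
        return {"patient_id": record["member_id"],
                "code": record["icd10"],
                "date": record["service_date"]}
    if source == "emr":
        return {"patient_id": record["mrn"],
                "code": record["diagnosis_code"],
                "date": record["encounter_date"]}
    raise ValueError(f"unknown source: {source}")

# Records from both sources land in the same analysis-ready shape.
print(to_common_model({"member_id": "A1", "icd10": "E11",
                       "service_date": "2023-01-05"}, "claims"))
```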
Taking real-world evidence to the next level
Standardization and reuse of data transformations and analytics, combined with advanced ad hoc analytics capabilities, make real-world evidence generation faster, more consistent, repeatable, intuitive, and powerful.
Standard, customizable cohort builder
If you use a cohort builder, your choices have traditionally been a) easy-to-use tools that don’t do much, or b) sophisticated tools that require serious technical expertise.
The key for cohort builders today is to strike a balance between the two. Having an intuitive visual interface can support those who understand the population but don’t necessarily know how to code, and can guide them through the process of specifying the criteria of interest. Such tools also support the complex query logic often required for these types of projects, such as multiple events and temporal relationships between activities or events in a patient’s history.
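As a rough illustration of that kind of temporal logic, the sketch below defines a cohort of patients who start a drug within 90 days of a qualifying diagnosis. The tables, codes, and column names are invented for the example; a real cohort builder would generate logic like this against a common data model behind the scenes:

```python
import pandas as pd

# Minimal sketch of temporal cohort logic on hypothetical claims-style
# tables; all names and codes are illustrative only.
diagnoses = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "dx_code": ["E11", "E11", "I10"],
    "dx_date": pd.to_datetime(["2023-01-05", "2023-03-10", "2023-02-01"]),
})
prescriptions = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "drug": ["metformin", "metformin", "lisinopril"],
    "rx_date": pd.to_datetime(["2023-02-01", "2023-09-01", "2023-02-15"]),
})

# Cohort: patients with a type 2 diabetes diagnosis (E11) who start
# metformin within 90 days of that diagnosis.
merged = diagnoses.query("dx_code == 'E11'").merge(
    prescriptions.query("drug == 'metformin'"), on="patient_id"
)
gap = (merged["rx_date"] - merged["dx_date"]).dt.days
cohort = merged.loc[gap.between(0, 90), "patient_id"].unique()
print(cohort)  # [1]; patient 2 started metformin too late to qualify
```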
Traditional and visual analysis
A visual analytics interface empowers non-technical users to do their own ad hoc exploration and streamlines the work of technical users. As a statistician, I can apply models, perform regression analysis, and so forth in a visual tool that’s working in memory. So now I can fit models to my large data very rapidly, and I can home in on which variables are important or of most interest. Then I can take that insight and do a more detailed, perhaps hand-coded, analysis of the data.
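Here is a minimal sketch of that workflow on synthetic data, with scikit-learn standing in for the in-memory tool: fit a quick model, see which variables stand out, then go deeper on those.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Quick in-memory exploration: fit a model, inspect which variables
# look important, then decide what to analyze in depth.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))                      # five candidate predictors
y = 2.0 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(size=10_000)

model = LinearRegression().fit(X, y)
for i, coef in enumerate(model.coef_):
    print(f"x{i}: {coef:+.2f}")   # x0 and x3 stand out; the rest sit near zero
```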
Advanced analytics for deeper insights
Advanced analytics changes the story from hindsight to insight and then foresight, which is where organizations really need to be.
Unlike hypothesis-driven research, machine learning uses automated model building to adapt to what’s happening in a population and finds things a human might not have thought to search for. With every iteration, the algorithms get smarter and deliver more accurate results. These methods have the potential to identify groups of patients who will benefit the most from, or potentially be harmed by, a therapy.
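As a toy illustration of the idea (not any particular product’s method), the sketch below trains a model on synthetic data in which response depends on an interaction between age and a biomarker. The model surfaces those two variables among the noise without anyone specifying the interaction up front:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy subgroup discovery on synthetic data: response actually depends on
# an age-biomarker interaction, alongside three pure-noise variables.
rng = np.random.default_rng(1)
n = 5_000
age = rng.uniform(40, 80, n)
biomarker = rng.normal(size=n)
noise = rng.normal(size=(n, 3))
responded = ((age < 60) & (biomarker > 0)).astype(int)

X = np.column_stack([age, biomarker, noise])
names = ["age", "biomarker", "noise1", "noise2", "noise3"]
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, responded)

# Feature importances flag age and biomarker as the drivers of response;
# the noise variables score near zero.
for name, imp in zip(names, clf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```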
Add value to your existing data investments
These foundational capabilities and enhancements translate into real value for life sciences organizations, especially considering the investments already made in the data sources themselves.
Managing real-world evidence in a well-structured, easy-to-use, and repeatable way is not just about wrapping up more projects in less time at less cost. It’s about gaining insights to make decisions that deliver real value for your company and the patients who rely on your discoveries.
Robert Collins is Senior Life Sciences Industry Consultant with SAS.