Today’s convergence of health data and technology is fun to watch unfold, but a focused vision for system and process integration, and a framework for continuous enrichment, will be key to lasting gains for clinical research.
The data explosion in clinical research is very real; there is no denying that. Industry attention and focus in this area, as a legitimate force for change, are so high these days that it can be dizzying to keep track of it all (we managed to catch our balance to highlight some of the latest thinking in our June/July issue on eSource and data integration). These days, a biopharma conference, an industry webcast, or even a regulatory forum more often than not features a headlining theme on new data technology trends and strategies in drug development. Big data, real-world data, electronic health records, wearables, advanced analytics, remote patient monitoring: you name it.
These are indeed exciting times to dream big about how these new data tools and process improvements could revolutionize clinical trials, and, by extension, help usher in the more personalized, so-called value-based treatments that are central to changes now taking place at the healthcare level.
But before we get too mesmerized by the spectacle, a prudent, pragmatic lens is also warranted. R&D aficionados and those plugged into the day-to-day operations of clinical research will likely agree that perhaps the biggest issue in managing this data convergence in the life sciences is connecting reality with the conceptual promise of these “next-generation” information tools and processes.
“Like so many industries, the drug development enterprise tends to approach the use of technologies with this tremendous exuberance that is so far removed from the realities,” says Ken Getz, the longtime chairman of CISCRP and director of sponsored research at the Tufts Center for the Study of Drug Development. “This exuberance comes without really thinking through the kind of long and painful journey that’s going to be required by all of the workforce within drug development to change legacy processes and systems. Some of that will require cultural change and philosophical change in how we approach development activities.”
Given those challenges, as well as the gaps in data standards and governance that still exist, crafting a practical vision for how researchers can capitalize on new opportunities in data use, whether involving collection, management, reporting, or sharing, will be crucial. As part of a panel discussion at this month’s DIA Annual Meeting in Philadelphia (entitled “The Future of Clinical Research Data: 2020 and Beyond”), Getz and others will explore such a vision and the steps and mindset necessary to truly harness data to improve the research process of tomorrow. Here are more excerpts from my recent conversation with Getz on the topic.
ACT: What do you feel are the most compelling trends at the moment in the area of clinical research data?
Getz: I think the biggest trend is the big data movement: the use of really large datasets on millions of patients, and even the use of real-world data in clinical research and in pharmacovigilance, for example. There are so many sources of structured and unstructured data: data coming from large electronic health record and electronic medical record platforms, from wearable devices, and from mobile devices and applications. How do you gather and integrate all that data? What gets me most excited is how you then use that data. Can we use that data to identify targeted patient subpopulations? Can we use that data to really inform us while the study is underway, or at different critical points, so that we have a new understanding of data safety or we identify efficacy or safety patterns in the data? How can that data be constantly used to enrich our insight? Not only insight into treatments and their effectiveness, but also, from a management standpoint, insight that helps us manage our clinical research more effectively.
Then there’s a lot of interest in using this data to predict performance: to select a more effective group of investigative site partners, for example, or to predict patient recruitment and retention rates. That whole big data movement sounds so cliché, but it continues to make great progress and hold tremendous promise.
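To make this concrete, here is a minimal sketch, in Python, of the kind of integration Getz describes: merging structured, EHR-style records with a summarized wearable data stream and flagging a candidate patient subpopulation. Everything in it, from the field names to the thresholds, is hypothetical and for illustration only; it is not drawn from any actual study or system.

```python
import pandas as pd

# Hypothetical structured EHR-style records (one row per patient).
ehr = pd.DataFrame({
    "patient_id": ["P001", "P002", "P003", "P004"],
    "age": [54, 67, 45, 72],
    "diagnosis": ["T2DM", "T2DM", "HTN", "T2DM"],
})

# Hypothetical wearable readings (many rows per patient over time).
wearable = pd.DataFrame({
    "patient_id": ["P001", "P001", "P002", "P003", "P004", "P004"],
    "resting_hr": [88, 91, 72, 65, 95, 99],
})

# Summarize the high-frequency stream into one feature per patient,
# then join it onto the structured record.
hr_summary = (
    wearable.groupby("patient_id", as_index=False)["resting_hr"]
    .mean()
    .rename(columns={"resting_hr": "mean_resting_hr"})
)
merged = ehr.merge(hr_summary, on="patient_id", how="left")

# Flag a candidate subpopulation: older T2DM patients with an elevated
# mean resting heart rate. The cutoffs are arbitrary placeholders.
subpop = merged[
    (merged["diagnosis"] == "T2DM")
    & (merged["age"] >= 60)
    & (merged["mean_resting_hr"] > 85)
]
print(subpop[["patient_id", "age", "mean_resting_hr"]])
```

The hard part Getz points to is everything this sketch assumes away: in practice, the joins, identifiers, and data quality across EHR platforms, devices, and applications are rarely this clean.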
ACT: What about mHealth, specifically? Can we measure its impact on clinical care yet?
Getz: A lot of companies are piloting the use of select wearable devices and the use of smartphone and mobile applications. The biggest challenge is that some of these technologies are not validated, so some of the biometric data they are collecting technically can’t be used in a submission. There are issues with some of the technologies, but a growing number of organizations are finding ways to integrate these tools with really large datasets of patient records. I think we’re going to see a lot of progress in this area over the next 18 months.
ACT: How do you view the topic of data transparency in this whole equation? You hear a lot about initiatives in the industry promoting data sharing, whether the sharing of patient genomes in the precompetitive space or the sharing of clinical trial data itself. What does all the discussion and activity mean right now for R&D practice?
Getz: Transparency, and I would add integration, which is central to that, are critical both within the enterprise at an operating level and obviously essential to partnering with patients and healthcare providers and payers. On the drug development side, we see that so much of this data and the technologies that are being used are siloed. They’re not integrated; they have really poor transparency. That’s requiring a lot of manpower to manage; it’s highly inefficient. Transparency and integration can play a huge role in breaking down those silos from a drug development operating standpoint.
With patients and healthcare providers and payers, there are huge ethical issues around the use of patient data and making sure that its use is entirely transparent. That’s essential to building trust with the patient community and with healthcare providers and payers. It’s also essential to ultimately best serving patients’ needs by creating what the FDA has called a learning environment, where clinical practice and clinical research communicate with each other in real time and in an integrated way, so that constant feedback and insight flow from both domains. Integration and transparency are areas where we really have to adhere to the highest principles and ensure that we’re honoring and meeting the obligation to provide unprecedented levels of transparency and disclosure.
ACT: Any thoughts on efforts in the U.S. and Europe to promote data transparency and require public disclosure of clinical trial results?
Getz: With a lot of these technologies, the patient isn’t even aware that data is being collected. Many of these applications now collect a time and geographical stamp on a person while they’re carrying out their day-to-day activities. If patients were fully aware of how some of that information is being used, many would be concerned; at the very least, they should know about, and give their consent to, having a lot of that information gathered. We have a lot of work to do to improve awareness of the kind of transparency that we’re looking for to build meaningful datasets and ultimately best serve the patient community.
ACT: Have you heard direct feedback from patient groups expressing these concerns?
Getz: We have. Patients generally are extremely willing to share their data when they understand how it’s going to be used, and when that use has been honestly and comprehensively explained to them. They’re quite eager to share their data, and they’re quite trusting, when that transparency and disclosure is presented completely openly. I think part of the issue is simply building into the process the steps we always need to take to partner fully and adequately with the patient community.
ACT: In coming up with a future vision for research data, what are some logical steps needed to take these new opportunities from concept to reality in advancing the whole trial process?
Getz: The reality is that the use of most technologies is remarkably fragmented and poorly integrated; we use so many disparate systems, and we use them inconsistently. All of these approaches ultimately hurt productivity. They increase inefficiency, they actually increase our cycle times, and there’s just so much redundant activity. For example, the adoption and use of eSource is relatively low. We see so many professionals who have to perform multiple rounds of data entry, moving and transcribing information from one system to the next.
Often, we throw new technologies and new systems that hold great conceptual promise into this stew of operating conditions. As a result, there’s that wide gap between the reality and the conceptual promise. At the DIA forum, it’s our hope that we’ll be able to connect some of these dots. We want to discuss the conceptual promise, but also suggest a reasonable and feasible roadmap to get us there.
The Tufts Center and CenterWatch have gathered a lot of data in this area showing just how siloed, fragmented, and inconsistent the use of various technology solutions is. The individuals involved with executing activities and using these solutions are often quite frustrated.
ACT: You mention eSource. What challenges remain in the adoption of data standards governing the collection and analysis of clinical trial data electronically?
Getz: The Tufts Center, in collaboration with CDISC, did a study about nine months ago, and we showed that the adoption of data interchange standards continued to grow steadily: not only the adoption of some of the most mature and traditional standards, but also the use of some of the newer standards, including those tied to the sharing of data between compatible systems. But what we also saw is that a lack of integration and a lack of interoperability are hindering adoption. So you have a lot of professionals who are really committed to embracing the CDISC standards, but they face a lot of headwinds in the process. Some of those headwinds we want to continue to explore.
Some integration will come when it has regulatory winds behind it, or when there’s such great necessity that it forces siloed functions to interact in a more integrated way. We haven’t yet reached that point.
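As a brief aside for readers less familiar with what an interchange standard buys in practice, below is a minimal sketch that parses a simplified fragment written in the style of CDISC ODM-XML, one of the interchange standards in this space. The study, subject, and item identifiers are invented for illustration, and a real ODM document carries a declared namespace and far more metadata.

```python
import xml.etree.ElementTree as ET

# A simplified, ODM-style fragment. All OIDs and values are made up.
odm_xml = """
<ODM>
  <ClinicalData StudyOID="ST.DEMO" MetaDataVersionOID="MDV.1">
    <SubjectData SubjectKey="P001">
      <StudyEventData StudyEventOID="SE.VISIT1">
        <FormData FormOID="FRM.VITALS">
          <ItemGroupData ItemGroupOID="IG.VS">
            <ItemData ItemOID="IT.SYSBP" Value="128"/>
            <ItemData ItemOID="IT.DIABP" Value="82"/>
          </ItemGroupData>
        </FormData>
      </StudyEventData>
    </SubjectData>
  </ClinicalData>
</ODM>
"""

root = ET.fromstring(odm_xml)

# Walk subject -> item and collect (subject, item OID, value) triples:
# the kind of normalized structure a downstream system can ingest
# without anyone re-keying the data by hand.
records = []
for subject in root.iter("SubjectData"):
    key = subject.get("SubjectKey")
    for item in subject.iter("ItemData"):
        records.append((key, item.get("ItemOID"), item.get("Value")))

print(records)
# [('P001', 'IT.SYSBP', '128'), ('P001', 'IT.DIABP', '82')]
```

Even with a shared syntax like this, two systems can still disagree on identifiers and terminology, which is part of the interoperability headwind Getz describes.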