Applied Clinical Trials
Recent initiatives demonstrate that European health authorities are serious about confronting the proliferating threats, and harnessing the opportunities, that big data presents.
In today’s era of news sharing, you can’t be too careful about what you listen to and what you believe. And since data sources are ceaselessly multiplying, the challenge of sorting through the growing flood of information, misinformation, and disinformation becomes all the greater. Amid all of that, how are drug developers to keep their heads above water?
This is the challenge that European health authorities are now beginning to take more seriously, as demonstrated by a flurry of recent initiatives, most notably a mid-February report from the big guns of the regulators. The European Medicines Agency (EMA) and the heads of more than a dozen of Europe’s national agencies have outlined their views on how to respond to the proliferation of threats and opportunities from big data.
Just as a taster, the initial recommendations relating specifically to clinical trials range from the obvious to the ambitious. “Data standardization activities are critical to increase data interoperability and facilitate data sharing,” concludes the study, unremarkably. So it is necessary to agree on data formats and standards for regulatory submissions of raw patient data, through strong support for the use of global data standards and through alignment with other regulatory bodies.
A big upgrade to the use of individual patient-level data in regulatory processes is envisioned. Direct access should be routine during review of marketing authorization applications, and authorities should build greater capacity and skills for analyzing this material, along with better imaging expertise, says the report. And pilot studies should be set up to define innovative outcomes from imaging data and to determine the validity of computer-aided evaluation of images.
Still more radically, the authorities recommend systematic sharing of clinical trial data submitted for regulatory assessment. They want to see “a data-sharing culture” in Europe, in which there is full recognition of the value of clinical data sharing for drug development. “The vast majority of clinical trials are never submitted as part of a regulatory submission,” the report points out. So here too, it urges pilot studies to show the value of sharing clinical data that has regulatory relevance, such as identification of safety signals, product class comparisons, and indirect comparisons of closely related medicinal products.
The report is just the first output from the joint taskforce set up by European regulators, based on the conviction that “a regulatory strategy is required to determine when and how in the product lifecycle evidence derived from such data may be acceptable for regulatory decision-making.” It offers a definition of big data, and reviews the data landscape in genomics, bioanalytical ‘omics (with a focus on proteomics), clinical trials, observational data, spontaneous adverse drug reaction (ADR) data, and social media and mHealth data. It recognizes that data may reach regulatory authorities as supportive material alongside more traditional analyzed structured data, or may underpin a regulatory submission as a whole. “It is thus essential that the regulatory network understands its presence and the robustness by which it was generated in order to make a competent evaluation of the submission as a whole.”
Present limitations
The current deficiencies in Europe’s capacities are catalogued. A survey of national competent authorities revealed “very limited expertise in big data analytics at national level”; that “eight of 24 reported no in-house expertise in biostatistics”; and that “maintaining sufficient expertise within the regulatory network will be an increasing challenge.” Concerns highlighted by industry in exploiting big data sets included data access, data integration, data validation, and data reproducibility, as well as data security and data protection.
The needs are as great beyond Europe as within Europe, the report notes. “From a regulatory perspective, global cooperation is important, as for many rare diseases and cancers or indeed rare ADRs, there may only be a handful of cases worldwide and these data need to be interoperable to derive meaningful insights.” But the particular challenges of operating across broader geographies are acknowledged, too. The report goes on to warn that “on a global level, it is important to ensure that extremely expensive and time-consuming initiatives do not pull in opposite directions but work together to achieve sustainable and global solutions.”
Similarly, there are “multiple barriers” to sharing data in sufficient depth and detail to retain its utility, not least with increasingly complex data from multiple sources. Europe has a poor record in this respect: “Europe has failed to define a clear path to enable sustainability of many previous data-sharing efforts, particularly for observational healthcare data, and defining this should be a priority in the future,” the report states. In addition, it insists, meeting data protection obligations on a global scale requires urgent development of global guiding principles and standards for data anonymization.
On top of that, there are more self-interested barriers: “Data sharing is additionally hindered by a reluctance to share data in order to promote individual career ambitions or protect potentially commercially valuable information.” Overcoming these, the report suggests, will demand some new mandatory elements to drive sharing.
The report takes the form of a summary of the taskforce’s reflections so far, published in order to gather comments and responses from stakeholders (you have until April 28 to take a look at the document and make your own views known). Meanwhile, a further group looking at cross-cutting data processing and analytics is due to deliver a subsequent report in the first half of 2019. Early indications are that it will recommend the formation of a standing advisory group to explore the applicability of big data analytic methodologies, standards, and IT architecture, so as to support the development, scientific evaluation, supervision, and monitoring of medicines. Validation of novel analytical approaches, and of the clinical relevance of the derived endpoints, will play a key part in defining their acceptability, especially for algorithms, the European group is expected to urge.
Peter O'Donnell is a freelance journalist who specializes in European health affairs and is based in Brussels, Belgium.