Estimates suggest that the biopharmaceutical industry spends more than $150 million per year on data exchange. The industry needs to work smarter, faster, and with fewer resources.
At first glance, ensuring the efficient and effective transfer of clinical trial data between key stakeholders seems a matter solely for the information technology (IT) department. Surely, given the ubiquitous presence of IT in clinical trials, data transfer between systems and companies is already seamless and fully optimized. Or is it? Unfortunately, assuming that your systems can communicate with those run by third parties can prove an expensive mistake, resulting in longer and costlier clinical trials.
Over the last few years, the application of technology has dramatically improved the conduct and management of clinical trials. Clinical data management systems (CDMS), clinical trial management systems (CTMS), electronic data capture (EDC), electronic patient-reported outcomes (ePRO), drug supply management (DSM), and interactive voice response systems (IVRS) help contain costs and shorten clinical development. These systems often support common processes and use overlapping information, yet they are generally used in isolation. As the number of clinical IT systems increases, so does the need to share data between them (Figure 1). By integrating technologies and processes, clinical trial sponsors can eliminate redundant tasks and accelerate the flow of critical information to key stakeholders.
Figure 1. Common data sets shared between clinical systems.
Despite these clear benefits, biopharma companies lack a complete set of standards for data and its transmission. As a result, clinical trials typically use a variety of software and solutions to create, maintain, and share data, and the same information is kept in multiple sources and formats. Organizations such as the Clinical Data Interchange Standards Consortium (CDISC) and Health Level Seven (HL7) are addressing this key issue, although it will probably be some time before a comprehensive set of standards is finalized. In the interim, some sponsors and vendors have developed internal data models to fill the gaps. Against this background, this article underscores the value of integration and interoperability. In particular, it highlights the importance of using flexible, standardized, robust systems that allow universal integration and illustrates how considering integration early can streamline business processes.
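To make standardized transmission concrete, the sketch below shows how a small enrollment message might be wrapped in an ODM-style XML envelope. It is only loosely modeled on the CDISC ODM layout; the element names, attributes, and the build_enrollment_payload helper are illustrative assumptions, not a validated ODM implementation.

```python
# A minimal sketch of a standards-based exchange payload, loosely modeled on
# the CDISC ODM XML layout. Element and attribute names are illustrative
# approximations, not a validated ODM document.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def build_enrollment_payload(study_oid, subject_key, site_oid):
    """Wrap one subject-enrollment record in an ODM-style envelope."""
    odm = ET.Element("ODM", {
        "FileType": "Transactional",
        "CreationDateTime": datetime.now(timezone.utc).isoformat(),
    })
    clinical = ET.SubElement(odm, "ClinicalData", {"StudyOID": study_oid})
    subject = ET.SubElement(clinical, "SubjectData", {"SubjectKey": subject_key})
    ET.SubElement(subject, "SiteRef", {"LocationOID": site_oid})
    return ET.tostring(odm, encoding="unicode")

print(build_enrollment_payload("ST-001", "SUBJ-0042", "SITE-017"))
```

Because every party parses the same envelope, a message like this can move between sponsor, CRO, and vendor systems without being restructured at each hop.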
Before going further, we need to define three key terms. Integration refers to connecting separate systems or applications so that data entered in one flows to the others without manual re-entry. Interoperability is the ability of systems from different vendors to exchange information and to use the information that has been exchanged. Standards are the agreed data structures, vocabularies, and transfer formats that make integration and interoperability practical across organizations.
To illustrate one of the problems that can arise from poor integration, consider a project manager with access to online patient enrollment reports from the IVRS. Since investigators use the IVRS to randomize patients, its enrollment figures are always up to date. At the corporate level, however, the company tracks enrollment in a CTMS to maintain high-level, critical information for the entire clinical program. The clinical team is responsible for keeping the CTMS current, but the project manager's team has many other responsibilities. Since they already have the IVRS enrollment data, there is little incentive to do "extra" data entry, so the CTMS falls out of sync with the real-time IVRS figures. Moreover, while entering or reconciling information from many reports, the team may make data entry errors and create discrepancies between systems.
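A scheduled comparison between the two systems is one way to catch this drift. The sketch below is a minimal, hypothetical example: it assumes each system can export per-site enrollment counts as simple mappings and merely computes the CTMS updates needed to match the IVRS.

```python
# A hypothetical sketch of the check a scheduled sync job could run, assuming
# both systems can export per-site enrollment counts as {site_id: count}.
def sync_enrollment(ivrs_counts, ctms_counts):
    """Return the CTMS updates needed to match IVRS per-site enrollment."""
    updates = {}
    for site_id, count in ivrs_counts.items():
        if ctms_counts.get(site_id) != count:
            updates[site_id] = count
    return updates

# Example: site 102 is out of sync because the team has not rekeyed it yet.
ivrs = {"site-101": 12, "site-102": 9}
ctms = {"site-101": 12, "site-102": 7}
print(sync_enrollment(ivrs, ctms))  # {'site-102': 9}
```

Even this trivial check replaces a round of manual report comparison, and logging its output creates an audit trail of every correction.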
But what does better integration mean to the pharma industry? Clinical trials are becoming increasingly complex, and this complexity contributes to the rapidly rising cost of clinical development programs. Between 1995 and 2000, for example, the cost of researching, developing, and launching a drug rose by nearly 50%, while over the same period the cumulative success rate fell from 14% to 8% (2). Currently, clinical studies account for more than half of the total cost of bringing a new drug to market (U.S. $467 million of $802 million; 2000 prices) (1).
This complexity also means that data is created and shared among many different parties. A typical trial could include the sponsor, separate CROs for Europe and the United States, a data safety monitoring board, a central laboratory, and vendors of ECG, EDC, and IVR systems for randomization, drug supply management, and ePRO data collection. All of these parties need site information to begin the trial. Typically, the sponsor holds site information in a CTMS, but sharing it means providing the data in a different format to each party.
Each of these study participants then needs to import or, in the worst case, rekey this information into its own systems. Additionally, the originator of the data must remember to notify every required party, using each party's preferred method, of any additions or changes. Failure to do so could disrupt the supply of study medication, delay patient enrollment, or cause other problems for the study.
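One common mitigation is to keep a single canonical site record and generate each party's feed from it. The following sketch assumes, purely for illustration, that the central lab takes CSV and the EDC vendor takes XML; the field names and file names are invented.

```python
# A sketch of exporting one canonical site record to the differing formats
# each party expects. The assumption that the central lab takes CSV and the
# EDC vendor takes XML, and all field names, are invented for illustration.
import csv
import io
import xml.etree.ElementTree as ET

SITE = {"site_id": "017", "name": "Northbrook Clinic",
        "investigator": "Dr. A. Smith", "country": "US"}

def to_csv(record):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=record.keys())
    writer.writeheader()
    writer.writerow(record)
    return buf.getvalue()

def to_xml(record):
    root = ET.Element("Site")
    for key, value in record.items():
        ET.SubElement(root, key).text = value
    return ET.tostring(root, encoding="unicode")

# Change SITE once and regenerate every feed, so no party is left with
# stale information after an update.
feeds = {"central_lab.csv": to_csv(SITE), "edc_vendor.xml": to_xml(SITE)}
```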
Once all the sites are ready, patients are enrolled and randomized. In this example, investigators use EDC for screening and IVR for randomization. The patient is screened at the site and the eCRF is completed. The monitor then updates the CTMS at the CRO, typically by keying in the information, and the CRO must send it to the sponsor to be loaded or rekeyed into the sponsor's CTMS. The investigator then enters screening, demography, and other information into the IVR system to randomize the patient; later, the randomization and dose information is rekeyed into the eCRF. Meanwhile, the investigator performs many other procedures that must be tracked by the CRO and sponsor to ensure accurate and timely payments to the site.
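In principle, the randomization step could write straight into the eCRF rather than being rekeyed at the site. The sketch below illustrates the idea under invented interfaces: the RandomizationResult shape and the edc.write_ecrf_field call stand in for whatever the real IVR and EDC vendors expose.

```python
# A hypothetical sketch of propagating an IVR randomization result directly
# into the EDC system's eCRF instead of rekeying it at the site. The
# RandomizationResult shape and the edc.write_ecrf_field call are invented
# stand-ins, not actual vendor interfaces.
from dataclasses import dataclass

@dataclass
class RandomizationResult:
    subject_key: str
    arm: str
    dose: str

def push_to_ecrf(edc, result: RandomizationResult):
    """Write randomization outputs to the subject's eCRF fields."""
    edc.write_ecrf_field(result.subject_key, "TREATMENT_ARM", result.arm)
    edc.write_ecrf_field(result.subject_key, "ASSIGNED_DOSE", result.dose)
```

With an interface of this kind, the dose assignment exists in exactly one system of record, and every downstream copy is machine-written rather than retyped.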
Again, the monitor must track this information in the CRO's CTMS, typically by keying it in, and the CRO must then send it to the sponsor to be loaded or rekeyed into the sponsor's CTMS. Such rekeying often leads to data reconciliation errors, which must be queried or manually resolved. This lack of integrated data exchange and processes generally results in frustration at study sites, not to mention additional time, resources, and costs. The ultimate goal is to enter data in the most appropriate place and have it propagated or shared with every party that needs it, without manual re-entry. To accomplish this, standardized data structures and transfer methods are critical.
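The reconciliation burden itself can at least be automated. The sketch below compares the same record as held by the CRO and the sponsor and raises a query for each mismatched field; the record layout is hypothetical.

```python
# A sketch of the reconciliation step that rekeying makes necessary: compare
# the same patient record as held by the CRO and the sponsor and flag every
# mismatched field. The record layout is hypothetical.
def reconcile(cro_record, sponsor_record):
    """Yield (field, cro_value, sponsor_value) for every discrepancy."""
    for field in cro_record.keys() | sponsor_record.keys():
        a, b = cro_record.get(field), sponsor_record.get(field)
        if a != b:
            yield (field, a, b)

cro = {"subject": "0042", "screen_date": "2004-03-01", "dose": "10mg"}
sponsor = {"subject": "0042", "screen_date": "2004-03-10", "dose": "10mg"}
for query in reconcile(cro, sponsor):
    print("query:", query)  # e.g. ('screen_date', '2004-03-01', '2004-03-10')
```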
A conservative estimate suggests that the biopharma industry spends more than $150 million per year on data exchange. These costs include the manual work required to import and export data between applications, as well as the associated data reconciliation. Such manual manipulation of data is slow and inefficient, and it often leads to delays in regulatory submissions, which are extremely costly in terms of lost sales opportunities. The biopharmaceutical industry needs to work smarter to do more, do it faster, and do it with fewer resources.
Analyzing processes and defining an optimal workflow is the key to working smarter. Some data workflows can be designed once at the enterprise level, such as CTMS or drug supply management integrations. Others are designed on a trial-by-trial basis, such as EDC or ePRO integrations. In the trial described above, the processes encompass both enterprise- and trial-level integrations. The key goal is a process that reduces workload, cuts data error rates, and saves time and money overall. Table 1 summarizes the work and groups involved for three steps in a clinical process.

Table 1. Work and groups involved in a typical clinical study.
With an efficient data workflow, all parties stay up to date and receive correct information throughout the trial. An optimized workflow also reduces the workload and frustration felt by study participants by cutting queries and duplicate data entry. Today, the whole picture may not be available, but an increasing number of pieces are falling into place.
As the above discussion suggests, successful integration relies on flexible, standards-based technology. Unfortunately, not all system providers and vendors offer this approach, and relatively few systems can deliver real-time data exchange across multiple platforms and databases. Some providers offer solutions that are integrated with their own in-house systems but that will not necessarily work with their clients' systems or those of third parties. Given the problems that can arise from poor integration, executives need to consider which potential partner is most likely to offer seamless integration. Some companies have developed proprietary technology based on today's standards to meet this need.
Nevertheless, there is more to seamless integration than IT alone. Experience is critical. To derive the most benefit, clients and vendors need to consider data and process integration while the study is being planned. The client needs to be open with the solution provider, allowing the latter's staff to communicate with the internal IT staff. These discussions can become rather technical, and it is likely that a suitable outsource partner will offer a team specializing in data integration that is able to apply best practices. Further, it is essential that the sponsor and all outsource providers meet to plan the details of any integration project, including both the systems and data that will be involved and the users that will be affected. Such meetings need to occur early to allow everyone to understand the issues.
These are predominantly short-term, "tactical" issues. However, successful integration depends equally on a long-term "strategic" approach that understands and characterizes the sponsor company's business processes and workflow. Over the years, experience has shown that processes and workflow are sometimes unclear even to clients themselves, whether because systems have evolved over a long period or because of changes imposed by mergers and acquisitions. Further, emerging technologies are often adopted and implemented without standards. It is important to analyze existing processes to identify opportunities to streamline internal procedures. For example, a company may recognize that it holds the same data in several disparate places and repeatedly rekeys the same items; this realization should lead it to design a better workflow that reduces the redundant effort.
Services and processes can, of course, always be improved, and integrated clinical trials are no exception. Organizations like CDISC and HL7 will continue to help define standards for the electronic acquisition, exchange, submission, and archiving of clinical trial data. Furthermore, over the next five to 15 years, eCRFs and other data sources will become fully integrated with electronic patient records, making it much easier for clinicians and patients to participate in clinical studies. In the meantime, by ensuring that a company offers flexible, robust systems that allow universal integration, along with the expertise to analyze workflow and processes, executives can help contain rising costs and ensure that clinical studies finish on schedule and on budget.
1. J.A. DiMasi, R.W. Hansen, H.G. Grabowski, "The Price of Innovation: New Estimates of Drug Development Costs," J Health Econ, 22, 151-185 (2003).
2. A. Singh, J. Gilbert, P. Henske, "Rebuilding Big Pharma's Business Model" (A#2003800191), IN VIVO, 73 (November 2003).
3. D.B. Stein, M. MacDonald, B. Byrom, "Integrating eClinical Systems," European Pharmaceutical Contractor (Summer 2004).
Marie MacDonald, BSc, is director, strategic product development, ClinPhone, Inc., 650 Dundee Road, Northbrook, IL 60062 (847) 714-9709, fax (847) 714-9753, email: mmacdon@clinphone.com.