eClinical Technology Issues: Associated with Poor Operational Design & Relationships


Applied Clinical Trials


At eXl pharma’s CROWN congress, there were interesting discussions on how to measure site performance. In particular, Dr. Ross Weaver, CEO of Clinical SCORE (an analytical benchmarking solution for clinical trials), emphasized that study sites that experienced issues with eClinical software were also associated with clinical trials that yielded unsuccessful enrollment outcomes.

With the rise of technology use and eClinical solution providers in the clinical trials industry, this finding seems counterintuitive, especially since eClinical technologies are designed to boost operational performance. It led us to research other factors that shape perceptions of eClinical software issues. The data analysis in this article delves into Clinical SCORE’s normative database to generate predictive models and to qualitatively reason through the findings.
 

Article and Data Analysis in a Nutshell

To uncover these findings, we leveraged advanced computer modeling, and some of the figures may seem a bit complex. However, the brief video below should make interpretation easier, and a detailed breakdown is also available as a PDF.

About the Normative Database

Clinical SCORE has conducted surveys with unique study coordinators (SCs) and principal investigators (PIs) across 403 unique clinical trials, 25 countries, and 28 medical specialties to generate a normative database of clinical trial operational benchmarks. This data allows for predictive model generation, which can be used during study design and execution. Below is an example of how we used this data to generate a predictive model of the factors impacting software issues at study sites.
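As a rough illustration of the kind of predictive model such a database enables, the sketch below fits a simple logistic regression that predicts whether a site reports a high level of software issues from its setup/training and CRA-relationship scores. The column names, the 1–7 scale, the simulated responses, and the modeling choice are all illustrative assumptions, not Clinical SCORE’s actual schema or method.

```python
# Illustrative sketch only: predicting site-reported software issue levels
# from setup/training and CRA relationship scores. All data are simulated;
# column names and the 1-7 scale are assumptions for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400  # roughly the size of the normative sample cited in the article

df = pd.DataFrame({
    "setup_training_issues": rng.integers(1, 8, n),    # 1 = none, 7 = severe
    "cra_relationship_issues": rng.integers(1, 8, n),  # 1 = none, 7 = severe
})
# Simulated outcome that loosely mirrors the article's finding: more
# setup/training and CRA issues -> higher odds of reporting software issues.
logit = -4 + 0.5 * df["setup_training_issues"] + 0.3 * df["cra_relationship_issues"]
df["software_issues_high"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["setup_training_issues", "cra_relationship_issues"]],
    df["software_issues_high"], test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("coefficients:", dict(zip(X_train.columns, model.coef_[0].round(2))))
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```

In practice, a model like this would be fit on the actual survey responses rather than simulated values, and the fitted coefficients could flag which operational factors most strongly predict software issues at a new site.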
 

Clinical Software Issues: More Prevalent with Coordinators than Investigators

Figure 1 presents histograms suggesting that investigators have fewer issues with software than coordinators do. The investigator histogram is distributed more heavily toward low/no issues with eClinical software (average = 4.15), whereas the coordinator histogram is more widely distributed, with a higher average level (4.38) of reported issues with eClinical software utilization.

Figure 1: Software Issue Levels: Investigators & Coordinators (P<0.05)
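For readers who want to reproduce this kind of group comparison on their own survey data, the sketch below contrasts two samples of ordinal scores with a nonparametric test. The responses are simulated; only the group averages (4.15 and 4.38) and the assumption that higher scores mean more reported issues are taken from the discussion above.

```python
# Sketch of the comparison behind Figure 1: investigator vs. coordinator
# software-issue scores, compared with a Mann-Whitney U test. Responses are
# simulated on an assumed 1-7 scale (7 = most issues).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
investigators = np.clip(rng.normal(4.15, 1.2, 200).round(), 1, 7)
coordinators = np.clip(rng.normal(4.38, 1.5, 200).round(), 1, 7)

u, p = stats.mannwhitneyu(investigators, coordinators, alternative="two-sided")
print(f"means: investigators {investigators.mean():.2f}, coordinators {coordinators.mean():.2f}")
print(f"Mann-Whitney U = {u:.0f}, p = {p:.3f}")
```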

It is possible that coordinators experience more software-related issues because they operate more software systems during clinical operations than investigators do. Josh Schoppe, Senior Outreach Coordinator at Sidney Kimmel Cancer Network, Thomas Jefferson University, explains, “the complexity and variability of Sponsor software systems continues to increase. Site research staff are forced to spend increasingly more time orienting themselves to various software systems with minimal support from the Sponsor or its designee.” Schoppe added, “the majority of Site Investigators tend to be hands off with the collection and submission of electronic data and rely heavily on study support staff. Thus, the burden on study coordinators is increased and it often falls on them to resolve all software related issues on behalf of the investigator and the site.”

 

 

Proper Setup and Training: Critical Impact on Performance

Figure 2 shows a heat map that identifies trends associated with software issues. The high concentration of respondents in the upper left-hand corner of the heat map suggests that those who had no issues with setup/training also did not experience issues with software. The faint blue circles on the map trace a trend suggesting that more setup/training issues are associated with more software issues.

Figure 2: Impact of Setup/Training on Software Issues (N= 403, P<0.001)
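A heat map like Figure 2 can be approximated by cross-tabulating two ordinal survey items and checking their rank correlation. The sketch below uses simulated responses on an assumed 1–7 scale; the variable names are illustrative, not Clinical SCORE’s.

```python
# Sketch: cross-tabulate setup/training issues against software issues and
# render the counts as a heat map, plus a Spearman rank correlation.
# All responses are simulated on an assumed 1-7 scale.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(2)
setup_issues = rng.integers(1, 8, 403)
software_issues = np.clip(setup_issues + rng.integers(-2, 3, 403), 1, 7)

rho, p = stats.spearmanr(setup_issues, software_issues)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")

counts = pd.crosstab(pd.Series(software_issues, name="software issues"),
                     pd.Series(setup_issues, name="setup/training issues"))
counts = counts.reindex(index=range(1, 8), columns=range(1, 8), fill_value=0)

plt.imshow(counts, origin="lower", cmap="Blues", extent=[0.5, 7.5, 0.5, 7.5])
plt.xlabel("setup/training issues (1 = none, 7 = severe)")
plt.ylabel("software issues (1 = none, 7 = severe)")
plt.colorbar(label="respondents")
plt.show()
```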

According to study sites, improper training leads to software-related issues down the line. “Sponsors tend to require lengthy automated training for their respective software platforms without technical support. This leads to poor initial understanding of the system and creates an attitude of disengagement,” indicated Schoppe.
 

Software Issues are Associated with Poor Relationships

When evaluating the effect of software issue levels on study staff relationships, Figures 3 and 4 suggest that software issues may originate from poor relationships with Sponsors and CROs. The heat map in Figure 3 indicates a statistical relationship between sites’ perceptions of issues with CRAs and software issues. Trend analysis shows that more issues with CRAs are closely linked with more software issues, although the effect is not strong: most of the ‘heat’ concentrates in the middle of the Y-axis (indicating a medium level of software issues) and does not climb further as CRA issues increase. Trends in Figure 4 also show that study sites are less likely to work with sponsors in future studies if they experience software-related issues.

Figure 3: Impact of CRA relationship on Software Issues (N= 403, P<0.01)

Figure 4: Impact of Software Issues on Likelihood of Working with Sponsor Again (N= 403, P<0.05)
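To quantify which of these associations is stronger on a given dataset, one option is to compare Spearman rank correlations, as in the hedged sketch below. All responses are simulated on an assumed 1–7 scale, and the item names are illustrative.

```python
# Sketch in the spirit of Figures 3 and 4: compare the strength of two
# associations involving software issues using Spearman rank correlations.
# All responses are simulated on an assumed 1-7 scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 403
software_issues = rng.integers(1, 8, n)
cra_issues = np.clip(software_issues + rng.integers(-2, 3, n), 1, 7)
work_with_sponsor_again = np.clip(8 - software_issues + rng.integers(-3, 4, n), 1, 7)

for name, item in [("issues with CRAs", cra_issues),
                   ("willingness to work with the sponsor again", work_with_sponsor_again)]:
    rho, p = stats.spearmanr(software_issues, item)
    print(f"software issues vs. {name}: rho = {rho:+.2f}, p = {p:.4f}")
```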

Are Sites Willing to Forgive?

We dug a little deeper to investigate further associations and the impact of software issues through a multi-dimensional bubble chart and heat map (Figure 5). The findings indicate that software issues are more strongly associated with CRA issues than with willingness to work with sponsors on future studies.

Figure 5: Impact of Software Issues on Issues with CRAs and Site Willingness to Work with Sponsors

Figure 5 demonstrates that, moving from left to right on the chart, the level of software issues increases (represented by darker bubbles) as issues with CRAs reach the 6th level of the scale on the X-axis (illustrated by the faint red arrow on the chart). Moreover, past the 6th level on the X-axis, the concentration of bubbles shifts downward on the Y-axis, suggesting that sites experiencing CRA issues are also less likely to work with Sponsors in future studies.

What is particularly interesting about this finding is that most of the heat/darkness associated with software issues sits toward the middle of the Y-axis (outlined by faint red circles), which may suggest that some study sites have a degree of tolerance and continue to work with sponsors despite having issues with CRAs and eClinical software.
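A multi-dimensional bubble chart in the spirit of Figure 5 can be sketched as follows: the X-axis carries issues with CRAs, the Y-axis carries willingness to work with the sponsor again, bubble size reflects how many respondents fall at each combination, and bubble darkness reflects their average software issue level. The data, scale, and column names below are simulated assumptions, not Clinical SCORE’s.

```python
# Sketch of a Figure 5-style bubble chart: bubble size = respondent count at
# each (CRA issues, willingness) combination, bubble color = average software
# issue level. All responses are simulated on an assumed 1-7 scale.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
n = 403
df = pd.DataFrame({"cra_issues": rng.integers(1, 8, n)})
df["software_issues"] = np.clip(df["cra_issues"] + rng.integers(-2, 3, n), 1, 7)
df["work_again"] = np.clip(8 - df["cra_issues"] + rng.integers(-2, 3, n), 1, 7)

cells = (df.groupby(["cra_issues", "work_again"])
           .agg(count=("software_issues", "size"),
                avg_software_issues=("software_issues", "mean"))
           .reset_index())

plt.scatter(cells["cra_issues"], cells["work_again"],
            s=cells["count"] * 20, c=cells["avg_software_issues"],
            cmap="Reds", edgecolors="gray")
plt.xlabel("issues with CRAs (1 = none, 7 = severe)")
plt.ylabel("willingness to work with sponsor again (1 = low, 7 = high)")
plt.colorbar(label="average software issue level")
plt.show()
```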

“The impact of software related issues can be successfully mitigated by monitors who build positive relationships with the site research staff. These positive relationships are built primarily on the monitors’ ability to quickly address study software issues which in turn prevent future occurrences,” said Schoppe. Schoppe added, “the operational success of a clinical trial is far more likely when data expectations are set and avenues for resolution are clearly defined from activation of the trial. Site staff will tolerate some level of difficulty during the length of the trial, but, this does have an inverse effect on the quality and timeliness of the data submitted.”
 

How Can Sponsors and CROs Mitigate Risk with this Predictive Model?

In this analysis, we’ve identified that the perception of eClinical software issues is associated with issues in Sponsor personnel relationships (i.e., CRAs) and with operational design (i.e., training, setup, and ongoing eClinical software support). Correspondingly, poor operational performance may negatively impact whether a study site wants to work with a Sponsor in future trials.

With the rise in eClinical technology solutions comes the need to invest in proper training, software support systems, and strong relationships between sponsor/CRO staff (i.e., CRAs) and study site staff; that starts with good operational design tailored to individual sites’ needs.

Aside from ensuring that sufficient resources go into operationalizing new technology implementations, it is important for sponsors to continually assess their clinical trials by surveying study sites and leveraging established benchmarks to evaluate Sponsor performance. Doing so can enhance operational performance through optimized, data-driven decision making.
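As a minimal sketch of what leveraging established benchmarks could look like in practice, the example below places one study’s average software-issue score within a simulated normative distribution using a percentile rank. The scores, scale, and approach are assumptions for illustration, not Clinical SCORE’s methodology.

```python
# Sketch: benchmark one study's average software-issue score against a
# simulated normative distribution. The normative scores, the 1-7 scale, and
# the percentile-rank approach are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
normative_scores = rng.normal(4.3, 1.1, 403)  # simulated normative database
this_study_score = 3.4                        # hypothetical score for one study

pct = stats.percentileofscore(normative_scores, this_study_score)
print(f"Percentile rank within the normative database: {pct:.0f} "
      "(lower = fewer reported software issues than most benchmarked trials)")
```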
 
