MCC has 100 different metrics relating to clinical trials, from timeliness and cycle time metrics to quality, efficiency, and cost metrics. This month we explore a metric that is critical to timely site initiation: the percentage of cardiopulmonary equipment that is shipped on time to sites.
Why this metric is important: Once a site is ready to initiate, everything must be in place at the site in order to achieve first patient first visit. The on-time shipping metric helps determine the effectiveness of your processes and suppliers in getting equipment prepared for shipping, shipped, through customs, and to the site. In addition, you can determine if the core lab can provide the required start-up supplies per the timeline. Finally, you have the means to assess the sponsor’s ability to provide the required data (names, addresses, delivery dates, etc.) in a timely manner. Using information gleaned from this metric will assist sponsors and core labs in understanding the caveats in shipping to different regions of the world.
Definition: At MCC, we define On-time Equipment Shipments as the percentage of sites that received their equipment by the agreed-upon receipt date (based on defined expectations between the sponsor and core lab). This metric should be stratified by study and also include an overall average for the sponsor. Note that timelines must be discussed, established proactively, and agreed to by both parties to make this a meaningful metric.
How to calculate this metric: Divide the total number of sites that received equipment by the expected date (tracked by study) by the total number of sites that require equipment, and multiply the result by 100.
Example: 1,000 sites require equipment shipments and 990 received their equipment by the expected date.
Result: 990/1,000 x 100 = 99% of sites received their equipment within expectations.
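The calculation above can be sketched as a small helper function. This is an illustrative sketch only; the function name and signature are our own, not part of any MCC tooling.

```python
def on_time_shipment_pct(received_on_time: int, total_required: int) -> float:
    """Percentage of sites that received equipment by the expected date.

    received_on_time: number of sites whose equipment arrived on time
    total_required:   total number of sites that require equipment
    """
    if total_required == 0:
        raise ValueError("total_required must be positive")
    return received_on_time / total_required * 100

# Worked example from the text: 990 of 1,000 sites on time.
print(on_time_shipment_pct(990, 1000))  # 99.0
```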
What you need in order to measure this: You need the total number of sites requiring equipment shipments, as well as expected and actual delivery dates for each site (this data is needed to calculate whether delivery is “on time”). If possible, you should sort the site data by protocol and country location.
What makes performance on this metric hard to achieve: The shipping metric spans many aspects both inside and outside the control of the core lab (though mostly under its control). Besides the core lab’s capabilities, the shipper and customs entities in each country can impact performance. In addition, the sponsor must provide the core lab with the date the equipment is needed and the correct address for each site in a timely manner.
What you can do to improve performance: Once you have identified performance problems with this metric, you can drill down to determine whether you have a process problem (e.g., correct data is not getting from the sponsor to the core lab with enough lead time for the core lab to be able to initiate the shipment process in a timely manner) or a supplier problem (e.g., core lab doesn’t understand the customs requirements in different countries). This analysis will allow you to implement changes that will eliminate late deliveries and speed site startup and enrollment.
Companion metrics: Other metrics that you should consider in tandem with this metric include: (1) average number of days from study award to contract signature, (2) average number of days from signed technical specifications document to core lab ready to receive 'samples', and (3) percentage of equipment failures as determined by the site.
Example: In the graph below, we have plotted percent on-time equipment shipments by country and by study. As you can see, Country B has a particularly poor on-time percentage. Since all of our studies are performing poorly for Country B (not just one or two), we might conclude that there is a country-specific problem (e.g., clearing through customs takes longer than expected or there is a local transportation problem). Likewise, Study 4 has consistently poor performance, leading us to suspect a problem with either the protocol or the team.
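The stratified analysis described above can be sketched in a few lines of Python. The records, field names, and values here are hypothetical stand-ins for your own shipment data; the point is simply to group on-time flags by country and by study.

```python
from collections import defaultdict

# Hypothetical shipment records -- replace with your own data.
shipments = [
    {"study": "Study 1", "country": "A", "on_time": True},
    {"study": "Study 1", "country": "B", "on_time": False},
    {"study": "Study 2", "country": "A", "on_time": True},
    {"study": "Study 2", "country": "B", "on_time": False},
    {"study": "Study 4", "country": "A", "on_time": False},
    {"study": "Study 4", "country": "B", "on_time": False},
]

def on_time_pct_by(field: str, records: list) -> dict:
    """Percent of on-time shipments grouped by the given field."""
    totals = defaultdict(int)
    on_time = defaultdict(int)
    for rec in records:
        totals[rec[field]] += 1
        on_time[rec[field]] += rec["on_time"]
    return {key: on_time[key] / totals[key] * 100 for key in totals}

by_country = on_time_pct_by("country", shipments)
by_study = on_time_pct_by("study", shipments)
```

A uniformly low percentage across every study for one country (like Country B here) points to a country-level issue such as customs; a low percentage for one study across all countries (like Study 4) points to a protocol- or team-level issue.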
Dave Zuckerman, CEO, Metrics Champion Consortium, [email protected]
Linda Sullivan, COO, Metrics Champion Consortium, [email protected]