What it takes to get all stakeholders on the same page and keep project timelines on track.
Given the years of experience and thousands of clinical studies that have been performed using Electronic Data Capture (EDC), one could be forgiven for thinking that EDC processes are as firmly established as other data management activities. Surely EDC has become as routine as other clinical processes like coding or protocol development, hasn't it?
As any project manager who has worked on an EDC study will attest, nothing could be further from the truth. Given the variety of combinations of sponsors, CROs, and EDC vendors, plus the inherent communication challenges, many companies find themselves reinventing the wheel with each EDC implementation.
Many have tried to alleviate this problem by working with the same CROs and EDC vendors repeatedly. But even this approach does not ensure that the current project is going to be handled the same way as previous ones.
Many EDC platforms are updated more than once a year. Some of these changes are relatively minor, while others are major platform changes such as adding a clinical trial management system or improving development tools.
Moreover, sponsors, CROs, and EDC vendors are constantly updating their internal processes, whether to improve productivity or in response to recent audit findings.
Last year's successful partnerships do not guarantee that this year's partnerships will also succeed. New software versions, new project teams, and new procedures can all put this year's projects at risk of failing to meet study objectives. As EDC systems continue to evolve, the management of these tools continues to mature as well.
This article will focus on issues related to defining deliverables and identifying responsibilities. It will also highlight some areas where confusion is likely, including planning documentation, roles and responsibilities for testing, change control communication, and evolving processes.
Finally, the article will describe measures to mitigate the risks posed by these issues. It assumes that the project will include both a CRO and a separate EDC vendor. Obviously there are other possible structures, but many of the same issues will be found in sponsor/CRO and sponsor/EDC vendor relationships.
When there are multiple team members relying on each other for critical path deliverables, it is essential that there is a common understanding of the deliverables and the underlying details.
Too often and too late it becomes apparent that there is a discrepancy between what different team members are expecting. Requirements Specifications are a good example of where stakeholders frequently get confused.
Nontechnical staff may expect Requirements Specifications to be high-level documents that lack technical detail; in technical parlance, these are often referred to as User Requirements Specifications. Conversely, developers might expect Technical Requirements Specifications, which are significantly more detailed and often contain technical information that would be foreign to most nontechnical staff. For example, a user requirement might simply state that out-of-range lab values should generate a query, while the corresponding technical specification would enumerate the affected fields, their data types, and the exact edit check logic.
When a deliverable is simply described as Requirements Specifications, it can mean two very different things to staff. This may result in key tasks taking longer than expected, as people reconcile their expectations about what the document should contain and who should produce key pieces.
One example of poorly defined deliverables involved a CRO, an EDC vendor, and a sponsor working together on a large patient registry that has long since concluded.
The CRO was responsible for approving the design specifications. The CRO's data management lead was expecting the EDC vendor to translate 10 pages of requirements into roughly 100 pages of very detailed information that would be considered the final design specifications. This was consistent with the CRO's experience on other EDC studies with other vendors. The project timeline allowed two days to review and approve these specifications.
A few weeks after the CRO forwarded the requirements documentation, the EDC vendor's technical staff returned more than 500 pages of technical documents that constituted the design specifications. The vendor had assumed that the CRO data management lead would need the complete technical documentation, which described every detail of the configuration in the form that database administrators and programmers require, rather than something comprehensible to a knowledgeable lay reader.
Conversely, the CRO data management lead was expecting a document that described the system in detail for a lay audience. This miscommunication about the specifics of this intermediate deliverable jeopardized timelines for the entire project.
This sort of confusion could have been easily avoided had the appropriate discussion surrounding expectations of the deliverable, the review process, and timelines been initiated earlier in the study.
Very often study team members assume that their colleagues and partners will use processes and documents similar to their own. Moreover, team members often believe that individuals in other organizations define terms identically. Confusion is inevitable when these differing expectations surface as deliverables are exchanged.
It is crucial that these differing expectations are identified and reconciled in the planning stages even when partners have worked together before. Clear and concise communication is critical in the planning stages in order to deliver a successful implementation.
Defining roles and responsibilities seems like one of the most basic steps to managing any project. The contract and statement of work define deliverables, timelines, and milestones over the life of the project. While roles and responsibilities are defined at a high level, the details are often unclear.
Many tasks involve multiple parties (see Figure 1), and the expectations of each may not be clear. EDC system testing is a particularly good example. Each EDC vendor has its own unique tools and processes; this variety results in different expectations regarding testing. These differences in testing expectations can adversely affect projects.
Figure 1. Many EDC project tasks involve multiple parties.
On numerous projects the sponsor, CRO, and EDC vendor have had differing expectations about what User Acceptance Testing (UAT) involves. The sponsor or CRO may expect testing to be a straightforward exercise that focuses on ensuring that navigation and question wording are appropriate. Conversely, the EDC vendor may anticipate a more detailed approach that includes verifying that edit checks work with a variety of inputs. Often these issues do not become apparent until the start of testing, leaving little time to muster the necessary testing resources.
Frequently these misunderstandings can be worked out by adding testers; on other occasions, however, timelines must be moved and overall project expectations adjusted. This sort of problem can easily be addressed early in the process by agreeing on the various testing phases and clearly defining the expectations for each phase (see sidebar).
Mini Dictionary of Major Terms
In practice, definitions and expectations vary among the vendor, CRO, and sponsor. Project success depends on establishing clear expectations for each testing scenario.
For example, does the EDC vendor expect that deviations from the design specifications (e.g., errors in edit checks or user roles) will be found at UAT? If so, the tester must develop and execute a detailed test plan. Or does the EDC vendor expect that all errors will be corrected before UAT? If so, the time needed for UAT should be significantly shorter than if extensive testing were required.
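To make this concrete, consider a hypothetical edit check on systolic blood pressure with an allowable range of 60 to 200 mmHg. A detailed test plan would exercise that one check with several classes of input: a value within range (e.g., 120), for which no query should fire; a value just outside the range (e.g., 210), which should trigger the out-of-range query; a missing value, which should trigger any required-field check; and a non-numeric entry, which should be rejected or queried. Multiplied across hundreds of edit checks, this is a very different effort from a quick review of screen navigation and question wording.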
Conversely, sponsors need to be aware of what changes are in scope at each phase of the development and testing process. Typically, some data elements or edit checks may change at Beta Testing, while only text changes may be acceptable at UAT. Developing the application itself is relatively simple, regardless of the processes in place; the harder task is ensuring that the team members who define system requirements and perform testing clearly understand what is expected of them and what authority they have to request changes.
It is essential to communicate the details of a complicated process clearly to all stakeholders and to manage expectations closely throughout the development phase.
Tripartite communication can be particularly challenging on any project. This problem can be exacerbated when one of the key deliverables is a software system and the parties involved differ greatly in their technical expertise.
One example of a challenging communication process was a study where the sponsor managed the EDC vendor, while the CRO was tasked with analyzing data. Frequent changes were requested to the eCRFs. The sponsor assumed that the EDC vendor was communicating system changes to the CRO. The EDC vendor assumed that the sponsor was notifying the CRO directly. When the CRO's statisticians needed to analyze the data and report results in a very short time frame, they found that many of their programs malfunctioned as a result of new or missing fields.
The solution to this problem is straightforward but requires significant discipline to sustain. Many of the other solutions mentioned in this article require one-time actions and are therefore easy to implement. Establishing and adhering to appropriate communication channels is more challenging: it requires that new team members be made aware of the communication policy, that existing staff habitually adhere to it, and that all parties police their own activities and those of the others.
One promising approach is for all parties to treat the communication plan as a controlled document that is auditable by the various QA departments involved. Once the appropriate communication plan is established, each party adopts it as a study-specific procedure that augments its existing change control policies.
For example, if the sponsor, CRO, and EDC vendor each adopt a study-specific procedure dictating that all change requests come from the CRO only after written approval from the sponsor, the result is a change control process that keeps all relevant parties aware of impending changes to the data structure.
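In practice, such a study-specific procedure might spell out a sequence along these lines (a hypothetical sketch; the titles and steps will vary by organization):
1. The sponsor documents the requested eCRF change and provides written approval.
2. The CRO logs the request, assesses its impact on data management and statistical programming, and forwards it to the EDC vendor.
3. The EDC vendor implements and tests the change and confirms completion to the CRO.
4. The CRO verifies the change, updates any affected documentation and programs, and notifies the sponsor that the change is live.
The specific path matters less than ensuring that every party knows it, follows it, and can demonstrate to an auditor that it was followed.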
One common mistake when initiating EDC projects is to assume that you have worked with the vendor, sponsor or CRO enough times to know their processes. Strategic partnerships are designed to avoid the challenges and miscommunications that result from working with different vendors on different studies. However, even with the most established partnerships, new technologies, new staff, new business needs, and new audit findings may lead to new processes.
It is important to allow for the possibility that the study you are about to begin will be affected by a change at one of the parties involved.
Consider, as an example of an evolving process, a CRO that begins work with an EDC vendor on two studies starting only a few months apart. In the months between the start of the first study and the start of the second, the requirements documents change significantly. If the issue is not identified very early in EDC development for the second study, it can present a significant resourcing challenge to the CRO's data management group. A shift of this kind affects many deliverables, not just the EDC system, and when senior staff are spread across multiple projects, it also has a tremendous impact on the sponsor's (or CRO's) ability to meet the needs of those other projects.
It is vital to ensure that the processes for a given project are well-defined at the start of a study, even if the same processes and the same staff were involved in a nearly identical project. Reviewing the processes will help identify the changes that no one considered worth mentioning at the start of development but that end up causing numerous resource allocation problems down the road.
Rather than addressing each of these areas individually, regular project management processes can be adapted to cover all of them. It is very important that key issues are reviewed during the contract negotiation stage; in particular, detailed roles and responsibilities should be discussed. While no contract can address every detail, broad guidelines for expectations should be established.
An EDC project kickoff meeting should also be held to sort out details not addressed in the contract. This meeting can be part of a broader kickoff meeting or, better still, a separate meeting with stakeholders from the EDC vendor, sponsor, and CRO.
The meeting should blueprint every task on the project and identify who is involved. It should also answer the following five questions:
1. What are the various tasks involved in the project, and who is responsible for each? Examples include generating Requirements Specifications, approving design specifications, and performing alpha and beta testing. A RACI matrix, which defines roles and responsibilities, should be developed (see Table 1); an illustrative excerpt follows this list.
Table 1. Sample RACI Diagram for EDC Studies
2. What are the key documents that will be used in the project? Who generates them, who approves them, and how large are the documents? Samples should be distributed to give everyone on the project an idea of what the documents will look like and the amount of effort required to generate and approve them. The review and approval process should be clearly outlined and agreed upon prior to work commencing.
3. How will testing occur? At what points will prototypes be shared? What kinds of changes are considered in scope at each development stage? What are the expectations for alpha, beta, and UAT? Who is responsible for errors in the system after testing is complete?
4. How will new staff be trained/oriented to the project?
5. Who is responsible for creating the communication plan for the project? How will it be enforced?
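Returning to the first question, a RACI matrix assigns each task a Responsible, Accountable, Consulted, and Informed party. The excerpt below is purely illustrative; the assignments are hypothetical, and each project should negotiate its own.

Task                              Sponsor   CRO   EDC Vendor
Requirements Specifications          A       R        C
Design specifications                I       A        R
Alpha testing                        I       C        R
User Acceptance Testing              A       R        C
Post-production change requests      A       R        I

R = responsible, A = accountable, C = consulted, I = informed.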
A strong project manager with a solid understanding of how to manage technical vendors is important to a successful EDC project. A project manager who understands these nuances, and who accepts that such projects will inevitably encounter new and unique challenges, can proactively identify and mitigate risks.
With deliberate and clear communication, proper expectation setting, and detailed plans to address each problem area, EDC projects can live up to their promise.
Richard F. Pless is the director of data operations for the Lifecycle Sciences Group of ICON Clinical Research, 150 S. Saunders Road, Suite 100, Lake Forest, IL 60045, email: Richard.Pless@iconplc.com
1. R. Robertson and K. Gustafson, Cross Project Cohesion between Vendors, Global Biopharmaceutical Outsourcing Conference, Philadelphia, February 25-26, 2008.
2. A Guide to the Project Management Body of Knowledge, Third Edition (Project Management Institute, 2004).
3. Wikipedia, RACI diagram, http://en.wikipedia.org/wiki/RACI_diagram.
4. Applied Clinical Trials Web site, "EDC Vendor Selection: The Data Says...," http://appliedclinicaltrialsonline.findpharma.com/appliedclinicaltrials/EDC/EDC-Vendor-selection-The-Data-Says/ArticleStandard/Article/detail/550468?searchString=EDC%20Vendor%20Selection.
5. B. Anderson, "How We Fail to Use CROs Effectively (And What You Can Do About It)," Applied Clinical Trials, August 2008, 42-48.