In this video interview, Mark Melton, vice president of biospecimen data and operations at Slope, talks secure data transfer and risk mitigation.
In a recent video interview with Applied Clinical Trials, Mark Melton, vice president of biospecimen data and operations at Slope, discussed navigating data challenges in clinical trials, emphasizing the need to understand complex data sources and ensure a proper chain of custody for samples. Melton highlighted the importance of data mapping to standardize reporting across different labs and the necessity of secure data transfer to protect patient privacy.
A transcript of Melton’s conversation with ACT can be found below.
ACT: What are some best practices for data governance and integrity?
Melton: That's a great question. First and foremost, we have to leverage the tools that we have, and part of that is documentation of what data is coming, from where, in what format, at what cadence, things like the file names and how it's transmitted. You can look through all the regulatory statements, and Europe has always been at the forefront of this. It's going to sound like it's not related, but it actually is. A few years ago, Facebook lost a lawsuit in Europe around data privacy. That actually had impacts on clinical trials, because the concept is that if I'm coming in as a patient, I should have the ability to give my blood, my tissue, the things that are obviously inherent to my body, but not tie it to the data that's associated with me, meaning you shouldn't be able to look at that and see who my siblings are or what genetic issues I have. You shouldn't be able to do that.

The bulk of data right now is still emailed. It's kind of crazy, right? They'll take data and they'll email it out: here are the samples we collected, here's where they went, and then they'll send a separate email with a password. Well, if my email gets hacked, then whoever hacks it has access to both the password and the file. The first point is, I think, pretty logical: we control data movement. It has to be secure, with secure portals that have access restrictions and controls. We have to get away from emailing data; that's the first thing.

The second thing is governing how that data moves. There's something called data transfer specifications or agreements. The terms are different, but the basics of that document are saying, here's what I'm providing you, here's how often I'm providing it, here are the user restrictions, and here's what we're going to call the file. You would think that's pretty common practice. It's not as much as people would think. Any data movement has to fall under that document, and then, additionally, we have to have data management plans.
What data are we collecting? How does that data flow through different databases? What's the quality control of every system that touches that data? Who has access to it? What are the oversight mechanisms to ensure that if there is a breach, or if something falls outside that process, it's recognized in real time, then identified and controlled? The basis of it is planning. These documents have signatures, where those who are in charge have to sign off, but on top of the documentation, you have to have the process behind it: we have the governing documentation, but then we have the process to actually implement what we said in that document. You can start at how we move data around, then how we govern it, so how it's physically moved, what the restrictions are on that, how we manage that data, and then, ultimately, best practices: what do we do when things go wrong? Right now, while some of these concepts are generally known, if you were to go talk to someone else, they'll know what a data transfer spec is, but I would challenge that and say, well, when is it implemented? When is the data management plan implemented? How are you leveraging that with your vendors? The governance part is key. While AI and other technology come along, we have to fulfill our responsibility to govern data as best we can right now, while the technology continues to evolve to allow us to do that better. We have to mitigate risk in the interim.
For me, if I could just encourage everyone: follow the basic best practices. First, make sure you're transmitting data in a safe and secure way. The second is that you're governing that through official documentation. The third is that you have a comprehensive plan for anyone providing data or touching your data: Who has what access? How is it controlled? Then, ultimately, what do we do if something goes wrong? That is what we should all stick to as an industry, and hopefully we can continue to influence regulators to enforce those policies.