In this video interview, Mark Melton, vice president of biospecimen data and operations, Slope, discusses how AI can be used to streamline sample collection and tracking.
In a recent video interview with Applied Clinical Trials, Mark Melton, vice president of biospecimen data and operations, Slope, discussed navigating data challenges in clinical trials, emphasizing the need for understanding complex data sources and ensuring a proper chain of custody for samples. Melton highlighted the importance of data mapping to standardize reporting across different labs and the necessity of secure data transfer to protect patient privacy.
A transcript of Melton’s conversation with ACT can be found below.
ACT: Is there any potential for the use of artificial intelligence (AI) or machine learning (ML) when it comes to data management or sample collecting/tracking?
Melton: I think so. Obviously, we all know it's still pretty new. The biggest impact people talk about with AI is the analysis of those samples. I'll give an example, because anyone in the industry with some experience will recognize it. If I'm doing genomic testing of some sort—to keep it high level—a single sample could theoretically produce thousands of data points. Initially, the concept, which was smart, was that even if a trial is going really well and we're collecting all of this assay data, the ability for scientists to work through it and come to a result, both what it says about the treatment and what it could say about future treatments, is really difficult. The forethought in the industry, at least in my opinion, is that AI handles the analysis of that data, drawing conclusions and looking across all these different data sources to pinpoint new biomarkers, new treatments, and new approaches to treatment. From a sample management standpoint, though, we're still really early in that and still figuring out how it works.
Right now, between AI and machine learning, machine learning has been a little bit ahead in translating all the different data sources that come in to help track the samples. I'd like to see AI grow into some of what we do, whether that's programmatic work in SAS, R, and Python, the languages primarily used in the industry, to take these divergent sources and map them to a common data dictionary, and even when they don't map cleanly, try to make sense of it. AI has the ability to step in ahead of time and interpret that data before samples are ever tested, and we haven't really gotten there yet. There are a few companies trying it, but I think we should leverage it more around essentially being intuitive. It has a lot more analytical capability than, say, a human does, of course, so it could look at all these divergent data sources and ask: what do the patterns look like? How does each source report? Learning about the database reporting, learning about the common errors you see in it, and having the ability to flag those before you ever get to testing. If you're able to utilize artificial intelligence to do that, it will streamline all of the functions that the industry is currently hiring people to perform manually.
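To illustrate the kind of data-dictionary mapping and pre-testing error flagging Melton describes, below is a minimal sketch in Python. The lab names, field mappings, and validation rules are hypothetical placeholders for illustration only; they do not reflect Slope's actual data model or any specific lab's export format.

```python
# Minimal sketch: map divergent lab sample manifests to a common data
# dictionary and flag common errors before samples move on to testing.
# All field names, lab formats, and rules here are hypothetical examples.

from datetime import datetime

# Hypothetical common data dictionary: canonical field -> per-lab source column
FIELD_MAP = {
    "subject_id":      {"lab_a": "SUBJ",     "lab_b": "patient_number"},
    "sample_id":       {"lab_a": "BARCODE",  "lab_b": "specimen_id"},
    "collection_date": {"lab_a": "COLL_DT",  "lab_b": "draw_date"},
    "sample_type":     {"lab_a": "MATRIX",   "lab_b": "specimen_type"},
}

# Date formats commonly seen across different lab exports
DATE_FORMATS = ["%Y-%m-%d", "%d-%b-%Y", "%m/%d/%Y"]


def normalize_record(raw: dict, lab: str) -> tuple[dict, list[str]]:
    """Map one lab-specific row to the common dictionary and collect issues to flag."""
    record, issues = {}, []
    for field, sources in FIELD_MAP.items():
        value = raw.get(sources[lab])
        if value in (None, ""):
            issues.append(f"missing {field}")
        record[field] = value

    # Flag a common chain-of-custody error: unparseable or future-dated collection dates
    raw_date = record.get("collection_date")
    if raw_date:
        parsed = None
        for fmt in DATE_FORMATS:
            try:
                parsed = datetime.strptime(raw_date, fmt)
                break
            except ValueError:
                continue
        if parsed is None:
            issues.append(f"unrecognized date format: {raw_date!r}")
        elif parsed > datetime.now():
            issues.append(f"collection date in the future: {raw_date!r}")
        else:
            record["collection_date"] = parsed.date().isoformat()

    return record, issues


if __name__ == "__main__":
    # Two rows from two hypothetical labs reporting the same data differently
    rows = [
        ({"SUBJ": "1001", "BARCODE": "BX-77", "COLL_DT": "12-Mar-2024",
          "MATRIX": "plasma"}, "lab_a"),
        ({"patient_number": "1002", "specimen_id": "", "draw_date": "2024-13-40",
          "specimen_type": "serum"}, "lab_b"),
    ]
    for raw, lab in rows:
        record, issues = normalize_record(raw, lab)
        print(lab, record, "FLAGS:", issues if issues else "none")
```

In this sketch the rules are hand-written; the idea Melton raises is that an AI or ML layer could learn these per-lab reporting patterns and common errors from historical manifests rather than having them coded and maintained manually.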