There is much talk in the industry about moving toward eConsent; however, few have researched consent content restructuring and its impact on patient understanding. Janssen has taken the step of studying how patients interact with consent content and has generated insights into patient reactions. At ExL’s CROWN Congress, Cassandra Smith, Associate Director, Investigator & Patient Engagement at Janssen, discussed results from a study Janssen conducted with patients on consent content modification. ExL’s 9th Proactive GCP Compliance Conference takes place March 19-21, 2018 in Philadelphia.
Moe Alsumidaie: Is there a difference between patient centricity and patient engagement?
Cassandra Smith: There is a difference between patient centricity and patient engagement. Patient centricity focuses on better process design and on what is presented to the patient, whereas patient engagement changes the clinical trial platform to be more interactive with the patient, a back-and-forth exchange. For example, when looking at informed consent design and implementation, patient centricity examines our current process, a lengthy paper-based document full of technical terms, and recognizes the opportunity to make the document simpler by reducing the language and text. From a patient engagement perspective, it means changing that document from paper into an electronic format, so a patient can interact with it, read it, and get further explanation of the technical terms we introduce.
MA: What issues did you discover with the way consents are currently written?
CS: There were a few central themes that we identified in our patient focus groups.
The first one was that our informed consent document could be better organized. Information on a topic was found in bits and pieces on different pages. It was clear that patients wanted to understand everything we were trying to convey about a study aspect in one place. Patients also felt the language was verbose and drawn out; the consent had a lot of running paragraphs instead of tables or other formats that would make the text easier to digest. Additionally, patients wanted us to be clearer and more concise about what we were trying to tell them in the consent. Another central theme was that patients didn't want to have to read through the entire consent to get a good understanding of the study; they wanted an executive summary, a few paragraphs that would inform them, at a high level, about the study. This could help them make a quick decision about study participation and whether they wanted to look through the remainder of the consent.
MA: Can you describe the technology used to measure how the patients progressed through the consent?
CS: Janssen has an internal market research group called the Consumer Experience Center, which is well versed in conducting clinical and consumer studies for our products. When we first came to them with this proposal to help us study our informed consent, they came up with four tools they felt would help us measure outcomes. The first was an eye tracker; they gave patients the informed consent to read and tracked their eye movements, so they could see how patients read the document, which words they focused on, and how long it took them to read it. This data collection method confirmed that patients read some areas more thoroughly than others, and it highlighted areas where patients got hung up and returned to multiple times. The group also used an emotional dial paired with the eye tracker software. As patients read through the informed consent document and came upon words, phrases, or sections that evoked an emotional response, they could turn the dial to green (to show they felt positive about what they read) or to red (if they felt negative or didn't like something). Through this methodology, we quantitatively identified sections that elicited negative and positive feelings.
We also collected data via a consent comprehension survey, as we wanted to make sure that patients walked away with a basic understanding of the study. For example, say patients have to come into the clinic for 10 visits over a 12-week study; we wanted to make sure they walked away understanding that and could recall that information. We did that for both the baseline informed consent and the revised one, to see what the delta was and whether they understood one better than the other. The last data collection tool we used was emojis, which measured how patients felt when they reviewed the consent form. This measurement was set up to be more qualitative than quantitative; we wanted to show a directional change in how patients felt during review of the initial document versus the revised document.
MA: How did you use these learnings to modify the consent?
CS: We agreed right away with what the patients were saying. In addition to the patient focus groups, we hired a health literacy agency to hear the patient feedback and then rewrite the documents for us based on what they heard. The agency observed that a lot of information was broken out into different areas, so they aggregated related topics in the informed consent and put them in the same place. We also cut down on some of the text; for example, we usually discuss the risks to an unborn fetus in many places in the consent, but in the revised consent we state it once, in the risks section. Additionally, we simplified the language in the revised document; originally, we would explain consent components in three sentences, whereas in the revised consent we cut each component down to one simple sentence and focused on exactly what needed to be said to the patient. We also reorganized some of our content; instead of explaining components in a long paragraph, we presented them in tables with bullets.
MA: What results did you observe after you modified the consent?
CS: Our measurements showed that patients felt more positive toward the revised consent. The average comprehension score for the revised informed consent increased by 7%, but, more importantly, the lowest score improved by 27%, which tells us that more people walked away with a better understanding of the document than with the baseline consent. Additionally, listening in the back room during the patient focus groups, we observed patients having a calmer reaction to the revised document. So, the revised document produced a more positive response and improved comprehension. We learned from this study that we need to simplify consent content so that patients can better understand it and the requirements of study participation.
Moe Alsumidaie, MBA, MSF, is Chief Data Scientist at Annex Clinical and an Editorial Advisory Board member of, and regular contributor to, Applied Clinical Trials.