
Lauren Wilcox Discusses Participatory Artificial Intelligence (AI) in the Medical Field

Update 4/6/2021: This article has been updated to clarify quotes and to distinguish work done at Google from work done at Georgia Tech.

UCI’s Department of Information & Computer Sciences welcomed Georgia Tech associate professor Lauren Wilcox to hold an informative seminar on the evolution of computing technology in health care on March 5.

Wilcox, who was previously a research lead at Google Health, opened the seminar with a brief introduction to recent and emerging advancements that have driven research in Artificial Intelligence (AI).

“While we see a lot of potential for AI to be helpful, we also see challenges. For example, in the worst case, we find that system performance, once deployed, doesn’t necessarily improve overall accuracy — as well as many other challenges,” Wilcox said, further explaining the role of AI research in the medical field.

According to Wilcox, we need to research “sociotechnical systems” to account for the inextricable relationship between the “social” and “technical” components of a system.

“We know technology shapes workflows and mediates human communication; it can disrupt social norms and contexts of care. [These factors] can impact how people use technology,” Wilcox said.

Wilcox then gave the audience a real-world example of how a deep learning system performed once deployed in clinics. The system detects diabetic retinopathy, a “complication of diabetes that affects the eyes,” and the researchers observed nurses’ use of it for eight months.

To better understand and refine the screening technology, researchers interviewed the nurses and monitored their use of it. Patients who agreed to take part in the trial run of the technology also consented to a study, which included further observation to determine the effectiveness of the clinic’s method of gathering information.

According to Wilcox, one finding of the study had to do with gradability: the process of “reading an image” and then “making assessments” of it, which can be affected by factors such as image quality.

“These factors are important because they affected system performance as a whole. We also studied some protocol problems due to emphasis on performance in the lab. Together, these shortcomings affected trust in the system and willingness [for a patient] to use [these systems],” Wilcox said. 

To combat this distrust of AI systems among patients, Wilcox said that, as a result of the study, the protocols around the system’s use will be updated to better meet the needs of patients and clinicians. This will occur alongside more human-centered studies that take place before, during and after the deployment of the AI system.

Wilcox presented SPIRIT-AI, a set of reporting guidelines for clinical trial protocols that involve AI interventions, which contains recommendations about what such protocols should include. Its main recommendation is to provide a “clear description” of things like the setting in which the AI is evaluated and details about how humans interact with the AI.

Wilcox went on to present some of her work at Georgia Tech. Children’s Healthcare of Atlanta (CHOA) partnered with Wilcox’s lab for five years to improve “patient engagement” in health care. The research participants included more than 59 families and 34 clinical caregivers.

“We looked at how patients, family members and clinicians worked together to manage a patient’s health. When doing field study data analysis, we went bottom up from broad interviews and observation notes to progressively higher levels of meaning,” Wilcox said.

Phase two of the study focused primarily on “patient portal log analysis, surveys and interviews” and took place over a 19-month period. The second phase documented how both teen patients and their parents handle health information.

“So taking these findings together, we saw some key themes emerge from this study. Both teens and parents had unmet needs related to information access and communication — often mentioning diagnostic radiology data. Families also faced difficulties understanding and communicating about the patient’s felt experience,” Wilcox said.

After phase two, phase three shifted the focus to developing “patient-friendly” tools for navigating radiology reports. To understand patients’ information needs, the researchers analyzed more than 1,600 posts from online health forums, including MedHelp, HealthBoards and Cancer Survivors Network.

Next was domain knowledge elicitation, in which Wilcox and her lab sampled common phrases found in the reports, such as “clinical correlation is needed” and “cannot be completely excluded.” Radiologists use these phrases to express that more tests or other clinical work are needed before they can come to a conclusion; the phrases were then explained to patients in plain language so as not to confuse them.

“At the end of these two studies, we found 13 concept categories that indicated important functions of the radiology reports that could be supported through design. [The question was] how do we make [these reports] more understandable [for patients]? So, we developed a prototype called Rapport,” Wilcox said.

Rapport let patients and their families review their radiology reports with tools to help them understand phrases that might be confusing, and it integrated multiple functions that were easy for patients to use. These include displaying high-quality images of their exams, providing explanations that clarify the “medical jargon” present in their reports and offering a tab where patients can jot down questions or personal comments for their clinician to see.

Once Rapport was up and running, phase three also included patients testing the system. Both teens and parents were able to use it during “clinical consultation to discuss [their] results,” as Wilcox said it was important for the patients to be “unguided” and encouraged to “think aloud” when using the system.

“In the clinic, we found improved patient-to-clinician communication and patient engagement. It led to really dynamic and dispersed reports and interactions. Several patients and parents told us that they appreciated [the system] … it met their needs,” Wilcox said.

Aside from Rapport, the team created Visual Observations of Daily Living (Visual ODLs), which built upon previous research on ODLs, to expose “social and environmental context.” Visual ODLs helped the researchers understand patients’ health status and behavioral indicators through questions such as “How am I feeling?” and “What have I done?”

Next, Wilcox told the audience about bringing patients and family members into the process of creating Visual ODLs.

“Our goal is for [the patient] to lead us and create a narrative … in order to tell us about their illness experience. You ask them what their pain level is and they say, ‘It’s fine,’ ‘It’s OK.’ We used Co-Design to reconstruct their daily experiences,” Wilcox said. This approach gave patients a better way to express themselves; the goal was to reconstruct their experiences in the design sessions to better understand how to design communication tools based on what patients actually experience.

Co-Design, in which a series of drawn pictures is used to show how the patient is feeling, can be used with teen patients so that they can properly explain how they are feeling to the clinician. According to Wilcox, it encourages “recognition over recall” and allows clinicians to “resolve discrepant teen/parent reports about the patient’s health status.” To implement the method, Wilcox and her team collaborated with patients on the drawn pictures.

Along with Co-Design, Wilcox introduced other ways of helping teen patients articulate what they are feeling, such as “diary probes.” Diary probes are part of a diary study that involves “two-week take-home diary kits” to help teens pay attention to, and explain, how they are feeling.

“We provided diaries to families to use to log their experiences so that there was some generative input too. We found that, particularly, for things such as emotion logging, teens wanted expressive capabilities; they did not want to tell you how they were feeling based off a scale. Some wanted to take photos or draw pictures of how they were feeling, so this input ended up being really important,” Wilcox said.

Wilcox then introduced Co-op, an in-progress mobile system that would allow teen patients to document how they are feeling through text, images and drawings. According to Wilcox, Co-op will help “associate illness experience with activity,” and it is designed to allow patients to explain their observations without “burdening them.”

To close the seminar, Wilcox noted some open research questions in human-centered AI that she considered interesting topics for this phase of the project: user trust, consent and the question of how to make AI a regular member of the “care team.”

When talking about research on new advancements down the road, Wilcox raised the question of how researchers can make sure that these advancements are “designed to preserve privacy, to put family and community in the center.” To make the right choices, the patient must be aware of their options; however, multiple factors can complicate decision-making, such as differences in knowledge among the AI, the doctor and the patient. This is one of the reasons why the subject of AI still needs continuous research.

Kealani Quijano is a Campus News Intern for the winter 2021 quarter. She can be reached at kaquijan@uci.edu.