
With artificial intelligence (AI) being introduced to more X-ray departments, a team led by researchers from the University of Ulster in Coleraine, Northern Ireland, has found that many radiographers in the United Kingdom have a limited understanding of how AI interprets images produced from X-rays, MRI, and computed tomography (CT) scans. With a national shortage of radiographers and radiologists, AI support in reporting "may help minimize the backlog of unreported images," the authors wrote in a study recently published in Radiography (2022; https://doi.org/10.1016/j.radi.2022.06.006).

  

"Modern AI is not well-understood by human end users," according to the investigators. "This may have ethical implications and impact human trust in these systems, due to over- and under-reliance."

 

With these potential repercussions in mind, the researchers sought to investigate reporting radiographers' perceptions of AI, gather information to help explain how they might interact with AI in the future, and identify features perceived as necessary for appropriate trust in these systems.

 

As the authors pointed out, studies on the use of clinical decision support tools in different fields of health care have found that a user's response to the information gained from AI may differ based on several factors, such as the experience level of the user and the complexity of the task.

 

"Excessive trust, decreased levels of experience, and increased complexity of a task have been shown to increase the likelihood of the clinician changing their mind from their initial decision to agree with the AI," they wrote, noting that, while studies have reported impressive and even "human-exceeding performances of AI-enabled tools when used in image interpretation tasks, no system in use or development is flawless."

 

Incorrect automated diagnoses have been shown to negatively impact the decision-making of expert and non-expert clinicians alike, the investigators continued, underscoring the importance of ensuring that all clinicians exercise appropriate caution and judgment, using AI to assist and augment decision-making rather than to solely guide it.

 

The primary impetus for taking on this study was the Society of Radiographers (SoR) guidance document, said co-author Sonyia McFadden, FCR, PhD, a senior lecturer in Diagnostic Radiography and Imaging in the School of Health Sciences at the University of Ulster. "We wanted to get a snapshot of radiographers' perceptions of AI use in image interpretation."

 

To achieve this goal, McFadden and her colleagues designed a Qualtrics survey, which was piloted by a team of U.K. radiographers with expertise in AI. The present study reported on the third section of the survey, which was open to reporting radiographers only.

 

Overall, 86 responses were received, with 53 respondents (62%) expressing confidence "in how an AI [system] reached its decision," the authors noted. Less than one-third of respondents, however, said they would feel confident in communicating the AI decision to stakeholders, such as patients, caregivers, and other health care practitioners.

 

Among respondents, 49 (57%) indicated that affirmation from AI would improve their confidence in a diagnosis, while 60 study participants (70%) noted that disagreement would prompt them to seek a second opinion.

 

"There is a moderate trust level in AI for image interpretation," the researchers wrote. "System performance data and AI visual explanations would increase trust."

 

Ultimately, the responses suggest that AI "will have a strong impact on reporting radiographers' decision making in the future," according to the authors. "Respondents are confident in how an AI [system] makes decisions, but less confident explaining this to others. Trust levels could be improved with explainable AI solutions."

 

In terms of implications for practice, the researchers said the survey helps clarify U.K. reporting radiographers' perceptions of AI used for image interpretation, "highlighting key issues with AI integration."

 

AI is being rapidly integrated into imaging equipment, "with little consideration as to how it influences radiography practice and frontline services," said McFadden, noting the findings suggest radiographers are generally confident in AI's decision-making, but less sure about explaining those decisions to others.

 

McFadden pointed out that further workforce education, along with increased transparency around AI systems, is needed to ensure radiographers can be confident in their diagnoses and know how to discuss the role of AI with patients and other health care practitioners.

 

Explainable AI ensures that the user is provided with an indication of how the system reached its decision in a way humans can comprehend, she said, such as a color-coded overlay of decision confidence levels on a radiographic image.
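
To illustrate the kind of visual explanation McFadden describes, here is a minimal sketch of a color-coded confidence overlay, assuming a model that outputs a per-pixel confidence map. The image, the confidence map, and all names below are synthetic placeholders for illustration only, not artifacts of the study or any particular AI system:

```python
# Minimal sketch: render a color-coded confidence overlay on a radiograph.
# All data here are synthetic placeholders; a real system would supply the
# radiograph and a per-pixel confidence map produced by its own model.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Placeholder grayscale "radiograph" (values in [0, 1]).
image = rng.random((256, 256))

# Placeholder per-pixel confidence map: a Gaussian blob standing in for a
# region the hypothetical model considers decision-relevant.
y, x = np.mgrid[0:256, 0:256]
confidence = np.exp(-(((x - 160) ** 2 + (y - 96) ** 2) / (2 * 30.0 ** 2)))

fig, ax = plt.subplots(figsize=(5, 5))
ax.imshow(image, cmap="gray")                  # base radiographic image
overlay = ax.imshow(confidence, cmap="jet",    # color-coded confidence levels
                    alpha=0.35, vmin=0.0, vmax=1.0)
fig.colorbar(overlay, ax=ax, label="Model confidence")
ax.set_axis_off()
plt.show()
```

A semi-transparent overlay of this kind lets the clinician see at a glance which regions drove the system's output, supporting the kind of effective, cautious interaction with AI that McFadden describes below.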

 

"The clinician, as the end user, should be able to interact effectively with the AI whilst exercising due caution. All stakeholders representing a range of clinical modalities with backgrounds in clinical practice, research, academia, and industry should be involved in developing future education and training at undergraduate and post-graduate level," explained McFadden. "As the use of AI becomes more prevalent, consideration should be given to the expectations of patients and service users in the role of AI in radiographic image interpretation."

 

Mark McGraw is a contributing writer.