
AI Simplifies Medical Scans: Reports Now Twice as Understandable for Patients

by Dr. Jennifer Chen

Artificial intelligence is showing promise in making complex medical scan reports more accessible to patients, potentially reducing anxiety and improving understanding of their health conditions. A new systematic review and meta-analysis from the University of Sheffield suggests that rewriting radiology reports – for X-rays, CT scans, and MRIs – using AI systems like ChatGPT can nearly double patient comprehension without sacrificing clinical accuracy.

The research, published in The Lancet Digital Health, found that AI-rewritten reports lowered the reading level from that typically expected of a university student to that of a student aged 11-13. This simplification could be particularly beneficial for individuals with lower health literacy or those for whom English is not their first language.
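
For context, "reading level" in studies like this is usually estimated with standard readability formulas such as the Flesch-Kincaid grade level, which maps text to an approximate school grade (grades 6-8 correspond roughly to ages 11-13). The short Python sketch below illustrates how such a score is calculated; it assumes the third-party textstat package and uses invented example report text, and is not the methodology used in the Sheffield review.

```python
# Rough illustration of how readability is commonly quantified.
# Assumes the third-party "textstat" package; the example reports
# below are invented for demonstration purposes only.
import textstat

original_report = (
    "Mild bibasal atelectasis. No focal consolidation, pleural effusion, "
    "or pneumothorax. Cardiomediastinal silhouette within normal limits."
)

simplified_report = (
    "Small parts of the lower lungs are slightly less inflated than usual. "
    "There is no sign of infection, fluid around the lungs, or a collapsed lung. "
    "The heart appears a normal size."
)

for label, text in [("Original", original_report), ("Simplified", simplified_report)]:
    # Flesch-Kincaid grade roughly corresponds to a US school grade
    # (grade 6-8 is about ages 11-13).
    grade = textstat.flesch_kincaid_grade(text)
    print(f"{label}: approximate reading grade level {grade:.1f}")
```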

Researchers analyzed 38 studies published between 2022 and 2025, encompassing more than 12,000 radiology reports simplified using AI. Patients, members of the public, and clinicians evaluated both how well the simplified reports were understood and whether the AI changes affected the clinical accuracy of the information.

The Challenge of Traditional Radiology Reports

Traditionally, radiology reports are written for physicians, not patients. They are often filled with specialized terminology and abbreviations that can be difficult for non-medical professionals to decipher. However, increasing patient access to these reports – driven by initiatives like the NHS App and policies promoting transparency – highlights the need for clearer communication.

“The fundamental issue with these reports is they’re not written with patients in mind,” explains Dr. Samer Alabed, Senior Clinical Research Fellow at the University of Sheffield and Honorary Consultant Cardio Radiologist at Sheffield Teaching Hospitals NHS Foundation Trust. “They are often filled with technical jargon and abbreviations that can easily be misunderstood, leading to unnecessary anxiety, false reassurance and confusion.”

This lack of clarity can disproportionately affect patients with lower health literacy or those who are not native English speakers. Clinicians often find themselves spending valuable appointment time explaining report terminology instead of focusing on treatment and care decisions. Even small reductions in this explanation time could have a significant positive impact on healthcare systems.

Balancing Clarity with Accuracy

While the study demonstrates a significant improvement in patient understanding, it also acknowledges the importance of maintaining clinical accuracy. Review by doctors found that approximately one percent of the AI-simplified reports contained errors, such as an incorrect diagnosis. This underscores the need for careful oversight and quality control when implementing AI-assisted report simplification.

It’s important to note that none of the 38 studies reviewed were conducted within the UK’s National Health Service (NHS). Dr. Alabed and his team are now focused on addressing this gap by conducting research within NHS clinical settings.

“This research has highlighted several key priorities,” says Dr. Alabed. “The most important is the need for real world testing in NHS clinical workflows to properly assess safety, efficiency, and patient outcomes.”

The researchers emphasize the importance of “human-oversight models,” where clinicians review and approve AI-generated explanations before they are shared with patients. The ultimate goal is not to replace clinicians, but to enhance communication and promote more equitable healthcare.
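
As a rough illustration of what such a human-oversight model could look like in software, the sketch below keeps a clinician approval step between the AI draft and the patient. The function names and the `call_llm` and `clinician_review` hooks are hypothetical placeholders, not part of any system described in the study.

```python
# Minimal sketch of a "human-oversight" workflow: the AI drafts a
# plain-language version, but nothing reaches the patient until a
# clinician has reviewed and approved it. All names here are
# hypothetical placeholders, not part of any named product.

SIMPLIFY_PROMPT = (
    "Rewrite the following radiology report in plain English at roughly "
    "a reading age of 11-13. Do not add, remove, or change any findings."
)

def draft_simplified_report(report_text: str, call_llm) -> str:
    """Ask an LLM (passed in as `call_llm`) for a plain-language draft."""
    return call_llm(f"{SIMPLIFY_PROMPT}\n\nReport:\n{report_text}")

def release_to_patient(report_text: str, call_llm, clinician_review) -> str | None:
    """Only return the simplified report once a clinician approves it."""
    draft = draft_simplified_report(report_text, call_llm)
    approved, corrected_draft = clinician_review(original=report_text, draft=draft)
    if not approved:
        return None  # fall back to the clinician explaining the report directly
    return corrected_draft
```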

Looking Ahead: AI as a Communication Tool

The development of tools like RadGPT, built on large language models at Stanford University, further illustrates the potential of AI in this area. RadGPT can extract key concepts from a radiologist's report and provide plain-language explanations, along with suggestions for follow-up questions. This technology aims to empower patients to better understand their scan results and engage more actively in their care.
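
Purely as an illustration of that general approach, and not RadGPT's actual interface or prompts, a generic large language model could be asked to return key terms, plain-language explanations, and suggested follow-up questions in a structured form:

```python
# Illustrative only: a generic prompt for concept extraction and
# follow-up question suggestions, in the spirit of tools like RadGPT.
# This is not RadGPT's actual interface; `call_llm` is a hypothetical
# function that sends the prompt to a model and returns its reply.
import json

EXPLAIN_PROMPT = """\
From the radiology report below, return JSON with three fields:
  "concepts": the key medical terms found in the report,
  "explanations": a plain-English explanation of each term,
  "questions": three follow-up questions the patient could ask their doctor.
Report:
{report}
"""

def explain_report(report_text: str, call_llm) -> dict:
    raw = call_llm(EXPLAIN_PROMPT.format(report=report_text))
    return json.loads(raw)  # assumes the model replies with valid JSON
```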

As patient access to medical records continues to expand, AI-assisted explanations could become a standard component of radiology reports, fostering greater transparency and trust within healthcare systems. However, ongoing research and careful implementation are crucial to ensure both clarity and accuracy, ultimately benefiting both patients and clinicians.

More information

Samer Alabed et al., Large language models for simplifying radiology reports: a systematic review and meta-analysis of patient, public, and clinician evaluations, The Lancet Digital Health (2026). DOI: 10.1016/j.landig.2025.100960
