
Collaborative AI Targets Perceptual Misses in Chest Radiography

Multimodal system incorporates eye gaze, functions as second reader


Pictured: Hien Van Nguyen, PhD; Carol Wu, MD; Akash Awasthi

Despite growing use of AI in radiology, one persistent challenge is that AI doesn’t know what the radiologist actually saw. A new system called Collaborative Radiology Expert (CoRaX) aims to address this gap. By integrating radiologists’ reports, eye gaze data and AI, CoRaX is designed to catch and correct missed findings on chest X-rays, the most common imaging exam.

CoRaX functions as a collaborative assistant, monitoring the radiologist’s interaction with the image, assessing their report and offering targeted feedback, particularly regarding perception errors.

“Observation of gaze is a free lunch; it doesn't interfere with radiologists’ workflow at all except that you have a small sensor in front of a screen and capture gaze points on the screen,” said Hien Van Nguyen, PhD, associate professor in the Department of Electrical and Computer Engineering at the University of Houston. “There’s no burden on the radiologist's workload, and eye gaze gives a lot of information to the system to collaborate and not overburden radiologists with too many decisions.”

Dr. Nguyen is co-creator of CoRaX and co-author of a retrospective study published in Radiology: Artificial Intelligence that evaluated the new tool. He said that using CoRaX as a second reader provides a more collaborative approach than standalone AI. The goal of the system is for the AI-radiologist team to be more effective and efficient than either AI or the radiologist alone.

“Currently, AI tools often analyze images independently from radiologists, and sometimes it takes more time to sort through the additional information, including false positives, from AI,” said study co-author Carol Wu, MD, a radiologist and professor at MD Anderson Cancer Center in Houston. “It's been a dream of mine to have a more collaborative system like CoRaX where the AI tool can incorporate information from the radiologist’s gaze and report to only alert radiologists when necessary.”


Simulating Misses to Measure Impact

To test CoRaX’s performance, Drs. Nguyen, Wu and colleagues used publicly available eye gaze tracking datasets, along with a classifier called ChexFormer, to predict abnormalities in chest X-ray images. They simulated errors using two approaches: random masking, which hides parts of the image to mimic general mistakes or oversights, and uncertainty-based masking, which targets harder-to-see findings to reflect real-world diagnostic challenges.

They applied these masks to five types of abnormalities: cardiomegaly, pleural effusion, atelectasis, lung opacity and edema. Random masking affected about 28% of the abnormalities, while uncertainty-based masking selectively altered nearly 44% of the more difficult cases.
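In code, the two masking strategies might look like the following minimal Python sketch. This is an illustration of the idea, not the authors’ implementation: the function names, the confidence threshold and the per-finding label vector are assumptions, and the miss_rate default simply echoes the roughly 28% figure reported above.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def random_mask(labels: np.ndarray, miss_rate: float = 0.28) -> np.ndarray:
    """Hide findings at random to mimic general mistakes or oversights."""
    drop = rng.random(labels.shape) < miss_rate
    return np.where(drop, 0, labels)

def uncertainty_mask(labels: np.ndarray, confidences: np.ndarray,
                     threshold: float = 0.6) -> np.ndarray:
    """Preferentially hide low-confidence (harder-to-see) findings.

    `confidences` would come from a classifier such as ChexFormer;
    the findings it is least sure about stand in for the subtle
    abnormalities most likely to be missed in practice.
    """
    hard = confidences < threshold
    return np.where(hard, 0, labels)

# One simulated study: cardiomegaly, pleural effusion, atelectasis,
# lung opacity, edema (1 = present in the ground-truth read).
labels = np.array([1, 1, 0, 1, 1])
confidences = np.array([0.9, 0.4, 0.8, 0.55, 0.7])
missed = uncertainty_mask(labels, confidences)  # -> [1, 0, 0, 0, 1]
```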

By processing eye gaze patterns, CoRaX’s Spatio-Temporal Abnormal Region Extractor module determines whether the radiologist fixated on an abnormal area, allowing the system to alert the radiologist to potentially overlooked regions. The system predicts the region of interest and assigns a label for each abnormality.
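A hedged sketch of that fixation check follows; the Fixation and Region structures, the bounding-box overlap test and the 0.3-second dwell threshold are illustrative stand-ins for whatever the module actually uses.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # gaze point in image coordinates
    y: float
    duration: float   # seconds the gaze dwelt here

@dataclass
class Region:
    label: str        # e.g., "cardiomegaly"
    x0: float
    y0: float
    x1: float
    y1: float

def dwell_time(region: Region, fixations: list[Fixation]) -> float:
    """Total fixation time falling inside the region's bounding box."""
    return sum(
        f.duration
        for f in fixations
        if region.x0 <= f.x <= region.x1 and region.y0 <= f.y <= region.y1
    )

def referrals(predicted: list[Region], fixations: list[Fixation],
              min_dwell: float = 0.3) -> list[Region]:
    """Flag predicted abnormal regions the reader likely never examined.

    Each flagged region is a referral -- a prompt to take a second
    look -- rather than a decision made on the radiologist's behalf.
    """
    return [r for r in predicted if dwell_time(r, fixations) < min_dwell]
```

In the full system, per the article’s description of CoRaX assessing both gaze and the written report, regions flagged this way would also be checked against the radiologist’s report before an alert is raised.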

“CoRaX is AI and the radiologist helping each other—basically, it's giving a referral, not a decision,” said Akash Awasthi, CoRaX co-creator, lead study author and PhD candidate at the University of Houston. “It’s a potential solution for striking a balance between over trust and under trust of AI.”

Performance in Error Detection

CoRaX corrected approximately 21% of the simulated errors in the random-masking dataset and nearly 35% in the uncertainty-masking dataset.

It accurately identified the correct regions of interest for missed abnormalities in 63% of the cases in the random error dataset and 58% in the uncertainty dataset. The system showed particularly strong performance in identifying missed cardiomegaly, the authors note.

In this retrospective analysis, radiologists accepted almost 86% of CoRaX’s AI referrals in the random masking dataset and about 78% in the uncertainty masking dataset. This suggests strong potential clinical relevance.

Dr. Nguyen and his team were impressed by the system’s ability to detect subtle findings and multiple missed abnormalities within a single case, particularly considering that satisfaction of search is a well-known contributor to diagnostic errors in radiology.

Across both datasets, CoRaX maintained relatively low false-positive rates—nine false referrals (4.6%) in the random masking dataset and 10 (6.4%) in the uncertainty masking dataset—while leaving 22 and 31 abnormalities uncorrected (false negatives), respectively.

“CoRaX is AI and the radiologist helping each other—basically, it's giving a referral, not a decision. It’s a potential solution for striking a balance between over trust and under trust of AI.”

— AKASH AWASTHI

Improvement Opportunities and Clinical Integration

“CoRaX is not an autonomous AI system like those widely present on the market,” wrote the authors of a related commentary in Radiology: Artificial Intelligence. “Those systems perform the work of a real radiologist in the background by presenting findings without any explanation, leaving the radiologist to accept or reject them.”

Still, the commentary authors cautioned that without strong collaboration, AI tools risk eroding radiologist trust. They also offered a note of caution about CoRaX’s detection limitations, suggesting areas for further refinement to maximize clinical benefit.

“Its correction performance was limited in subtler findings such as atelectasis and pleural effusion, in which increased false deferrals suggest challenges in confidently identifying low-contrast or overlapping abnormalities,” they wrote.

CoRaX’s creators recognize these limitations and emphasize that next steps include real-world validation with radiologists. They also envision future applications for 3D imaging.

The team acknowledged the study’s main limitations: it introduced synthetic perceptual errors rather than capturing the full range seen in clinical practice, encountered some minor technical misalignments, and has yet to be validated with radiologists in real-world clinical scenarios.

Although the current work emphasizes system development, CoRaX’s modular architecture enables future enhancements with more advanced models. “This approach lays the groundwork for robust, error-resistant AI systems and paves the way for future clinical trials and broader adoption,” the authors conclude.

For More Information

Access the Radiology: Artificial Intelligence study, “Collaborative Integration of AI and Human Expertise to Improve Detection of Chest Radiograph Abnormalities,” and the related editorial, “‘You’ll Never Look Alone’: Embedding Second-Look AI into the Radiologist’s Workflow.”

The research team also developed a web-based radiology education tool that provides personalized feedback for learners and uses the cursor as a proxy for eye gaze.
