
AI Helps X-Rays Do More

One of the oldest imaging tools continues to play a vital role in modern radiology


Daiju Ueda, MD, PhD

Radiographic images, particularly chest X-rays, remain a cornerstone of medical diagnostics and are essential tools for early and accurate disease diagnosis. But can X-rays do more?

With the help of AI, the answer is ‘yes’.

“AI can mine chest X-rays for signals that go beyond their traditional use of looking at the lungs and heart, turning a single exposure into a multi-condition screening,” explained Daiju Ueda, MD, PhD, a radiologist in the Department of Artificial Intelligence at Osaka Metropolitan University Graduate School of Medicine in Japan.

Dr. Ueda recently co-authored a Radiology: Cardiothoracic Imaging study on using chest X-rays to detect hepatic steatosis, a condition characterized by the accumulation of fat in the liver that affects an estimated 25% of the world’s population.

“Chest X-rays are ubiquitous, low cost and already capture part of the liver,” he said. “If AI can extract liver-related signals from them, we can enable opportunistic screening without extra scans.” 

According to Dr. Ueda, the study set out to answer a relatively simple question: can a deep learning model read a routine frontal chest X-ray and flag hepatic steatosis?

The study retrospectively collected 6,599 posteroanterior chest X-rays linked to controlled attenuation parameter (CAP) exams from 4,414 patients at two institutions. While one site supplied training, tuning and internal-test images, the other served as an external test set.


Developing, Training and Performance

Candidate AI models for diagnosing hepatic steatosis were developed using commercially available convolutional neural network (CNN) architectures, a class of deep learning models widely used for image recognition and computer vision tasks.

The CNNs were trained and tuned by updating all parameters of models pretrained on ImageNet, one of the largest and most widely used datasets in computer vision.

The researchers determined the optimal threshold for classifying images as positive or negative for steatosis by maximizing the Youden index on the tuning dataset. This index, a standard measure of a diagnostic tool’s effectiveness, identifies the threshold at which sensitivity and specificity are best balanced.
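Youden-index threshold selection amounts to maximizing J = sensitivity + specificity − 1 over candidate cutoffs. A minimal sketch, assuming NumPy; the scores and labels are illustrative stand-ins for the tuning-set model outputs and CAP-derived ground truth, not data from the study.

```python
# Pick the classification threshold that maximizes the Youden index
# J = sensitivity + specificity - 1 on a tuning set.
import numpy as np

def youden_threshold(scores, labels):
    """Return (threshold, J) maximizing J over the observed scores."""
    best_t, best_j = None, -1.0
    for t in np.unique(scores):          # candidate cutoffs
        pred = scores >= t               # classify positive at/above t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical model scores and CAP-derived labels.
scores = np.array([0.1, 0.3, 0.35, 0.6, 0.7, 0.9])
labels = np.array([0, 0, 1, 0, 1, 1])
t, j = youden_threshold(scores, labels)
```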

During training, the AI models received only X-rays labeled as being from patients with or without steatosis (according to CAP value) as input and identified radiographic features that could predict hepatic steatosis.

The deep learning model demonstrated good performance for detecting hepatic steatosis in both the internal (AUC 0.83) and external (AUC 0.82) test sets, with accuracy, sensitivity and specificity of 77%, 68% and 82%, respectively, on the internal test images and 76% for all three metrics on the external test images.
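The reported metrics can be computed directly from model scores and labels. A minimal sketch, assuming NumPy, with AUC computed via its rank (Mann-Whitney) interpretation; the scores, labels, and threshold below are illustrative, not the study’s data.

```python
# Compute AUC, accuracy, sensitivity, and specificity from scores/labels.
import numpy as np

def auc(scores, labels):
    """AUC as the probability that a random positive outranks a
    random negative (Mann-Whitney U); ties count half."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def acc_sens_spec(scores, labels, threshold):
    pred = scores >= threshold
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    return (tp + tn) / len(labels), tp / (tp + fn), tn / (tn + fp)

# Hypothetical test-set scores and labels.
scores = np.array([0.2, 0.4, 0.55, 0.6, 0.8, 0.9])
labels = np.array([0, 0, 1, 0, 1, 1])
a = auc(scores, labels)
acc, sens, spec = acc_sens_spec(scores, labels, 0.5)
```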

In an analysis that used only one exam per patient, whether they underwent multiple CAP exams or just one, AUCs were 0.86 for the internal test images and 0.83 for the external test images. Saliency maps highlighted regions at or below the diaphragm in 74.2% of external test images, consistent with the liver/diaphragm area. 

“These findings support opportunistic screening from existing chest X-rays, adding value without extra scanner time,” Dr. Ueda said. “A tool like this could triage patients who should proceed to dedicated liver assessment, helping radiology contribute earlier to metabolic liver disease care pathways.” 


Model Should Raise Suspicion, Not Stand Alone

While prior research has targeted the liver using US, CT or MRI, Dr. Ueda’s work is unique in that it evaluated standard chest X-rays, which are widely available and reproducible, and demonstrated external validation.

“To our knowledge, this is the first report to show that a chest X-ray-based model can detect steatosis with good performance,” he said.

Although this study may be groundbreaking, Dr. Ueda is quick to point out that more work needs to be done. This includes conducting prospective, multi-center validation; calibration across varying prevalence rates and ethnicities; and workflow studies.

It also means integrating clinical and lab data into the model to improve its overall performance and reduce false positives.

“If validated prospectively, this approach could identify at-risk patients earlier, prompt lifestyle counseling or definitive testing, and reduce unnecessary CT/MRI in low-risk groups, particularly in resource-limited settings,” Dr. Ueda noted.

Until that happens, Dr. Ueda said the model should be used to raise suspicion rather than make a stand‑alone diagnosis.

Although a chest X-ray won’t replace dedicated liver imaging, with AI it can serve as an inexpensive, noninvasive and near-universal front-door triage.

“AI expands X-ray’s role from diagnostic confirmation to continuous, population-scale risk stratification,” Dr. Ueda concluded. “While it may not exactly be a comeback story, it definitely gives a trusted and mature modality a new lease on life as a scalable screening platform enhanced by software.”

For More Information

Access the Radiology: Cardiothoracic Imaging study, “Performance of a Chest Radiograph-based Deep Learning Model for Detecting Hepatic Steatosis.”

Read previous RSNA News stories about incidental findings: