Eye on AI: The Potential and Reality of AI in Clinical Application
By McKenna Bryant
Editor’s note: The potential of emerging artificial intelligence (AI) technologies and their clinical applications to expand the capabilities of radiology is the subject of much excitement and anticipation within medical imaging. This column is adapted from two AR Connect Expert Discussions at RSNA 2019, moderated by Lawrence Tanenbaum, MD, FACR, of RadNet, Inc.
As artificial intelligence (AI) technologies move from theory to application, the field will become more dynamic and engaging than ever. New and varied AI tools can improve scan speed, image quality, and radiologist workflow and productivity. The clinical implications, however, are less clear, and it can be hard to discern which technologies and platforms are ready for practice and which are little more than hype.
AI in Neuroimaging
Quantitative volumetric MRI tools, such as CorTechs Labs’ NeuroQuant™ (NQ) and LesionQuant™ (LQ), as well as icometrix’s icobrain, promise to add value for clinical indications such as dementia, epilepsy, multiple sclerosis (MS), traumatic brain injury, and pediatric brain development, said Suzie Bash, MD, Medical Director of Neuroradiology at San Fernando Valley Interventional Radiology at RadNet, and one of four key opinion leaders who participated in the discussion.
All three technologies segment substructures of the brain, then color code and label them before calculating their individual volumes and comparing them to a large normative database. The technology is particularly useful for longitudinal tracking, as well as for reducing subjectivity and inter-reader variation in neuroradiology reports, Dr Bash added.
In patients with MS, Dr Bash noted, the LesionQuant and icobrain products calculate intracranial plaque volume and assess disease progression by identifying the volume of new, enlarging, or shrinking plaques. In patients with symptoms of dementia, meanwhile, NeuroQuant and icobrain identify the lobar predilection and statistical significance of regional brain atrophy, helping radiologists differentiate among mild cognitive impairment, Alzheimer’s disease, frontotemporal dementia, and Lewy body dementia.
“This kind of software is very useful because it actually quantifies substructures that are relevant to different clinical conditions,” said Dr Bash. “It will calculate the hippocampal volumes and tell you whether that degree of atrophy is commensurate with age or if it’s statistically significant for age. Our referrers love it. Once they use it, they don’t want to go back to not having it. It really does add a lot of value to reports.”
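To make the idea of comparing a measured volume against a normative database concrete, the sketch below shows a minimal, hypothetical age-matched z-score calculation in Python. The function name and the normative numbers are invented for illustration and are not drawn from NeuroQuant, LesionQuant, or icobrain.

```python
# Hypothetical illustration: comparing a measured hippocampal volume against
# an age-matched normative reference, similar in spirit to what quantitative
# volumetric tools report. The normative values below are invented for the
# example and do not come from any vendor database.

def volume_z_score(measured_ml: float, normative_mean_ml: float, normative_sd_ml: float) -> float:
    """Return the z-score of a measured volume relative to a normative distribution."""
    return (measured_ml - normative_mean_ml) / normative_sd_ml

# Example: a patient with a total hippocampal volume of 5.4 mL, compared
# against made-up age-matched norms (mean 6.8 mL, SD 0.6 mL).
z = volume_z_score(measured_ml=5.4, normative_mean_ml=6.8, normative_sd_ml=0.6)
print(f"z-score: {z:.1f}")  # roughly -2.3

# A z-score near 0 is commensurate with age; a strongly negative z-score flags
# regional atrophy that is statistically significant for age.
```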
AI in Mammography
Alyssa Watanabe, MD, meanwhile, highlighted AI-based tools designed to improve the sensitivity and specificity of mammography. These include CureMetrix’s cmAssist®, an investigational computer-aided detection (CAD) software with the potential to quickly detect, quantify, and classify anomalies on a mammogram as suspicious. It assesses all views of a digital 2D mammogram, works across all breast densities, masses, and calcifications, and highlights any anomalies that require further analysis.
“By using artificial intelligence, our company developed a CAD that doesn’t present a lot of false flags, and is thus more valuable for the radiologist,” said Dr Watanabe, a Radiologist, Clinical Associate Professor at the USC Keck School of Medicine, and Chief Medical Officer of CureMetrix, a global company that develops AI software for mammography.
CureMetrix’s cmAssist may also support earlier detection. “One study,” Dr Watanabe noted, “showed that our CAD could help radiologists detect breast cancers up to five years sooner than current detection methods.”
CureMetrix has also developed a product, the FDA-cleared cmTriage™, to triage mammography studies by assisting in sorting the worklist and enabling radiologists to focus first on suspicious cases, which are pre-read and flagged by the AI algorithm.
“Triage is extremely valuable to a breast imager because it boosts efficiency. We can segregate 40 percent to 60 percent of the worklist and flag suspicious studies with higher levels of accuracy compared to current methods,” said Dr Watanabe. “Radiologists are able to read 30 percent to 40 percent faster with a prioritized worklist over the traditional ‘first-in, first-out’ method. This technology is available now, and I expect that within the next few years, all X-rays may be presorted and pre-read by a machine.”
She said one study took three months of data from a breast center and compared the outcomes of the actual workflow with those of a simulated triage workflow. With the triage workflow, 25 percent of those cases could have been sorted out as less suspicious, allowing the radiologist to focus on the rest of the worklist. “This gave the radiologist an extra hour every day when they could do something else, like take a lunch break,” Dr Watanabe said.
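As a rough illustration of how triage reorders a worklist relative to a pure “first-in, first-out” queue, the Python sketch below sorts studies by an AI-assigned suspicion flag and then by arrival time. The Study class, threshold, and scores are hypothetical and do not represent the cmTriage or Aidoc interfaces.

```python
# Hypothetical sketch of AI-assisted worklist triage, assuming each study
# arrives with an acquisition time and an AI-assigned suspicion score.
# The data model and cutoff are invented for illustration.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Study:
    accession: str
    acquired_at: datetime
    suspicion_score: float  # 0.0 (benign-appearing) to 1.0 (highly suspicious)

SUSPICIOUS_THRESHOLD = 0.5  # illustrative cutoff, not a validated operating point

def prioritized_worklist(studies: list[Study]) -> list[Study]:
    """Flagged (suspicious) studies first; within each group, first-in, first-out."""
    return sorted(
        studies,
        key=lambda s: (s.suspicion_score < SUSPICIOUS_THRESHOLD, s.acquired_at),
    )

worklist = [
    Study("A1001", datetime(2019, 12, 2, 8, 5), 0.12),
    Study("A1002", datetime(2019, 12, 2, 8, 20), 0.91),
    Study("A1003", datetime(2019, 12, 2, 8, 40), 0.07),
]
for study in prioritized_worklist(worklist):
    print(study.accession, study.suspicion_score)
# A1002 is read first even though it arrived after A1001, unlike a pure FIFO queue.
```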
Kevin Lyman, Chief Executive Officer of Enlitic, a San Francisco-based company that launched a comprehensive AI software platform at RSNA 2019, also highlighted the ability of AI to help prioritize studies, a capability that is equally important in diagnostic radiography.
“Radiologists are inundated with too many scans, especially X-rays that take time [to evaluate] at the expense of higher order modalities that need more focus,” said Lyman. “Ultimately, X-rays aren’t slowing down in their volume, but they’ll definitely become less of a focus for human attention. AI is able to help.”
AI in Under-reporting Situations
Artificial intelligence can also add value where clinically relevant findings routinely go unreported.
“There is potential value in using mammography to screen for heart disease, which is actually a much more significant cause of death in women than breast cancer,” Dr Watanabe said, adding that mammographers rarely describe breast arterial calcifications, despite their known association with heart and peripheral vascular disease. Her team developed software to quantify and score these calcifications with the so-called Bradley score.
Chest X-rays offer a similar opportunity for AI in these situations. “AI will often pick up aortic calcifications, but radiologists might choose not to mention it in the report,” Lyman said. “But there’s evidence that having knowledge of all of those chronic conditions does lead to better quality of care. Knowledge is power.”
AI in the Emergency Department
Triaging studies is also a goal for Melissa Davis, MD, MBA, most recently Assistant Professor of Radiology and Biomedical Imaging at the Yale School of Medicine.
Dr Davis’ team at Yale has been working with Aidoc for the past three years. The company’s AI product screens for acute critical findings, such as intracranial hemorrhage on noncontrast head CT scans. Yale implemented the platform in its emergency department and neuroradiology reading rooms, de-emphasizing exclusive “first-in, first-out” prioritization in favor of flagging positive cases for immediate review. Before the triage platform, radiologists relied on technologists or residents to flag cases with acute findings. The new system automatically marks suspicious studies in the worklist and notifies radiologists of those with suspected hemorrhage that should be read first. Aidoc’s platforms for cervical spine fracture and pulmonary embolism detection have also recently been deployed.
The Yale team has evaluated the effect of triage on turnaround times (the time from order placement in the health record to the appearance of the final report in the record), patient length of stay in the ED, and more. “In our level one trauma center, we actually had a significant reduction in turnaround time after the platform was implemented,” said Dr Davis, who is also a Clinical Lead at Yale’s Center for Outcomes Research & Evaluation and leader of clinical operations at Nines AI.
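For clarity, here is a minimal sketch of the turnaround-time metric itself, computed from a few invented order and final-report timestamps; the numbers are illustrative only and are not Yale’s data.

```python
# Hypothetical sketch of the turnaround-time metric described above: the time
# from order placement in the health record to the final report appearing in it.
# The timestamps are invented for illustration.

from datetime import datetime
from statistics import median

orders_and_reports = [
    (datetime(2019, 11, 4, 9, 10), datetime(2019, 11, 4, 9, 52)),
    (datetime(2019, 11, 4, 10, 0), datetime(2019, 11, 4, 10, 31)),
    (datetime(2019, 11, 4, 11, 15), datetime(2019, 11, 4, 12, 5)),
]

turnaround_minutes = [
    (report - order).total_seconds() / 60 for order, report in orders_and_reports
]
print(f"median turnaround: {median(turnaround_minutes):.0f} minutes")
# Comparing this distribution before and after triage deployment is how a
# reduction in turnaround time would be demonstrated.
```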
The Yale team worked closely with Aidoc to ensure the platform would work seamlessly with current workflows, while still giving radiologists control over their workday. “It flags studies for us so we can choose to interact with it or not, then I can reprioritize my worklist myself,” said Dr Davis. “Right now, there’s about a 90 percent interaction with those notifications. We’re changing the behavior of our radiologists without physically moving things in the report.”
Evaluating AI Technologies
Lyman said facilities should first assess their needs to evaluate which AI technologies will provide the most value, whether it’s prioritizing studies, detecting abnormalities, or measuring structures. It’s also important to have a clear view of a system’s limitations and the data used to train these applications, reinforcing the need for diversity in data, he said.
Applications outside of clinical diagnostics, such as business intelligence technologies, can also help maximize workflow and revenue. Workflow and regulatory needs should also be considered. Dr Davis said that Yale evaluates AI by looking first at how it will be adopted by radiologists, then by assessing the value it can bring to radiologists and patients.
Ultimately, the right AI tools will expand the clinician’s armamentarium for diagnosing and treating patients.
“It’s incredibly important to really put the power in the hands of the physician using the tool to truly determine how it’s going to operate in their clinic,” Lyman said. “Ultimately, then it’s up to the clinician to decide which path they’re going to take.”
See the complete AR Connect | Expert Discussions digital article series, including Women in Radiology Leadership, AI | Transforming Radiology Practices, and AI | The Importance of Diversity in Data, at appliedradiology.com/ar-connect/ar-connect-articles.
McKenna Bryant is a freelance healthcare writer based in Nashotah, WI.