How to integrate AI into the cardiac imaging pipeline
Artificial intelligence may be perceived as a threat by some physicians, but, according to research presented at the Radiological Society of North America’s annual meeting in Chicago, it could have real uses for cardiologists.
Albert Hsiao, MD, PhD, a radiologist and professor at the University of California, San Diego, said that when the “new wave” of artificial intelligence hit the medical field a few years ago, it came with an onslaught of articles about how robots were going to take over his job. The idea of technology displacing radiologists has been a major point of contention in the field—and it remains an active debate—but, like many, Hsiao is embracing AI as a means to enhance his work, not replace it.
“You’ll see AI applications across the whole spectrum, from imaging acquisition on CT and MR to image reconstruction, applications in the data center...and analysis of any kind of image,” he said, noting AI is also being leveraged to simplify interactive image analysis and radiology reporting.
Hsiao said he sees the greatest potential for AI in the places where researchers are spending a lot of time, but not a lot of money. He cited cardiac MRI as an example, since hospitals spend up to two years training technicians who will likely leave to do something else in another couple of years. Trainees also need continuous supervision and feedback, he said—supervision that AI wouldn’t need.
Hsiao said cardiac MRI itself, a complex, multiplanar exam, is also “ripe for innovation.” Double-oblique imaging planes, myocardial inversion time and phase-contrast flow imaging are all areas he said could be optimized with new technology. At his own UCSD lab, he and his colleagues have developed a strategy for automated MRI acquisition that uses deep learning to serially prescribe imaging planes.
The team’s model reportedly prescribes planes progressing from axial images to long-axis views, recognizing anatomical landmarks at each step to prescribe the next set of images. Hsiao said that by “stitching all that together, you can get a full exam of all the imaging planes.”
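In rough pseudocode, that kind of serial prescription might look like the sketch below, where `localize_landmarks` and the plane geometry are hypothetical stand-ins rather than the UCSD group’s actual models:

```python
# A minimal sketch of serial plane prescription under stated assumptions:
# localize_landmarks stands in for a trained localization network and the
# geometry is simplified. This is not the UCSD group's actual pipeline.
import numpy as np

def localize_landmarks(image):
    """Stand-in for a deep network that returns 3D landmark coordinates."""
    rng = np.random.default_rng(0)  # fake, deterministic "predictions"
    return {"apex": rng.uniform(0, 1, 3), "mitral_valve": rng.uniform(0, 1, 3)}

def plane_from_landmarks(p1, p2):
    """Derive the next imaging plane from two landmarks: centered between
    them, with its normal perpendicular to the landmark axis."""
    axis = p2 - p1
    normal = np.cross(axis, np.array([0.0, 0.0, 1.0]))
    return {"origin": (p1 + p2) / 2, "normal": normal / np.linalg.norm(normal)}

def prescribe_exam(axial_stack):
    """Chain localization and geometry step by step, from axial to long axis."""
    planes = []
    current = axial_stack
    for view in ("vertical_long_axis", "horizontal_long_axis"):
        landmarks = localize_landmarks(current)
        plane = plane_from_landmarks(landmarks["apex"], landmarks["mitral_valve"])
        planes.append((view, plane))
        # current = acquire(plane)  # on a scanner, the new acquisition feeds the next step
    return planes

print(prescribe_exam(np.zeros((12, 256, 256))))
```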
The researchers tested the analytical validity of their AI by measuring localization accuracy. They compared the location inferred by their deep neural network with the ground-truth location where a radiologist would place a landmark, such as the apex, and calculated the distance between the two across a series of images. To train and test the model, they took around 500 cases, using 80% to train the AI and holding out the remaining 20% for testing.
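A stripped-down version of that analytical check, with illustrative arrays standing in for the study’s real predictions and radiologist annotations, might look like this:

```python
# Sketch of the analytical-validity check described above: Euclidean distance
# between predicted and ground-truth landmark locations, with an 80/20 split.
# Array names, shapes and values are illustrative assumptions, not study data.
import numpy as np

rng = np.random.default_rng(42)
n_cases = 500                                        # roughly the study's size
truth = rng.uniform(0, 256, size=(n_cases, 3))       # radiologist-placed apex coordinates
pred = truth + rng.normal(0, 4, size=(n_cases, 3))   # stand-in model predictions

# 80/20 train/test partition of case indices
idx = rng.permutation(n_cases)
split = int(0.8 * n_cases)
train_idx, test_idx = idx[:split], idx[split:]

# Localization error on the held-out 20%
errors = np.linalg.norm(pred[test_idx] - truth[test_idx], axis=1)
print(f"mean localization error: {errors.mean():.2f} (median {np.median(errors):.2f})")
```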
“You can arbitrarily select a different 80% and a different 20% for multiple folds of cross-validation,” Hsiao said. “If you do that, you can prove a particular architecture of neural network is sufficient to perform a particular task. That it wasn’t dependent on which 80% you chose for your training data.”
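That re-drawing of the 80/20 split is standard k-fold cross-validation; a minimal sketch, with placeholder `train_model` and `localization_error` functions standing in for the real training and evaluation code, could look like this:

```python
# Sketch of the cross-validation Hsiao describes: re-drawing the 80/20 split
# across five folds to check that performance does not hinge on one split.
# train_model and localization_error are hypothetical stand-ins.
import numpy as np
from sklearn.model_selection import KFold

def train_model(train_cases):            # placeholder for fitting the network
    return {"n_train": len(train_cases)}

def localization_error(model, test_cases):  # placeholder for the distance metric
    return np.random.default_rng(int(test_cases[0])).uniform(2.0, 6.0)

cases = np.arange(500)                   # case identifiers
fold_errors = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(cases):
    model = train_model(cases[train_idx])                            # a different 80% each fold
    fold_errors.append(localization_error(model, cases[test_idx]))   # held-out 20%

print(f"error per fold: {np.round(fold_errors, 2)}, mean {np.mean(fold_errors):.2f}")
```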
He said his team employed the same approach for a more complex problem: taking a stack of short-axis images and marking the valves to plan two-, three- and four-chamber views. To do that, they created a series of cascaded neural networks, each responsible for a different sub-problem. Those models could be cross-validated, too.
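A cascaded pipeline of that kind can be sketched as a chain of stages, each solving one sub-problem and handing its output to the next; the stage functions below are hypothetical placeholders, not the published networks:

```python
# A minimal sketch of a cascaded pipeline in the spirit described above: each
# stage handles one sub-problem, and its output feeds the next stage.
import numpy as np

def find_valve_landmarks(short_axis_stack):
    """Stage 1 stand-in: locate mitral/aortic/tricuspid valve points."""
    rng = np.random.default_rng(1)
    return {k: rng.uniform(0, 1, 3) for k in ("mitral", "aortic", "tricuspid")}

def plan_chamber_views(landmarks):
    """Stage 2 stand-in: turn valve landmarks into 2-/3-/4-chamber plane parameters."""
    return {view: {"through": list(landmarks)} for view in ("2ch", "3ch", "4ch")}

def run_cascade(short_axis_stack, stages):
    out = short_axis_stack
    for stage in stages:              # each network/stage solves one piece
        out = stage(out)
    return out

plans = run_cascade(np.zeros((10, 192, 192)), [find_valve_landmarks, plan_chamber_views])
print(plans)
```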
Hsiao said physicians can expect “a little bit of generalization” from neural networks, meaning they’ll work reasonably well on data they’ve never seen, but they won’t produce perfect results the first time around. If a model was trained on 3T MRI data, for example, and someone inputs 1.5T MRI data, it might not analyze that information as reliably. If some 1.5T data were added to the model’s training set, though, that could change.
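The remedy Hsiao alludes to is simply broadening the training set; in schematic form, with made-up case lists and a hypothetical `train_model` call:

```python
# Sketch of the generalization point: a training set drawn only from 3T scans
# versus one that mixes in some 1.5T cases before retraining. The case counts
# and the train_model call are illustrative assumptions.
import random

cases_3t = [f"3T_{i}" for i in range(400)]     # hypothetical 3T training cases
cases_15t = [f"1.5T_{i}" for i in range(100)]  # hypothetical 1.5T cases, unseen so far

train_3t_only = cases_3t                       # original, single-field-strength set
train_mixed = cases_3t + cases_15t             # expose the model to 1.5T contrast/noise
random.Random(0).shuffle(train_mixed)          # shuffle so batches mix field strengths

# model_a = train_model(train_3t_only)  # may falter on 1.5T inputs
# model_b = train_model(train_mixed)    # more likely to handle 1.5T exams
print(len(train_3t_only), len(train_mixed))
```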
According to Hsiao, all of this knowledge means little without clinical validation. He said he and his colleagues are working to integrate algorithms into the clinical environment such that a radiologist could hit a button and AI could auto-prescribe a set of images. Even better, he said, would be the ability to open up a series and have it auto-prescribe itself.
“That’s where we’re moving next, so you don’t have to hit any buttons at all,” he said.