Cardiac imaging challenges radiology environment
SEATTLE—Cardiac imaging poses workflow and analytic challenges for the radiologist and can strain existing radiology infrastructure, according to a presentation last week by William W. Boonn, MD, at the Society for Imaging Informatics in Medicine (SIIM) conference.
Boonn, from the Hospital of the University of Pennsylvania in Philadelphia, noted that there are a number of challenges in imaging the heart. First, cardiac anatomy itself makes the heart a difficult organ to scan, because its orientation does not lie along the standard axial, sagittal and coronal planes, “which are the positions that we typically image the body in,” he said.
As a result, “when you image the heart, you need to find a way to orient your imaging device to the axis of the heart, or you need to obtain spatial resolution that’s high enough, so you can obtain isotropic voxels, so you can obtain 3D images.”
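To make the reformatting step concrete, here is a minimal sketch of extracting an oblique multiplanar reformat (MPR) from an isotropic volume. It is not drawn from the presentation: the NumPy/SciPy calls are standard, but the volume, plane center and axis vectors are hypothetical placeholders.

```python
# Minimal sketch: resampling an oblique plane (e.g., aligned to the cardiac
# short axis) from an isotropic CT volume. The NumPy/SciPy calls are standard;
# the volume, plane center and axis vectors below are hypothetical placeholders.
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_mpr(volume, center, u, v, size=256, spacing=1.0):
    """Extract a 2D multiplanar reformat from an isotropic volume.

    volume : 3D ndarray with identical voxel spacing along z, y and x
    center : (z, y, x) voxel coordinates of the plane center
    u, v   : orthogonal direction vectors (in voxel space) spanning the plane
    """
    u = np.asarray(u, float)
    v = np.asarray(v, float)
    u /= np.linalg.norm(u)
    v /= np.linalg.norm(v)

    # Grid of in-plane offsets, mapped into volume coordinates.
    offsets = (np.arange(size) - size / 2) * spacing
    a, b = np.meshgrid(offsets, offsets, indexing="ij")
    coords = (np.asarray(center, float)[:, None, None]
              + u[:, None, None] * a
              + v[:, None, None] * b)

    # Linear interpolation; in-plane distances are meaningful in every
    # direction only because the voxels are isotropic.
    return map_coordinates(volume, coords, order=1, mode="nearest")

# Hypothetical usage: a small isotropic volume and a made-up cardiac axis.
vol = np.random.rand(128, 128, 128).astype(np.float32)
plane = oblique_mpr(vol, center=(64, 64, 64),
                    u=(0.0, 0.8, 0.6), v=(1.0, 0.0, 0.0), size=128)
```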
Another anatomic problem that arises when imaging the heart is its size. Boonn provided an example: when examining coronary arteries, “proximally, if you measure one millimeter off, you’ve changed a patient’s stenosis by 20 percent. And, for a distal measurement, if you mess up by one millimeter, it could mean a 30 or 50 percent difference.”
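The arithmetic behind that sensitivity is simple percentage math. A quick sketch, using the standard diameter-stenosis formula and illustrative vessel sizes (roughly 5 mm proximally, 2-3 mm distally) rather than any figures from the talk:

```python
# Sketch of why a one-millimeter measurement error matters more distally.
# Uses the standard diameter-stenosis formula; the vessel sizes are illustrative.
def percent_stenosis(reference_mm, lumen_mm):
    """Diameter stenosis = (1 - minimal lumen / reference diameter) * 100."""
    return (1.0 - lumen_mm / reference_mm) * 100.0

# Proximal vessel of ~5 mm: a 1 mm change in the lumen measurement shifts the
# result by 1/5 = 20 percentage points.
print(percent_stenosis(5.0, 3.0))   # 40.0
print(percent_stenosis(5.0, 2.0))   # 60.0

# Distal vessel of ~2-3 mm: the same 1 mm error shifts it by roughly 33-50 points.
print(percent_stenosis(3.0, 2.0))   # ~33.3
print(percent_stenosis(3.0, 1.0))   # ~66.7
print(percent_stenosis(2.0, 1.0))   # 50.0
print(percent_stenosis(2.0, 0.0))   # 100.0
```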
The radiologist imaging the heart must contend not only with cardiac motion but also with respiratory motion, he said. To combat cardiac motion, radiologists use ECG synchronization. Boonn explained that there are two ways to synchronize acquisition to the ECG (a rough sketch of the timing follows the list):
- Prospective triggering, which designates a certain phase of the cardiac cycle in which to image the heart (on average, 65 percent of the R-R interval, an absolute delay of about 400 ms after the R wave). “The advantage with this method is the patient will receive a lower radiation dose, because the tube is only on for a brief period of time,” Boonn said, adding that prospective triggering is often used when obtaining a calcium score.
- Retrospective gating, in which the tube is left on throughout the imaging process. “However, when you post-process the image, you tell the device what particular data sets you would like to retrieve, and the CT scanner can fetch the data only from that part of the cardiac cycle,” Boonn said. While this method exposes the patient to more radiation, “you now have the flexibility of creating phases elsewhere in the cardiac cycle,” he said. Images can then be retrospectively reconstructed and visualized in one of several ways.
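To make the timing concrete, here is a rough sketch of both strategies, assuming R-wave timestamps are already available from the ECG trace. The 65 percent phase follows the number Boonn cited; the function names, window widths and sample data are purely illustrative.

```python
# Illustrative sketch of the two ECG-synchronization strategies described above.
# R-wave times are assumed to come from the ECG trace; the function names,
# window widths and sample data are made up for illustration.
import numpy as np

r_waves = np.array([0.0, 0.85, 1.72, 2.60])        # seconds; hypothetical R-wave times

def prospective_trigger(r_waves, phase=0.65, window_ms=100.0):
    """Tube-on windows: one short exposure per beat at a fixed phase of the R-R interval."""
    windows = []
    for r0, r1 in zip(r_waves[:-1], r_waves[1:]):
        start = r0 + phase * (r1 - r0)             # e.g. 65% of R-R (~400 ms after the R wave)
        windows.append((start, start + window_ms / 1000.0))
    return windows

def retrospective_phase(sample_times, r_waves, phase=0.65, half_width=0.05):
    """Tube stays on throughout; afterwards, keep only data near the requested phase."""
    keep = np.zeros_like(sample_times, dtype=bool)
    for r0, r1 in zip(r_waves[:-1], r_waves[1:]):
        frac = (sample_times - r0) / (r1 - r0)     # position of each sample within this beat
        keep |= (frac >= phase - half_width) & (frac < phase + half_width)
    return keep

print(prospective_trigger(r_waves))                # short exposure windows -> lower dose
t = np.linspace(0.0, 2.6, 261)                     # continuously acquired samples
print(t[retrospective_phase(t, r_waves)])          # data reconstructed at the 65% phase
```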
Among post-processing techniques, accuracy was found to be highest with multiplanar reformatted (MPR) images, followed by plain axial images. However, “3D volume-rendered images, which are seen in product brochures, are not good for effective diagnosis,” Boonn said.
Quantitative analysis brings informatics challenges of its own, beginning with the amount of space needed to store cardiac CTA images. Boonn offered a comparison of typical image counts and study sizes:
- Coronary CTA (CCTA): 1,200-8,000 images, averaging about 1.5 GB per study
- Chest x-ray: two images, about 20 MB
- Chest CT: 100-250 images, about 80 MB
- Echocardiography: roughly 700-1,500 images, about 350 MB
- SPECT: 50-100 images, about 8 MB
- Cardiac MRI: 1,000-4,000 images, about 200 MB
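Those per-study sizes follow roughly from image count times slice size. A back-of-the-envelope sketch, assuming 512 x 512, 16-bit CT slices and a made-up daily study volume, neither of which comes from the presentation:

```python
# Back-of-the-envelope storage arithmetic behind the modality comparison above.
# The 512 x 512 matrix, 16-bit depth and daily study count are assumptions,
# not figures from the presentation.
BYTES_PER_CT_SLICE = 512 * 512 * 2          # 16 bits per pixel

def study_size_gb(num_images, bytes_per_image=BYTES_PER_CT_SLICE):
    """Approximate uncompressed study size in gigabytes."""
    return num_images * bytes_per_image / 1024**3

# A mid-range CCTA of ~3,000 slices lands close to the ~1.5 GB average cited.
print(f"CCTA, 3,000 images: {study_size_gb(3000):.2f} GB")

# Annual archive growth for a hypothetical 10 CCTA studies per day.
print(f"10 studies/day for a year: {study_size_gb(3000) * 10 * 365 / 1024:.1f} TB")
```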
“Cardiac imaging studies continue to pose a number of challenges to existing radiology infrastructure: Storage, interpretation workflow and reporting…we need to continue to push vendors towards more interoperable solutions,” Boonn concluded.