Accuracy and public reporting
Is public reporting all that it is cracked up to be? Or does it sometimes lead to a distorted view of the facts?
In this newsletter, we include two studies related to public reporting of hospital data. Karen Joynt, MD, MPH, of the Harvard School of Public Health in Cambridge, Mass., looked at whether states that require public reporting of PCI outcomes had lower rates of PCI use than states that had no such requirements. In a separate paper, Carl van Walraven, MD, of the Ottawa Hospital Research Institute in Canada, took aim at methods used to calculate the readmission rates that inform hospital rankings. In both scenarios, hospital performance is aired in public.
Joynt et al compared PCI and mortality rates for patients admitted with acute MI in three states that require public reporting of outcomes and seven states that do not. They found that Medicare patients in states requiring PCI reporting were less likely to receive PCI. But the authors raised concerns about the flip side of public reporting: that it might give physicians a disincentive to treat high-risk patients.
"As opposed to overuse of PCI in the stable outpatient population, our concern is underuse in patients who could benefit most from this life-saving procedure," Joynt said in an interview. One purpose behind public reporting is to foster transparency, she added, but policies need to be refined to ensure they are fair and the reporting is accurate.
Van Walraven et al noted that 30-day readmissions and mortality are considered indicators of quality of care and are used to determine hospital performance rankings that often become publicly available. But they wrote that there is no consensus on which method best predicts readmission and mortality outcomes. Consequently, they created four different measures of unplanned hospital readmission and death within 30 days of discharge to test how changes in calculation methods affected hospital rankings.
They found a great deal of variability and showed that a slight change in method could swing a hospital’s ranking one way or the other. They, too, called for more transparency, in this case asking that a model’s accuracy be made clear “to inform readers of its effectiveness for leveling the playing field among hospitals.”
Public reporting, in principle, is beneficial, but its value lies in the integrity of the information reported. How accurately do publicly available reports reflect your hospital or practice’s performance? What needs improvement, and what works as is? Let us know.
Candace Stuart
Cardiovascular Business, editor
cstuart@trimedmedia.com