Medicare’s seven-year public reporting initiative for hospitals, Hospital Compare, had no impact on reducing death rates for two key health conditions and just a modest effect on a third. That’s the conclusion of a just-released study that raises questions about the initiative’s ability to improve the quality of care provided by the nation’s hospitals.
The study, published in the March issue of Health Affairs, shows that Hospital Compare produced no reductions in mortality beyond existing trends for heart attacks and pneumonia. The authors found that hospitals may have improved on thirty-day mortality rates during the study period, but they attribute the change to ongoing innovations in clinical care rather than to any effect of public reporting. The researchers did find a modest improvement in mortality rates for heart failure, though they cannot prove that it was related to the public reporting initiative.
The findings help inform the ongoing debate about Hospital Compare, whose measures, critics say, do not necessarily reflect quality of care provided at hospitals. The study’s authors say this is one of the strongest studies to suggest that Medicare’s public reporting effort has made little or no impact on quality, at least using the current set of metrics.
“The jury’s still out on Medicare’s effort to improve hospital quality of care by posting death rates and other metrics on a public website,” says lead author Andrew M. Ryan, an assistant professor of public health at the Weill Cornell Medical College in New York City. “Additional studies must prove that public reporting does in fact push hospitals to raise the quality of care standard,” he said.
Hospital Compare was created to help Medicare patients rank or judge hospitals and other health care providers based on standards of care. The program allows consumers needing hospital care to go on a website and look for a hospital that meets or exceeds expectations when it comes to quality of care.
However, researchers have yet to show definitively that this kind of report card prompts hospitals to take steps that reduce death rates or improve other aspects of quality of care. It is also unclear whether consumers take advantage of the information to make better choices about hospital care, Ryan says.
To try to answer those questions, Ryan and his colleagues used Medicare claims data from 2000 to 2008 to estimate the effect of Hospital Compare on thirty-day mortality rates for heart attacks, heart failure, and pneumonia. The team also looked for evidence that consumers used the information on the website to choose hospitals with a high quality-of-care ranking.
Ryan says further study is needed to demonstrate that the improvement in mortality rates for heart failure was really related to Hospital Compare and not to some as-yet-unidentified factor that the study did not adequately rule out.
Past surveys have suggested that quality report cards like Hospital Compare are underused by patients and ignored by referring physicians. This study adds to that evidence: consumers did not appear to be consulting the Medicare website to make more informed choices about where to undergo an elective procedure.
“This study does have limitations,” Ryan says. “We looked at thirty-day mortality and not other outcome measures that might yet prove to be important in judging a hospital.”
The US Agency for Healthcare Research and Quality supported the publication of this and several other papers in the March issue on the subject of public reporting.