With unsustainably high costs and tremendous gaps in quality and patient safety, the health care system is rife with opportunities for improvement. For years, many have seen quality measurement as a means to drive needed change. Private and public payers, public health departments, and independent accreditation organizations have asked health care providers to report on quality measures, which have been publicly reported, tied to financial reimbursement, or both.
Throughout the Affordable Care Act (ACA), quality measures are tied to reimbursement in multiple programs. It is critical that the Department of Health and Human Services (HHS) move forward with a strategy for measure harmonization that accommodates both local and national needs to evaluate outcomes and value. Additionally, a standard for calculating measures, such as the use of a minimal data set for the universe of measures, should be considered.
The field of quality measurement is at a critical juncture. The ACA—which mentions “quality measures,” “performance measures,” or “measures of quality” 128 times—has heightened an already growing emphasis on quality measurement. With so much focus on quality, the resource burden on health care providers of collecting and reporting measures for multiple agencies and payers is significant.
Furthermore, the field itself is being transformed by the continued adoption of electronic health records (EHRs). Traditional measures are largely based on administrative or claims data. The increased use of EHRs creates the opportunity to develop sophisticated electronic clinical quality measures (eQMs) that leverage clinical data and, when linked with clinical decision support tools and payment policy, have the potential to improve quality and decrease costs more dramatically than traditional measures. Innovative electronic measures on the horizon include “delta measures,” which calculate changes in patient health over time, and care coordination measures for the electronic transfer of patient information (e.g., a hospital discharge summary or consultant note successfully transmitted to the primary care physician). Additionally, traditional abstraction methodologies for clinical data require labor-intensive chart review, which could be eliminated if data were extracted electronically.
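To make the delta-measure idea concrete, here is a minimal, purely illustrative sketch. The function name, field names, and readings are hypothetical; a real eQM specification would define the clinical value, look-back window, and attribution rules precisely.

```python
from datetime import date

def hba1c_delta(observations):
    """Return the change in a clinical value (here HbA1c) from the
    earliest to the latest reading -- a "delta" rather than a snapshot."""
    ordered = sorted(observations, key=lambda obs: obs["date"])
    return round(ordered[-1]["value"] - ordered[0]["value"], 1)

# Invented readings for one patient over six months.
readings = [
    {"date": date(2011, 3, 1), "value": 9.2},
    {"date": date(2011, 9, 1), "value": 7.4},
]
print(hba1c_delta(readings))  # -1.8, i.e., improvement over time
```

The point of the sketch is that a delta measure rewards change, which a single point-in-time threshold measure cannot capture.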
The Measure Applications Partnership
Recently, the National Quality Forum (NQF), the steward of the Measure Applications Partnership (MAP), released its final pre-rulemaking report to HHS on February 1. Section 3014 of the ACA mandated the creation of the “pre-rulemaking process,” which includes the annual public release on December 1 of quality measures under consideration for HHS programs, as well as sufficient time for multi-stakeholder comment on the proposed measures. HHS contracted with NQF to convene the multi-stakeholder group (named MAP by NQF) to provide analysis and strategic guidance on performance and public reporting measures, including on increasing alignment within HHS and between public and private payers. The pre-rulemaking report is one of several reports created by MAP to improve and streamline quality measurement and reporting; it provides HHS with analysis and feedback on over 350 measures under consideration for almost 20 of its programs.
MAP’s work begins to lay the foundation for advancing quality measurement and reporting. As eQMs are developed and gradually replace more traditional measures, measure alignment will become paramount. Calculating and reporting an eQM requires the ability to capture structured data, extract those data elements from multiple sources within the EHR, and then run a measure logic engine to apply the rules of the measure. This is a complicated process fraught with challenges and requiring appreciable investments by providers and EHR vendors. The life cycle of these measures (development, validation, testing, and programming of the measure reporting tool) does not allow for rapid adaptation to provider workflow or rapid incorporation of changing evidence.
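The capture-extract-apply pipeline above can be sketched in toy form. Everything here is invented for illustration (the table names, codes, and the diabetes-control rule); real eQM logic is specified in far richer formats, but the three stages are the same.

```python
# Invented stand-ins for two structured-data sources inside an EHR.
problem_list = [{"patient": 1, "code": "diabetes"}, {"patient": 2, "code": "asthma"}]
lab_results = [{"patient": 1, "test": "hba1c", "value": 6.8}]

def extract(patient_id):
    """Stage 2: assemble one patient's data elements from multiple sources."""
    return {
        "problems": {p["code"] for p in problem_list if p["patient"] == patient_id},
        "labs": {l["test"]: l["value"] for l in lab_results if l["patient"] == patient_id},
    }

# Stage 3: the "measure logic" -- denominator and numerator predicates.
def in_denominator(record):   # patients with diabetes
    return "diabetes" in record["problems"]

def in_numerator(record):     # ...whose HbA1c is under control
    return record["labs"].get("hba1c", 99.0) < 8.0

records = [extract(pid) for pid in (1, 2)]
denom = [r for r in records if in_denominator(r)]
numer = [r for r in denom if in_numerator(r)]
print(f"{len(numer)}/{len(denom)} patients meet the measure")  # 1/1
```

Even in this toy form, the fragility is visible: if an EHR stores the lab under a different name or the problem under a different code, the extraction stage silently fails, which is why standardization matters so much.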
Moreover, vocabulary standards for eQMs are still being developed. The Office of the National Coordinator for Health IT (ONC) is working with NQF on vocabulary standards for the electronic specifications of these eQMs. One goal is to create a standardized model (the Quality Data Model) that turns measure specifications into computable value sets, which can then be used for quality measurement.
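In spirit, a computable value set is just a named collection of codes that a patient's coded data can be tested against. The codes below are invented, ICD-10-style stand-ins, not an actual value set from the Quality Data Model.

```python
# Hypothetical value set for "diabetes" (illustrative codes only).
diabetes_value_set = {"E10", "E11", "E13"}

# Coded problems pulled from a patient's record.
patient_codes = {"E11", "I10"}

# Membership in the value set is a simple set intersection.
has_diabetes = bool(patient_codes & diabetes_value_set)
print(has_diabetes)  # True
```

The standardization work is precisely about agreeing on the contents of sets like this, so that the same test means the same thing in every EHR.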
The following are some suggestions to mitigate the barriers described above to the rapid and efficient use of quality measures to improve performance:
1) Harmonize, harmonize, harmonize. As mentioned, measure harmonization is critical to the future of quality measurement and reporting. In a health care system with increasingly limited resources, it is important to shift resources from quality measurement and reporting to quality improvement, which is the ultimate goal of measurement. The burden on providers and vendors is immense, and harmonization among private and public payers, public health departments, and independent accreditation agencies will enable providers to focus on increasing value in the highest-impact areas. Providers and hospitals will participate in a multitude of national, state, and local programs such as the Physician Quality Reporting System (PQRS), Value-Based Purchasing (VBP), the Inpatient Hospital Quality Reporting System, Patient-Centered Medical Homes, Accountable Care Organizations (ACOs), HEDIS, state initiatives, and commercial private payer initiatives. It seems logical to find a common set of measures based on common clinical priorities such as the Million Hearts initiative or the Partnership for Patients. Meaningful Use Stage 1 has used a core-plus-menu model, which allows for a combination of standardized and customized reporting.
2) Develop a standardized minimal data set that enables calculation of 80 percent of measures. Another challenge is the discrepancy among the data captured by various EHRs. The MAP’s pre-rulemaking report, and many others in the measurement community, often speak of the need for a “core set” of measures that could be broadly applied across multiple settings and providers. For an eQM to be part of this core, there must be a standardized minimum set of data elements that all EHRs capture. Much work is needed to identify this data set and then to integrate data capture seamlessly with workflow. With hundreds of certified EHR products in use, a standardized minimal data set would enable the calculation of 80 percent of measures and allow for rapid implementation and extraction of data within provider workflows. It would also allow state and local measure reporting requirements to be aligned with federal requirements.
3) Use data intermediaries to report quality measures. Even if strides are made toward measure alignment and data capture, a significant reporting burden will still likely fall on health care providers. Given the potential complexity of calculating eQMs, one solution would be to foster the creation of data intermediaries. These organizations could import data from disparate provider sources, calculate quality measures, and then feed the results back to the provider for quality improvement and to the relevant third parties. The Physician Quality Reporting System currently has a feedback loop to providers using claims data that averages 18 months; this lengthy delay impedes any real-time quality improvement. Data intermediaries could be built on state Health Information Exchanges, Regional Extension Centers, or Quality Improvement Organizations to serve ACOs and health homes, enabling real-time improvement while helping providers and hospitals report to federal and state agencies.
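The intermediary role described in suggestion 3) can be sketched as a simple pool-compute-feedback loop. Provider names, fields, and data below are all hypothetical; the point is only that feedback is computed as soon as submissions arrive, rather than after a multi-month claims lag.

```python
from collections import defaultdict

# Invented submissions pooled by an intermediary from several providers.
submissions = [
    {"provider": "clinic_a", "patient": 1, "met": True},
    {"provider": "clinic_a", "patient": 2, "met": False},
    {"provider": "clinic_b", "patient": 3, "met": True},
]

def feedback(rows):
    """Compute each provider's measure rate for immediate feedback."""
    tally = defaultdict(lambda: [0, 0])          # provider -> [met, total]
    for row in rows:
        tally[row["provider"]][0] += row["met"]
        tally[row["provider"]][1] += 1
    return {p: met / total for p, (met, total) in tally.items()}

print(feedback(submissions))  # {'clinic_a': 0.5, 'clinic_b': 1.0}
```

In practice the same computed results would also flow onward to the relevant payers and agencies, so providers report once and the intermediary fans the data out.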
The field of eQMs is still in its infancy, but we believe it is the future of quality measurement. ONC, CMS, and NQF deserve accolades for their extensive work creating the foundation for this future. Realizing the potential of eQMs will require addressing a number of challenges through creative and innovative solutions. Measure harmonization, the creation of a minimal data set, and the fostering of data intermediaries are steps toward that realization.
Editor’s note: This post represents solely the views of the authors, not those of the University of Pennsylvania or the Office of the Governor of Hawaii.