can inform decisions about whether and how programs should be modified to best promote quality improvement (QI). During the past decade, programs for reporting hospital quality performance have proliferated, from local programs to broader initiatives such as Leapfrog's patient-safety program.¹ In 2002 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) began requiring such reporting.² In 2003 the Centers for Medicare and Medicaid Services (CMS) launched its voluntary Hospital Quality Initiative (HQI). Hospitals' response was anemic at first, until the Medicare Prescription Drug, Improvement, and Modernization Act (MMA) established that nonparticipating hospitals would not receive a 0.4 percent annual payment update. Participation then increased dramatically, and nearly all eligible hospitals now report.³
Several studies document the benefits of providing hospitals feedback on their quality performance, particularly comparative information.⁴ An evaluation of the HQI pilot found that hospitals responded by placing higher priority on quality performance, improving data collection and documentation, and diverting resources from other priorities, but rarely by engaging in new QI activities.⁵ Other studies suggest that reporting results in intensification of QI activity, a greater focus on quality on the part of hospital management, and improved outcomes, especially when data on performance are publicly disclosed.⁶ Still other studies found inconsistent effects on outcomes or QI activity.⁷

Hoangmai Pham (mpham@hschange.org) is a senior health researcher at the Center for Studying Health System Change (HSC) in Washington, D.C. Jennifer Coughlan is a research assistant at Mathematica Policy Research, also in Washington. Ann O'Malley is a senior health researcher at HSC.
In this paper we build on earlier work by examining the interactions among different quality-reporting programs and their impact on operations at hospitals in diverse urban communities, from the perspectives of hospital executives and front-line staff, focusing on care for patients with congestive heart failure (CHF). We defined "quality reporting" broadly to include programs that monitor quality performance, with the potential to influence hospital management through public reporting, private benchmarking, or specific incentives.
We identified axes along which programs vary, besides their local versus national nature: (1) sponsorship: by purchaser (for example, Medicare), regulator (for example, state health departments), private insurer, professional group (for example, Society of Thoracic Surgeons), or other private organization (for example, hospital consortia); (2) program type: those requiring hospitals to submit primary data for public reporting (for example, JCAHO), those requiring primary data for private benchmarking (for example, hospital consortia), and those that rank performance based only on secondary data such as claims or patient surveys (for example, HealthGrades); (3) mandatory versus voluntary participation: a spectrum rather than distinct categories (most programs are voluntary), closely related to whether incentives are attached to participation; (4) incentives: whether incentives explicitly tied to participation are perceived as rewards (for example, pay-for-performance bonuses) or as punitive (for example, loss of accreditation); (5) quality improvement support: whether programs provide prescriptive information to guide hospitals' QI activities (for example, Society of Thoracic Surgeons); and (6) inclusion of clinical outcome measures: whether programs include outcomes or only structural or process-of-care measures (for example, the 100,000 Lives Campaign). We corroborated hospitals' participation in national reporting programs using data on hospital performance from JCAHO's Quality Check and the CMS's Hospital Compare Web sites.¹¹
Study Findings
Multiple programs. We found that hospitals respond to multiple reporting programs: All thirty-six hospitals report to the CMS and JCAHO, as confirmed by Hospital Compare and Quality Check. However, respondents at every hospital also reported participating in additional programs: a mean of 3.3 (range, 1-7), often starting years before CMS and JCAHO reporting. In total, respondents mentioned thirty-eight unique programs. Exhibit 1 shows the number of hospitals that participate in reporting, by program characteristics.
EXHIBIT 1
Hospital Participation In Quality-Reporting Programs, By Program Characteristics, 2004-05

Program characteristic                          Number of hospitalsᵃ
Sponsorship
  National public (CMS, JCAHO, Premier)                36
  National private (IHI, Leapfrog, NQF)                26
  Local public (state, QIO)                            19
  Local private (health plans, purchasers)             17
  Local/regional consortia (academic)                  11
  Professional societies (ACC, STS)                    12
  Other                                                 4
Program type
  Public reporting using primary data                  36
  Private benchmarking of primary data                 20
  Sole use of secondary data                            7

SOURCE: Community Tracking Study (CTS) Round Five site visit interviews, 2004-05.
NOTES: CMS is Centers for Medicare and Medicaid Services. JCAHO is Joint Commission on Accreditation of Healthcare Organizations. IHI is Institute for Healthcare Improvement. NQF is National Quality Forum. QIO is quality improvement organization. ACC is American College of Cardiology. STS is Society of Thoracic Surgeons. ACS is American College of Surgeons. ADHERE is Acute Decompensated Heart Failure National Registry. VHA is Voluntary Hospital Association.
ᵃ All hospitals participated in multiple programs. N = 36.

Institutional support and attitudes. Quality officers, CEOs, and hospital association leaders concurred that linkages to payment, JCAHO accreditation, and peer pressure from public benchmarking have made quality measurement and improvement higher priorities for hospital leadership. Some executives perceived the potential for purchasers and consumers to eventually make care choices based on performance data, but few believed that this was imminent. Heightened organizational attention to quality performance manifested in myriad ways, including (1) explicit inclusion of QI priorities on trustees' agendas and in hospitals' formal strategic planning; (2) boards and senior management accepting more regular, defined responsibilities for reviewing performance data and approving QI strategies; (3) restructuring of executive compensation to include quality performance-based incentives; (4) half of quality officers reporting that it was easier to lobby management for resources for quality measurement and QI; and (5) senior management actively exercising leadership with front-line staff. For example, at an Orange County (California) hospital, QI staff who met with resistance from physicians during a root-cause analysis of poor performance asked the CEO to come to the care unit and intervene in person, to immediate positive effect.¹²
Respondents similarly credited CMS and JCAHO programs with improving physicians' attitudes toward quality measurement and improvement. Aside from stronger mandates from leadership, quality officers could leverage the payment and accreditation consequences in these working relationships. Quality officers and CEOs believed that physicians liked having focused sets of clinical priorities on which to work. They believed that physicians responded to performance feedback, including individual profiling at some hospitals, and to peer pressure from public reporting of benchmarked performance data. Respondents found physicians more deferential to QI staff than they were before participating in quality reporting and more engaged in QI activities that some once dismissed.
Specific clinical conditions. Reporting programs have also affected the specific clinical conditions and quality measures on which hospitals focus. This influence was not always viewed positively, and it varied by both program type and whether the program included ongoing QI support.
Many hospital staff who were engaged in long-standing QI activities dismissed CMS and JCAHO requirements as having a minor impact on QI priorities. But
CEOs and quality officers at most hospitals contended that reporting programs,
particularly those of the CMS and JCAHO, focus on an artificially limited number
of objectives. The opportunity cost was a shift in attention and resources away
from other important clinical areas.
Response to incentives. Hospitals respond to incentives, but they value ongoing support for QI. Respondents perceived the impetus to participate in JCAHO and CMS programs to be of a "push" nature, because these programs are effectively mandatory and involve public disclosure. In response, hospitals directed resources specifically to CMS and JCAHO core conditions (CHF, community-acquired pneumonia, heart attacks, and stroke) but without taking standardized approaches to improving performance.
In contrast, respondents described the focus of some other programs as a "pull," because of the QI support they offered in the form of prescribed changes in care processes. Several quality officers cited as a reason for focusing on hospital-acquired pneumonia the Institute for Healthcare Improvement's (IHI's) "ventilator bundle," which recommends a set of evidence-based practices, such as daily withdrawal of sedatives, to prevent ventilator-associated pneumonia. Respondents found these programs attractive both because they don't leave hospitals flailing about trying to identify evidence-based interventions on their own and because they encourage a culture of continuous QI. Respondents most often mentioned programs sponsored by IHI, state quality improvement organizations (QIOs), and professional organizations (Society of Thoracic Surgeons, American College of Cardiology).
Adequacy of resources. Resources for quality measurement and improvement have increased, but they remain inadequate. Nearly all respondents, including
representatives of reporting programs, believed that reporting increases hospitals'
costs, for both compliance and processes to improve performance. Half of quality officers reported staff increases of up to twelve full-time equivalents (FTEs) devoted to reporting and QI in the previous year. In markets such as northern New Jersey,
where hospitals are financially less healthy, respondents considered reporting to be
a particular cost burden.
At many hospitals, management diverted staff from other tasks, such as financial reporting. Other hospitals simply gave existing staff more responsibilities.
Half of clinical directors believed that reporting had resulted in a major increase in
their workload; only a minority reported no change.
Some respondents found it difficult even to assess the net cost burden of reporting because the associated costs are spread over a variety of hospital cost centers,
reporting requirements change over time, and the impact of improved outcomes
on finances is difficult to measure.
Hospitals have committed more resources to reporting and QI activities, largely in the form of additional staff, new or upgraded software, and contracts with vendors to manage their data, but rarely in new hardware. Many hospitals
strategic goals, newly on par in importance with financial performance. Most hospitals
share their overall performance scores with their entire staffs.
At eight hospitals, reporting led to implementation of or planning for individual
physician profiling, sometimes using benchmarks based on specialty peer groups,
or written alerts identifying instances of missed care. But in nearly all cases, hospitals showed sensitivity to physicians' concerns by adjusting profiles
for case-mix severity and by reassuring physicians about data confidentiality. Respondents cited the HQI as the primary motivator for physician profiling; some
hospitals plan to profile care only for CMS conditions.
Hospitals restructured staff into multidisciplinary teams that were assigned to
track and improve overall performance and performance for individual conditions,
particularly those targeted by the CMS. These teams are more integrated across
different service areas than in the past. And while a few hospitals make a single
manager responsible for improvement in each clinical area, most are content to assign QI responsibility to teams, letting them select the best approaches.
At least six hospitals formally tied executive or physician compensation to
quality performance, again largely driven by the HQI. At hospitals in Miami, Orange County, and Phoenix, executives have 30-70 percent of their bonuses based
on CMS scores. At an Indianapolis hospital, sixty top managers are subject to bonus withholds for poor performance. Even hospitals that instituted such incentives before CMS reporting began modified their pay structures afterward to
incorporate CMS conditions.
Adoption and modification of specific QI interventions. Hospital respondents were divided on whether reporting as a whole had much effect on the quantity or type of specific process changes their hospitals made to improve care. Those who felt that reporting had little impact on QI interventions pointed out that their hospitals had been active in QI before participating in newer programs, or that their choices of interventions were largely internally driven, or both.
According to clinical directors, reporting had a moderate impact on hospitals'
use of specific QI interventions to improve care for CHF patients. Hospitals often
adopted wholesale the recommendations of programs offering QI support. On average, specific reporting programs influenced the adoption or modification of half
of the QI tools we examined (Exhibit 2).
Two cases typify how reporting influenced use of QI tools. At an Orange County hospital, staff reevaluated an existing CHF program because of reporting, creating new standing-order sets and critical pathways. But they also modified an existing Web portal system to integrate data on patients from different hospital units and to improve coordination. Another hospital, in Syracuse, was already using critical pathways for CHF but did not collect or review data on staff compliance with pathways until the hospital began quality reporting. After switching to electronic data collection for reporting purposes, staff could analyze pathway compliance and direct interventions.
EXHIBIT 2
Influence Of Quality-Reporting Programs On Hospitals' Use Of Quality Improvement (QI) Tools For Congestive Heart Failure (CHF)

[Table columns: QI tool; number of hospitals that used/planned the toolᵃ (n = 26); number of hospitals where use was influenced by programs (n = 26). Recoverable QI tools: dissemination of practice guidelines, nonelectronic chart reminders, standing orders, critical pathways, and postdischarge follow-up.ᶜ]

SOURCE: Community Tracking Study (CTS) Round Five site visit interviews, 2004-05.
NOTES: Programs are listed in descending order of the frequency with which respondents mentioned them. Brackets group programs that were mentioned approximately the same number of times. Respondents could cite more than one program as influencing use of a quality improvement tool. JCAHO is Joint Commission on Accreditation of Healthcare Organizations. CMS is Centers for Medicare and Medicaid Services. ACC is American College of Cardiology. STS is Society of Thoracic Surgeons. IHI is Institute for Healthcare Improvement. ADHERE is Acute Decompensated Heart Failure National Registry. QIO is quality improvement organization. VHA is Voluntary Hospital Association.
ᵃ Interviews with clinical directors for CHF quality improvement were conducted at twenty-six of the thirty-six hospitals in the CTS site visits. Of the remaining ten hospitals, two did not have a specific QI program for CHF, and interviews could not be scheduled with eight.
ᵇ We distinguish between the CMS Hospital Quality Initiative (HQI) and certain programs supported by state QI organizations because although QIOs act as agents for the CMS, state-level initiatives are varied and distinct and may include QI support, unlike the HQI.
ᶜ Refers to postdischarge follow-up arranged with designated hospital staff.
The authors are grateful to Thomas Bodenheimer, Paul Ginsburg, and Cara Lesser for comments on earlier drafts. Community Tracking Study site visits are supported by a grant from the Robert Wood Johnson Foundation to the Center for Studying Health System Change.
NOTES
1. J.K. Barr et al., "Public Reporting of Hospital Patient Satisfaction: The Rhode Island Experience," Health Care Financing Review 23, no. 4 (2002): 51-70; and D.B. Mukamel and A.I. Mushlin, "Quality of Care Information Makes a Difference: An Analysis of Market Share and Price Changes after Publication of the New York State Cardiac Surgery Mortality Reports," Medical Care 36, no. 7 (1998): 945-954. For more information on the Leapfrog initiative, see its home page, http://www.leapfroggroup.org.
2. Joint Commission on Accreditation of Healthcare Organizations, "Ongoing Activities: 2000 to 2004 Standardization of Metrics," http://www.jointcommission.org/NR/rdonlyres/551576B9-4E5C-4C0D-ACA5-6FC5D788A5D4/0/OngoingActivities.pdf (accessed 20 June 2006); and Centers for Medicare and Medicaid Services, "Hospital Quality Initiative Overview," December 2005, http://www.cms.hhs.gov/HospitalQualityInits/downloads/HospitalOverview200512.pdf (accessed 20 June 2006).
3. Hospital Quality Alliance, "Participation," 24 May 2006, http://www.aha.org/aha/key_issues/qualityalliance/participation/participation.html (accessed 20 June 2006).
4. E.H. Bradley et al., "Data Feedback Efforts in Quality Improvement: Lessons Learned from U.S. Hospitals," Quality and Safety in Health Care 13, no. 1 (2004): 26-31; and R. Gibberd et al., "Using Indicators to Quantify the Potential to Improve the Quality of Health Care," International Journal for Quality in Health Care 16, no. 1 Suppl. (2004): i37-i43.
5. S. Felt-Lisk, M. Lee, and M. Maxfield, Hospitals' Early Experience with the National Voluntary Hospital Reporting Initiative (Hanover, Md.: Delmarva Foundation for Medical Care Inc., 2005).
6. J.H. Hibbard, J. Stockard, and M. Tusler, "Does Publicizing Hospital Performance Stimulate Quality Improvement Efforts?" Health Affairs 22, no. 2 (2003): 84-94; E.F. Ellerbeck et al., "Impact of Quality Improvement Activities on Care for Acute Myocardial Infarction," International Journal for Quality in Health Care 12, no. 4 (2000): 305-310; M.N. Marshall et al., "The Public Release of Performance Data: What Do We Expect to Gain? A Review of the Evidence," Journal of the American Medical Association 283, no. 14 (2000): 1866-1874; J.H. Hibbard, J. Stockard, and M. Tusler, "Hospital Performance Reports: Impact on Quality, Market Share, and Reputation," Health Affairs 24, no. 4 (2005): 1150-1160; and J. Chen et al., "JCAHO Accreditation and Quality of Care for Acute Myocardial Infarction," Health Affairs 22, no. 2 (2003): 243-254.
7. A. Mehrotra, T. Bodenheimer, and R.A. Dudley, "Employers' Efforts to Measure and Improve Hospital Quality: Determinants of Success," Health Affairs 22, no. 2 (2003): 60-71; and C.W. Pai, G.K. Finnegan, and M.J. Satwicz, "The Combined Effect of Public Profiling and Quality Improvement Efforts on Heart Failure Management," Joint Commission Journal on Quality Improvement 28, no. 11 (2002): 614-624.
8. C.S. Lesser et al., "The End of an Era: What Became of the 'Managed Care Revolution' in 2001?" Health Services Research 38, no. 1, Part 2 (2003): 337-355.
9. Advisory Council to Improve Outcomes Nationwide in Heart Failure, "Consensus Recommendations for the Management of Congestive Heart Failure," American Journal of Cardiology 83, no. 2A (1999): 1A-38A.
10. Bradley et al., "Data Feedback Efforts"; Ellerbeck et al., "Impact of Quality Improvement Activities"; T.A. Merritt, M. Gold, and J. Holland, "A Critical Evaluation of Clinical Practice Guidelines in Neonatal Medicine: Does Their Use Improve Quality and Lower Costs?" Journal of Evaluation in Clinical Practice 5, no. 2 (1999): 169-177; and D.M. Yealy et al., "The Emergency Department Community-Acquired Pneumonia Trial: Methodology of a Quality Improvement Intervention," Annals of Emergency Medicine 43, no. 6 (2004): 770-782.
11. For information on Quality Check, see its home page, http://www.qualitycheck.org; for information on Hospital Compare, see its home page, http://www.hospitalcompare.hhs.gov.
12. "Sentinel Events: Approaches to Error Reduction and Prevention," Joint Commission Journal on Quality Improvement 24, no. 4 (1998): 175-186.
13. S.C. Williams et al., "Quality of Care in U.S. Hospitals as Reflected by Standardized Measures, 2002-2004," New England Journal of Medicine 353, no. 3 (2005): 255-264.
14. Ibid.; and A. Jha et al., "Care in U.S. Hospitals: The Hospital Quality Alliance Program," New England Journal of Medicine 353, no. 3 (2005): 265-274.