Author information
- Received June 16, 2015
- Revision received November 4, 2015
- Accepted December 1, 2015
- Published online March 1, 2016.
- Adam D. DeVore, MDa,b,∗
- Bradley G. Hammill, DrPHa,
- N. Chantelle Hardy, MPHa,
- Zubin J. Eapen, MD, MHSa,b,
- Eric D. Peterson, MD, MPHa,b and
- Adrian F. Hernandez, MD, MHSa,b
- aDuke Clinical Research Institute, Durham, North Carolina
- bDepartment of Medicine, Duke University School of Medicine, Durham, North Carolina
- ∗Reprint requests and correspondence: Dr. Adam D. DeVore, Duke Clinical Research Institute, P.O. Box 17969, Durham, North Carolina 27715.
Background In 2009, the Centers for Medicare & Medicaid Services (CMS) began publicly reporting 30-day hospital readmission rates for patients discharged with acute myocardial infarction (MI), heart failure (HF), or pneumonia.
Objectives This study assessed trends of 30-day readmission rates and post-discharge care since the implementation of CMS public reporting.
Methods We analyzed Medicare claims data from 2006 to 2012 for patients discharged after a hospitalization for MI, HF, or pneumonia. For each diagnosis, we estimated trends in 30-day all-cause readmissions and post-discharge care (emergency department visits and observation stays) by using hospitalization-level regression models. We modeled adjusted trends before and after the implementation of public reporting. To assess for a change in trend, we tested the difference between the slope before implementation and the slope after implementation.
Results We analyzed 37,829 hospitalizations for MI, 100,189 for HF, and 79,076 for pneumonia from >4,100 hospitals. When considering only recent trends (i.e., since 2009), we found improvements in adjusted readmission rates for MI (−2.3%), HF (−1.8%), and pneumonia (−2.0%), but when comparing the trend before public reporting with the trend after reporting, there was no difference for MI (p = 0.72), HF (p = 0.19), or pneumonia (p = 0.21). There were no changes in trends for 30-day post-discharge care for MI or pneumonia; however, the trend decreased for HF emergency department visits from 2.3% to −0.8% (p = 0.007) and for observation stays from 15.1% to 4.1% (p = 0.04).
Conclusions The release of the CMS public reporting of hospital readmission rates was not associated with any measurable change in 30-day readmission trends for MI, HF, or pneumonia, but it was associated with less hospital-based acute care for HF.
Reducing hospital readmissions has become a national priority for patients, providers, and policy makers. Nearly 1 in 5 Medicare beneficiaries discharged from a hospital is readmitted within 30 days, and the estimated cost of these unplanned readmissions exceeds $17 billion annually (1). In response to this high number of readmissions, the Centers for Medicare & Medicaid Services (CMS) adopted a number of policy changes aimed at improving patients’ outcomes. One significant change occurred in June 2009 (2), when the CMS began publicly reporting risk-standardized hospital readmission rates for patients discharged with acute myocardial infarction (MI), heart failure (HF), or pneumonia on the Hospital Compare website (3). The CMS opted for public reporting to increase transparency for consumers and to give hospitals an additional incentive to improve transitional care and reduce hospital readmissions.
Although increased transparency was anticipated to improve outcomes, limited data are available to assess the impact of this policy objectively (4–6). Our study was designed to assess for temporal changes in 30-day readmission rates and to evaluate whether these outcomes improved after the implementation of public reporting. We also evaluated for improvements in post-discharge care by assessing changes in outpatient visits, emergency department (ED) visits, and observation stays without readmission. Finally, we evaluated the potentially unintended impacts of this policy by assessing for temporal changes in post-discharge mortality rates.
We used research-identifiable administrative claims data from a 5% nationally representative sample of Medicare beneficiaries from 2005 to 2012. These data include inpatient claims, outpatient claims, carrier claims, and the associated denominator files. Inpatient files contain institutional claims submitted for facility costs associated with inpatient stays. Outpatient files contain similar institutional claims for outpatient services. Carrier files contain noninstitutional provider and other professional claims for services across all settings. Denominator files include beneficiary identifiers, date of birth, sex, race or ethnicity, date of death (if present), and information about program eligibility and enrollment.
We analyzed Medicare-enrolled patients ≥65 years of age who were discharged home from a hospitalization for acute MI, HF, pneumonia, chronic obstructive pulmonary disease (COPD), or diabetes between July 1, 2006 and June 30, 2012. These dates were chosen to permit analysis 3 years before and 3 years after the public reporting of risk-standardized hospital readmission rates began in June 2009 (2). Because the Medicare public reporting program did not include COPD or diabetes, these hospitalizations served as comparator conditions. COPD and diabetes were selected as comparators because neither condition was directly targeted by public reporting during this time period and both are common causes of 30-day all-cause readmissions among Medicare beneficiaries (1). Patients could be represented multiple times in the analysis if subsequent hospitalizations met the inclusion and exclusion criteria and occurred >30 days after another hospitalization of the same type.
Hospitalizations of interest were identified on the basis of their primary discharge diagnoses: acute MI (International Classification of Disease-9th Revision-Clinical Modification [ICD-9-CM] codes 410.x0, 410.x1), HF (402.x1, 404.x1, 404.x3, 428.x), pneumonia (480.x, 481, 482.x, 483.x, 485, 486, 487.0, 488.11), COPD (491.21, 491.22, 491.8, 491.9, 492.8, 493.2x, 496, or a primary ICD-9-CM code of 518.81, 518.82, 518.84, 799.1 combined with a secondary ICD-9-CM code of acute exacerbation of COPD: 491.21, 491.22, 493.21, or 493.22), and diabetes (250.x). We excluded planned hospitalizations for all study groups and hospitalizations for acute MI that resulted in same-day discharges. We also excluded hospitalizations for patients who did not have fee-for-service Medicare coverage for ≥12 months before admission, to allow characterization of medical history, and for the 30 days after discharge, to allow complete ascertainment of outcomes.
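The wildcard notation in the code lists above (where "x" stands for any single digit, e.g., 410.x1) can be applied mechanically. As a minimal sketch, assuming claims supply a primary diagnosis code as a plain string, the cohort assignment might look like the following; the MI and HF code lists come from the text, but the function names and claim format are illustrative assumptions:

```python
import re

# Wildcard ICD-9-CM patterns from the text ("x" = any single digit).
# Only MI and HF are shown here for brevity.
CODE_PATTERNS = {
    "MI": ["410.x0", "410.x1"],
    "HF": ["402.x1", "404.x1", "404.x3", "428.x"],
}

def pattern_to_regex(pattern):
    """Convert a wildcard pattern such as '410.x1' into a compiled regex."""
    escaped = re.escape(pattern).replace("x", r"\d")
    # Allow trailing digits so '428.x' also matches subcodes like '428.22'.
    return re.compile(r"^" + escaped + r"\d*$")

def classify(primary_dx):
    """Return the study condition for a primary diagnosis code, or None."""
    for condition, patterns in CODE_PATTERNS.items():
        if any(pattern_to_regex(p).match(primary_dx) for p in patterns):
            return condition
    return None
```

For example, classify("410.71") falls in the MI group, classify("428.22") in the HF group, and a code outside the lists returns None.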
The primary outcome of interest consisted of unplanned all-cause 30-day hospital readmissions. These readmissions were identified using inpatient claims data. Planned readmissions were defined similarly to previous analyses (5,6). Other assessments of interest included all-cause mortality, ED visits, outpatient evaluation and management visits, and observation stays without readmission in the 30 days after discharge. Mortality was determined from the denominator file. ED visits were identified from inpatient and outpatient claims having any of the following revenue center codes: 0450 (emergency room – general classification), 0451 (emergency room – Emergency Medical Treatment & Labor Act emergency medical screening services), 0452 (emergency room – emergency room beyond Emergency Medical Treatment & Labor Act screening), 0456 (emergency room – urgent care), 0459 (emergency room – other), or 0981 (professional fees – emergency room). Outpatient evaluation and management visits were identified from carrier claims classified into any of the following Berenson-Eggers Type of Service (BETOS) codes: M1A office visits – new, M1B office visits – established, M4A home visit, M4B nursing home visit, M5A specialist – pathology, M5B specialist – psychiatry, M5C specialist – ophthalmology, M5D specialist – other, and M6 consultations (excluded codes from this category include M2x hospital and M3 ED codes). Observation stays were identified from outpatient claims having a revenue center code of 0762 (observation room). Only observation stays that did not lead to an inpatient stay were counted.
For each population (acute MI, HF, pneumonia, COPD, diabetes), we first summarized the annual observed outcomes and post-discharge care. From these data, we used regression models to estimate the unadjusted and adjusted trends for all-cause 30-day readmission. Because care may not have changed immediately after the implementation of public reporting of risk-standardized readmission rates, the slope of the trend line was allowed to vary in the model before and after implementation of public reporting. The trends are reported as relative changes per year. For example, if the pre-intervention readmission rate was 22.0%, then a −2.0% estimated change indicates a 1-year improvement of 0.44% (21.6%) in absolute terms. The pre-implementation and post-implementation trend estimates were compared using a Wald test. These models were specified with log link and Poisson errors, which allowed us to estimate the relative change in each outcome over time. We used generalized estimating equation methods to allow for the calculation of robust standard errors, which accounted for clustering of records within hospitals.
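The piecewise-slope parameterization and the relative-change interpretation described above can be sketched as follows. This is a minimal illustration, not the authors' code: the variable and function names are assumptions, and the worked numbers reproduce only the example given in the text.

```python
import math

# Piecewise time terms for a model whose slope may change at the policy date.
def design_terms(t_years, t_policy):
    """Return (pre, post): the pre-period slope term applies to all of
    t_years, while the second term contributes only after public reporting
    began, letting the fitted slope differ before and after t_policy."""
    return t_years, max(0.0, t_years - t_policy)

# With a log link, a slope coefficient b corresponds to a relative change
# per year of exp(b) - 1.
def relative_change_per_year(b):
    return math.exp(b) - 1.0

# Worked example from the text: a -2.0% relative trend applied to a 22.0%
# baseline readmission rate gives 22.0% * 0.98 = 21.56% (~21.6%) after one
# year, an absolute improvement of about 0.44 percentage points.
baseline_rate = 0.220
rate_after_one_year = baseline_rate * (1 - 0.020)  # 0.2156
```

In practice such a model would be fit with a Poisson generalized estimating equation (clustering on hospital), which estimates the two slope coefficients jointly; the sketch shows only how the time terms are constructed and interpreted.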
We plotted the risk-standardized quarterly readmission rates against the adjusted estimated trends for each population. In these plots, we show a predicted trend line, which is an extension of the pre-reporting trend to the end of the study period. We adjusted for the patients’ demographics (age, sex, race) and comorbid conditions (cancer, COPD, cerebrovascular disease, dementia, diabetes, HF, hypertension, ischemic heart disease, liver disease, peptic ulcer disease, peripheral vascular disease, renal disease, rheumatic disease, valvular heart disease) by using validated algorithms (7,8). We plotted similar data for post-discharge mortality for each study population. We used the outpatient and carrier files to assess patients’ comorbid conditions in the adjustment models. For all comparisons, p values <0.05 were considered statistically significant. Data were analyzed using SAS version 9.2 (SAS Institute, Cary, North Carolina). The institutional review board of the Duke University Health System (Durham, North Carolina) approved this study.
Between July 1, 2006 and June 30, 2012, at >4,100 hospitals in the United States, there were the following hospitalizations: 37,829 for acute MI; 100,189 for HF; 79,076 for pneumonia; 80,091 for COPD; and 17,907 for diabetes (Table 1). The 30-day readmission rates after acute MI declined slightly from 17.5% in year 1 to 16.1% in year 6. The 30-day readmission rates for HF, pneumonia, COPD, and diabetes were unchanged over time. The 30-day post-discharge mortality rates also decreased for acute MI from 3.8% in year 1 to 2.9% in year 6. There were small changes in the 30-day post-discharge mortality rate for HF, pneumonia, COPD, and diabetes, but no apparent trends.
Unadjusted and adjusted trends in 30-day all-cause readmission are displayed in Table 2, the Central Illustration, and Figure 1. After adjustment for patient-related and hospital factors, we observed the following post-implementation improvements in adjusted readmission rates: for MI, −2.3% (95% confidence interval [CI]: −5.1 to 0.6); for HF, −1.8% (95% CI: −3.3 to −0.2); and for pneumonia, −2.0% (95% CI: −4.1 to 0.2). In comparison, patients discharged with COPD had the largest relative improvement (−2.6%; 95% CI: −4.5 to −0.7), and patients discharged with diabetes had no improvement (0.1%; 95% CI: −4.1 to 4.5). Nevertheless, when comparing the trend before public reporting with the trend after public reporting, there were no differences for MI (p = 0.72), HF (p = 0.19), or pneumonia (p = 0.21).
Unadjusted and adjusted trends in 30-day all-cause post-discharge mortality are displayed in Table 3 and Figure 2. In adjusted analyses, the trend in 30-day post-discharge mortality for patients with HF was −2.4% (95% CI: −6.2 to 1.6) before the implementation of public reporting and 3.1% (95% CI: −1.3 to 7.6) after implementation, although there was no statistically significant difference in the change in trend (p = 0.15). We noted no difference in the trend before and after the implementation of public reporting for the other study groups.
The observed rates of post-discharge care are displayed in Table 4. For all study groups, there was an increase during the entire 6-year study period in any 30-day post-discharge observation stays without readmission. For patients hospitalized with acute MI, this rate increased from 2.1% to 3.2%; for patients hospitalized with HF, this rate increased from 1.6% to 2.6%; for patients hospitalized with pneumonia, this rate increased from 1.3% to 2.3%; for patients hospitalized with COPD, this rate increased from 1.4% to 2.1%; and for patients hospitalized with diabetes, this rate increased from 1.6% to 2.5%. There were also increases in the number of any 30-day post-discharge ED visits for HF and diabetes, but not for the other study groups. The rates of any 30-day post-discharge outpatient visits for all study groups were unchanged over time.
Adjusted trends of post-discharge care are displayed in Table 5. We observed an increase in the trend for 30-day post-discharge outpatient visits for pneumonia from −0.2% (95% CI: −0.7 to 0.2) to 0.9% (95% CI: 0.3 to 1.4; p = 0.01), but not for the other study groups. We also found a decrease in the trend for HF-related ED visits from 2.3% (95% CI: 1.1 to 3.6) to −0.8% (95% CI: −2.1 to 0.5; p = 0.007) and observation stays from 15.1% (95% CI: 9.0 to 21.5) to 4.1% (95% CI: −1.6 to 10.0; p = 0.04), but not for the other study groups.
We analyzed Medicare claims data to assess trends in 30-day outcomes before and after the implementation of CMS public reporting and found that this intervention was not associated with any measurable change in 30-day readmission or 30-day post-discharge mortality trends. However, we did note associations with increased post-discharge outpatient visits for pneumonia and decreased ED visits and observation stays without readmission for HF. These indirect assessments of transitional care suggest some improvements related to the policy change, but these changes did not translate into measurable improvements in patients’ outcomes. Our study is the first to specifically assess the impact of the 2009 CMS policy change on public reporting of readmissions, and it adds to the evidence base on the use of public reporting as a quality improvement tool.
Our findings build on previous national reports of readmission and post-discharge mortality rates (4–6,9,10). Data from a peer-reviewed analysis (5) and the 2014 Medicare Hospital Quality Chartbook (6), using Medicare claims data combined with Veterans Health Administration data, suggest recent improvements in national unplanned 30-day readmission rates for patients hospitalized with acute MI, HF, and pneumonia after implementation of public reporting (i.e., after June 2009). Our analysis extends these results by considering previous secular trends in care (i.e., before June 2009) to specifically assess the impact of the 2009 policy change. Similarly, previous data suggest modest increases in post-discharge mortality rates for patients with HF, no change for patients discharged with pneumonia, and modest improvements for patients discharged with acute MI (5,6,9,10). By considering previous secular trends, we observed similar patterns but again noted no change in trends after the implementation of public reporting.
Our findings suggest that the 2009 CMS policy decision to report hospital readmission rates publicly on the Hospital Compare website was not associated with improvements in outcomes. These findings have important policy implications and suggest that well-conducted trials of quality improvement interventions should be performed before widespread dissemination of these interventions. This is especially true considering that almost all hospitals in the United States participate in the Hospital Compare program; therefore, there are a limited number of available control groups for the evaluation of quality improvement interventions after implementation. Importantly, we must remain aware that the current CMS policy on public reporting for acute MI, HF, and pneumonia is only one early example of public reporting at the federal level. Readmission rates related to other conditions and procedures are also now reported on the Hospital Compare website and include COPD, stroke, coronary artery bypass graft surgery, hip or knee surgery, and overall readmission rates (3). Public reporting is also being extended to the provider level with the Physician Compare website (11), which is a newer program that will include information on physician-level quality metrics (12).
Previous retrospective observational data on the impact of public reporting of hospital performance have yielded conflicting results (13–19). Previous randomized trial data are limited to the EFFECT (Enhanced Feedback for Effective Cardiac Treatment) study, which assessed whether publicly released hospital report cards could improve hospital quality of care for patients hospitalized with acute MI or HF in Ontario, Canada (20). The investigators found no improvement in composite scores of performance measures for hospitals randomized to public report cards compared with controls. The reasons that public reporting may not actually improve readmission rates are not apparent from these earlier studies or from our data. Regardless, the benefit of public reporting of readmissions should be balanced by potential unintended consequences, including diverting attention and resources away from other quality improvement efforts.
Our results suggest that changes in post-discharge care are associated with this policy change, specifically, increases in outpatient visits and decreases in hospital-based acute care (i.e., ED visits and observation stays without readmission); these measures are indirect assessments of transitional care. For example, the transition of care from the hospital to outpatient setting may be improved by ensuring outpatient clinic follow-up after discharge, yet most studies have focused on follow-up early after discharge (i.e., within 7 days of discharge) (21). There is also growing interest in using post-discharge ED visits as a quality metric for transitional care (22), and our data suggest decreased ED visits for HF. Finally, earlier studies noted increases in observation stays during this time period (23,24), but our data suggest a decrease in the trend for post-discharge observation stays without readmission for HF. What is unclear is which interventions were used by hospitals to achieve these potential benefits in post-discharge care.
Our study has several limitations, in addition to those inherent in a retrospective analysis of claims data. First, we assessed for a change in trends at the month public reporting of readmissions began; interventions aimed at reducing hospital readmissions that were implemented as a consequence of this policy decision may have taken effect before public reporting or in the months after this change. Second, our analysis focused on public reporting as an isolated intervention, but it is part of a larger policy agenda implemented over multiple years, which includes the recent extension of this policy to impose cuts in total Medicare reimbursements for higher-than-expected readmission rates (25). The impact of that extension has yet to be evaluated, but it may be substantial, given the financial penalties. Finally, our analysis considers nationwide hospital performance; it does not evaluate improvements at individual hospitals associated with the implementation of public reporting.
In this analysis of Medicare claims data, we found no association between the 2009 CMS policy decision to publicly report hospital readmission rates and changes in trends for readmission rates, nor did we find changes in trends for post-discharge mortality rates. We did observe some improvements in post-discharge care. The current focus on readmissions as a quality measure is unlikely to change in the near future, and policy interventions aimed at reducing hospital readmissions should be evaluated before widespread dissemination.
COMPETENCY IN SYSTEMS-BASED PRACTICE: Public reporting of hospital readmission rates for acute MI, HF, and pneumonia was not associated with changes in readmission rates or post-discharge mortality rates.
TRANSITIONAL OUTLOOK: More work is needed to evaluate the potential impact of policy interventions aimed at reducing hospital readmissions before these interventions are disseminated.
The authors thank Erin Hanley, MS, for her editorial contributions to this manuscript. Ms. Hanley did not receive compensation for her contributions, apart from her employment at the institution where this study was conducted.
This project was supported in part by grant number U19HS021092 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of the AHRQ. Dr. Eapen has membership on the advisory boards of Novartis, Cytokinetics, and Amgen; is a consultant for Amgen and SHL Telemedicine; and has received an honorarium from Janssen. Dr. Peterson has received grant support from the American College of Cardiology, American Heart Association, and Janssen; and has been a consultant for Bayer, Boehringer Ingelheim, Merck, Valeant, Sanofi, AstraZeneca, Janssen, Regeneron, and Genentech. Dr. Hernandez has received research funding from Amgen, Inc., AstraZeneca, Bayer Corporation US, Bristol-Myers Squibb, GlaxoSmithKline, Merck & Co., and Portola Pharmaceuticals (all significant); has received personal income for consulting or other services (including CME) from Bayer Corporation US; and has received personal income for consulting or other non-CME services from Amgen, Inc., AstraZeneca, Bayer Corporation US, Eli Lilly & Co., Gilead Sciences, Inc., GlaxoSmithKline, Janssen, Merck & Co., Novartis Pharmaceutical Co., Ortho-McNeil-Janssen Pharmaceuticals, Inc., Pfizer, Pluristem Therapeutics, Inc., Sensible, and MyoKardia. All other authors have reported that they have no relationships relevant to the contents of this paper to disclose.
- Abbreviations and Acronyms
- CMS = Centers for Medicare & Medicaid Services
- COPD = chronic obstructive pulmonary disease
- ED = emergency department
- HF = heart failure
- ICD-9-CM = International Classification of Disease-9th Revision-Clinical Modification
- MI = myocardial infarction
- American College of Cardiology Foundation
- Centers for Medicare & Medicaid Services. Hospital Compare. Medicare.gov website. Available at: http://www.medicare.gov/hospitalcompare/search.html. Accessed June 16, 2015.
- Gerhardt G., Yemane A., Hickman P., et al.
- Suter L.G., Li S.X., Grady J.N., et al.
- Yale New Haven Health Services Corporation Center for Outcomes Research and Evaluation. Medicare Hospital Quality Chartbook: performance report on outcome measures. September 2014. Centers for Medicare & Medicaid Services. Available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/Medicare-Hospital-Quality-Chartbook-2014.pdf. Accessed March 11, 2015.
- Bernheim S.M., Grady J.N., Lin Z., et al.
- Centers for Medicare & Medicaid Services. Physician Compare. Medicare.gov website. Available at: http://www.medicare.gov/physiciancompare/search.html. Accessed October 29, 2015.
- Joynt K.E.
- Peterson E.D., DeLong E.R., Jollis J.G., Muhlbaier L.H., Mark D.B.
- Ryan A.M., Nallamothu B.K., Dimick J.B.
- Waldo S.W., McCabe J.M., O’Brien C., et al.
- Feng Z., Wright B., Mor V.