Figure 1.  Elixhauser Comorbidity Index– and Age-Adjusted Association Between Hospital-Level Do Not Resuscitate (DNR) Prevalence Quintile and Risk-Standardized Mortality Rate (RSMR)

30-Day RSMRs were significantly higher for hospitals in the highest present-on-admission DNR prevalence quintiles vs the lowest quintiles (all P < .001). The points represent the Elixhauser Comorbidity Index– and age-adjusted 30-day least-squares mean RSMR in each of the 5 DNR quintiles. The error bars represent the Wald 95% CIs. 1 indicates the lowest quintile and 5 the highest quintile. AMI indicates acute myocardial infarction; COPD, chronic obstructive pulmonary disease; and HF, heart failure.

Figure 2.  Elixhauser Comorbidity Index– and Age-Adjusted Association Between Hospital-Level Do Not Resuscitate (DNR) Prevalence Quintile and Risk-Standardized Readmission Rate (RSRR)

The highest present-on-admission DNR prevalence quintiles had the lowest 30-day RSRRs (all P < .001). The points represent the Elixhauser Comorbidity Index– and age-adjusted 30-day least-squares mean RSRR in each of the 5 DNR quintiles. The error bars represent the Wald 95% CIs. 1 indicates the lowest quintile and 5 the highest quintile. AMI indicates acute myocardial infarction; COPD, chronic obstructive pulmonary disease; and HF, heart failure.

Figure 3.  Elixhauser Comorbidity Index– and Age-Adjusted Association Between Hospital-Level Present-on-Admission Do Not Resuscitate (DNR) Prevalence and Probability of Hospital Readmissions Reduction Program (HRRP) Penalization
Table 1.  Patient-Level Characteristics of Condition-Specific Cohorts, 2015-2018
Table 2.  Hospital-Level Characteristics of Condition-Specific Cohorts, 2015-2018
1. Castellucci M. CMS, Yale New Haven Health on hot seat over design of quality measures. Modern Healthcare. January 19, 2019. Accessed April 22, 2019. https://www.modernhealthcare.com/article/20190119/NEWS/190119904/cms-yale-new-haven-health-on-hot-seat-over-design-of-quality-measures
2. Silber JH, Satopää VA, Mukherjee N, et al. Improving Medicare’s Hospital Compare mortality model. Health Serv Res. 2016;51(suppl 2):1229-1247. doi:10.1111/1475-6773.12478
3. O’Brien SM, Jacobs JP, Pasquali SK, et al. The Society of Thoracic Surgeons Congenital Heart Surgery Database mortality risk model: part 1-statistical methodology. Ann Thorac Surg. 2015;100(3):1054-1062. doi:10.1016/j.athoracsur.2015.07.014
4. Centers for Medicare and Medicaid Services, US Department of Health and Human Services. Hospital value-based purchasing. September 2017. Accessed April 22, 2019. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/downloads/Hospital_VBPurchasing_Fact_Sheet_ICN907664.pdf
5. Centers for Medicare and Medicaid Services. Hospital Readmissions Reduction Program (HRRP). Accessed November 1, 2019. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Readmissions-Reduction-Program.html
6. Thompson MP, Waters TM, Kaplan CM, Cao Y, Bazzoli GJ. Most hospitals received annual penalties for excess readmissions, but some fared better than others. Health Aff (Millwood). 2017;36(5):893-901. doi:10.1377/hlthaff.2016.1204
7. Patel K, Sinvani L, Patel V, et al. Do-not-resuscitate orders in older adults during hospitalization: a propensity score-matched analysis. J Am Geriatr Soc. 2018;66(5):924-929. doi:10.1111/jgs.15347
8. Shepardson LB, Youngner SJ, Speroff T, Rosenthal GE. Increased risk of death in patients with do-not-resuscitate orders. Med Care. 1999;37(8):727-737. doi:10.1097/00005650-199908000-00003
9. Dunlay SM, Swetz KM, Redfield MM, Mueller PS, Roger VL. Resuscitation preferences in community patients with heart failure. Circ Cardiovasc Qual Outcomes. 2014;7(3):353-359. doi:10.1161/CIRCOUTCOMES.113.000759
10. Baldwin MR, Narain WR, Wunsch H, et al. A prognostic model for 6-month mortality in elderly survivors of critical illness. Chest. 2013;143(4):910-919. doi:10.1378/chest.12-1668
11. Bruckel J, Nallamothu BK, Ling F, et al. Do-not-resuscitate status and risk-standardized mortality and readmission rates following acute myocardial infarction. Circ Cardiovasc Qual Outcomes. 2019;12(3):e005196. doi:10.1161/CIRCOUTCOMES.118.005196
12. Mehta AB, Cooke CR, Douglas IS, Lindenauer PK, Wiener RS, Walkey AJ. Association of early do-not-resuscitate orders with unplanned readmissions among patients hospitalized for pneumonia. Ann Am Thorac Soc. 2017;14(1):103-109. doi:10.1513/AnnalsATS.201608-617OC
13. Bruckel J, Mehta A, Bradley SM, et al. Variation in do-not-resuscitate orders and implications for heart failure risk-adjusted hospital mortality metrics. JACC Heart Fail. 2017;5(10):743-752. doi:10.1016/j.jchf.2017.07.010
14. Centers for Medicare and Medicaid Services. Condition-Specific Measures, Updates and Specifications Report: Hospital-Level 30-Day Risk-Standardized Mortality Measures. Yale New Haven Health Services Corporation; 2018.
15. Centers for Medicare and Medicaid Services. Condition-Specific Measures, Updates and Specifications Report: Hospital-Level 30-Day Risk-Standardized Readmission Measures. Yale New Haven Health Services Corporation; 2018.
16. Advisory Board. Inpatient pay-for-performance final impact summary file 2013-2020. January 29, 2020. Accessed February 24, 2020. https://www.advisory.com/daily-briefing/2019/10/04/hrrp-map
17. Elixhauser Comorbidity Software, version 3.7. Healthcare Cost and Utilization Project (HCUP). Accessed April 22, 2019. https://www.hcup-us.ahrq.gov/toolssoftware/comorbidity/comorbidity.jsp
18. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27. doi:10.1097/00005650-199801000-00004
19. Healthcare Cost and Utilization Project (HCUP). Creation of Elixhauser comorbidity index scores: comorbidity index, version 1.0. Accessed April 22, 2019. https://www.hcup-us.ahrq.gov/toolssoftware/comorbidity/comindex2012-2015.txt
20. Moore BJ, White S, Washington R, Coenen N, Elixhauser A. Identifying increased risk of readmission and in-hospital mortality using hospital administrative data: the AHRQ Elixhauser comorbidity index. Med Care. 2017;55(7):698-705. doi:10.1097/MLR.0000000000000735
21. Filardo G, Pollock BD, Edgerton J. Categorizing body mass index biases assessment of the association with post-coronary artery bypass graft mortality. Eur J Cardiothorac Surg. 2017;52(5):924-929. doi:10.1093/ejcts/ezx138
22. Walkey AJ, Barnato AE, Wiener RS, Nallamothu BK. Accounting for patient preferences regarding life-sustaining treatment in evaluations of medical effectiveness and quality. Am J Respir Crit Care Med. 2017;196(8):958-963. doi:10.1164/rccm.201701-0165CP
23. Walkey AJ, Weinberg J, Wiener RS, Cooke CR, Lindenauer PK. Association of do-not-resuscitate orders and hospital mortality rate among patients with pneumonia. JAMA Intern Med. 2016;176(1):97-104. doi:10.1001/jamainternmed.2015.6324
24. Goldman LE, Chu PW, Osmond D, Bindman A. Accuracy of do not resuscitate (DNR) in administrative data. Med Care Res Rev. 2013;70(1):98-112. doi:10.1177/1077558712458455
25. Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with heart failure. Circulation. 2006;113(13):1693-1701. doi:10.1161/CIRCULATIONAHA.105.611194
26. Giannopoulos GA, Merriman LR, Rumsey A, Zwiebel DS. Malnutrition coding 101: financial impact and more. Nutr Clin Pract. 2013;28(6):698-709. doi:10.1177/0884533613508788
    Original Investigation
    Statistics and Research Methods
    July 14, 2020

    Association of Do-Not-Resuscitate Patient Case Mix With Publicly Reported Risk-Standardized Hospital Mortality and Readmission Rates

    Author Affiliations
    • 1Department of Health Sciences Research, Mayo Clinic, Jacksonville, Florida
    • 2Department of Quality, Experience, and Affordability, Mayo Clinic, Rochester, Minnesota
    • 3Flying Buttress Associates, Charlottesville, Virginia
    • 4Section of Cardiovascular Medicine, Yale University School of Medicine, New Haven, Connecticut
    • 5Department of Obstetrics and Gynecology, Mayo Clinic, Rochester, Minnesota
    • 6Department of Critical Care Medicine, Mayo Clinic, Jacksonville, Florida
    • 7Department of Cardiovascular Diseases, Mayo Clinic, Rochester, Minnesota
    JAMA Netw Open. 2020;3(7):e2010383. doi:10.1001/jamanetworkopen.2020.10383
    Key Points

    Question  Are risk-standardized mortality and readmission rates used in financial penalties and quality star ratings associated with the case mix of patients with do-not-resuscitate status in US hospitals?

    Findings  In this cross-sectional study of mortality cohorts comprising 4 884 237 inpatient encounters and readmission cohorts including 4 450 378 inpatient encounters, hospitals with a greater relative volume of patients with present-on-admission do-not-resuscitate status had statistically significantly higher 30-day risk-standardized mortality rates and statistically significantly greater odds of avoiding a Hospital Readmission Reduction Program financial penalty.

    Meaning  Results of this study suggest that risk-standardized hospital outcomes should control for do-not-resuscitate status to avoid potential biases in pay-for-performance programs and hospital quality rankings.

    Abstract

    Importance  The 30-day risk-standardized mortality rate (RSMR) and risk-standardized readmission rate (RSRR) models of the Centers for Medicare and Medicaid Services (CMS) do not adjust for the do-not-resuscitate (DNR) status of hospitalized patients and may bias Hospital Readmissions Reduction Program (HRRP) financial penalties and Overall Hospital Quality Star Ratings.

    Objective  To identify the association between hospital-level DNR prevalence and condition-specific 30-day RSMR and RSRR and the implications of this association for HRRP financial penalty.

    Design, Setting, and Participants  This cross-sectional study obtained patient-level data from the Medicare Limited Data Set Inpatient Standard Analytical File and hospital-level data from the CMS Hospital Compare website for all consecutive Medicare inpatient encounters from July 1, 2015, to June 30, 2018, in 4484 US hospitals. Hospitalized patients had a principal diagnosis of acute myocardial infarction (AMI), heart failure (HF), stroke, pneumonia, or chronic obstructive pulmonary disease (COPD). Incoming acute care transfers and discharges against medical advice were among the encounters excluded from the analysis.

    Exposures  Present-on-admission (POA) DNR status was defined as an International Classification of Diseases, Ninth Revision diagnosis code of V49.86 (before October 1, 2015) or as an International Statistical Classification of Diseases and Related Health Problems, Tenth Revision diagnosis code of Z66 (beginning October 1, 2015). Hospital-level prevalence of POA DNR status was calculated for each of the 5 conditions.
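    The exposure definition above reduces to a date-dependent code lookup with a present-on-admission check. A minimal sketch follows; the function, field names, and claim layout are hypothetical simplifications, not the actual Standard Analytical File format:

```python
from datetime import date

# Diagnosis codes used in the study to identify DNR status
ICD9_DNR = "V49.86"            # before October 1, 2015
ICD10_DNR = "Z66"              # beginning October 1, 2015
ICD10_START = date(2015, 10, 1)

def has_poa_dnr(admit_date, diagnoses):
    """Return True if the encounter carries a present-on-admission DNR code.

    `diagnoses` is a list of (code, poa_indicator) pairs; "Y" marks a
    diagnosis as present on admission. This is an illustrative layout only.
    """
    dnr_code = ICD10_DNR if admit_date >= ICD10_START else ICD9_DNR
    return any(code == dnr_code and poa == "Y" for code, poa in diagnoses)

# A 2016 encounter with a POA Z66 code counts; a non-POA Z66 does not.
print(has_poa_dnr(date(2016, 3, 5), [("I21.4", "Y"), ("Z66", "Y")]))  # True
print(has_poa_dnr(date(2016, 3, 5), [("Z66", "N")]))                  # False
```

    Note that the date cutoff matters: a Z66 code on a claim admitted before October 1, 2015, would not qualify, because only V49.86 was valid under ICD-9.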

    Main Outcomes and Measures  Hospital-level 30-day RSMRs and RSRRs for 5 condition-specific cohorts (mortality cohorts: AMI, HF, stroke, pneumonia, and COPD; readmission cohorts: AMI, HF, pneumonia, and COPD) and HRRP financial penalty status (yes or no).

    Results  Included in the study were 4 884 237 inpatient encounters across condition-specific 30-day mortality cohorts (patient mean [SD] age, 78.8 [8.5] years; 2 608 182 women [53.4%]) and 4 450 378 inpatient encounters across condition-specific 30-day readmission cohorts (patient mean [SD] age, 78.6 [8.5] years; 2 349 799 women [52.8%]). Hospital-level median (interquartile range [IQR]) prevalence of POA DNR status in the mortality cohorts varied: 11% (7%-16%) for AMI, 13% (7%-23%) for HF, 14% (9%-22%) for stroke, 17% (9%-26%) for pneumonia, and 10% (5%-18%) for COPD. For the readmission cohorts, the hospital-level median (IQR) POA DNR prevalence was 9% (6%-15%) for AMI, 12% (6%-22%) for HF, 16% (8%-24%) for pneumonia, and 9% (4%-17%) for COPD. The 30-day RSMRs were significantly higher for hospitals in the highest quintiles vs the lowest quintiles of DNR prevalence (eg, AMI: 12.9 [95% CI, 12.8-13.1] vs 12.5 [95% CI, 12.4-12.7]; P < .001). The inverse was true among the readmission cohorts, with the highest quintiles of DNR prevalence exhibiting the lowest RSRRs (eg, AMI: 15.3 [95% CI, 15.1-15.5] vs 15.9 [95% CI, 15.7-16.0]; P < .001). A 1% absolute increase in risk-adjusted hospital-level DNR prevalence was associated with greater odds of avoiding HRRP financial penalty (odds ratio, 1.06; 95% CI, 1.04-1.08; P < .001).

    Conclusions and Relevance  This cross-sectional study found that the lack of adjustment in CMS 30-day RSMR and RSRR models for POA DNR status of hospitalized patients may be associated with biased readmission penalization and hospital-level performance.

    Introduction

    The Centers for Medicare and Medicaid Services (CMS) has invested more than $130 million to develop, refine, and disseminate risk-standardized quality outcome models.1 These models are considered the criterion standard for risk standardization and have been adopted by professional societies in the US.2,3 Currently, the condition-specific 30-day risk-standardized mortality rate (RSMR) model is used under value-based purchasing to calculate financial reimbursement for 3 conditions: acute myocardial infarction (AMI), heart failure (HF), and pneumonia.4 Similarly, the condition-specific 30-day risk-standardized readmission rate (RSRR) model is used to establish Hospital Readmissions Reduction Program (HRRP) penalties,5 which are levied annually against most US hospitals, for 4 conditions: AMI, HF, chronic obstructive pulmonary disease (COPD), and pneumonia.6 In addition to these financial implications, the 30-day RSMR and RSRR together account for 44% of the weight in the CMS Overall Hospital Quality Star Ratings. The 30-day RSMRs and RSRRs are adjusted for approximately 20 to 40 condition categories, which include common comorbidities such as cancer, diabetes, renal failure, and psychiatric disorders. However, a patient’s do-not-resuscitate (DNR) status is not included in risk adjustment despite the greater severity of illness and mortality risk among patients with DNR status present on admission (POA).7-10

    Evidence is limited on the association between the prevalence of POA DNR status and risk-standardized outcome rates at hospitals. Although hospital-level prevalence of POA DNR status varies, patient DNR status is documented reliably enough for inclusion in risk-standardization models, and analyses suggest that adjustment for patient-level DNR status improves the predictive accuracy of 30-day RSMRs for AMI, HF, and pneumonia.11-13 To our knowledge, however, the association between hospital-level DNR prevalence and 30-day RSMRs and RSRRs has not been examined using national data from all Medicare-eligible US hospitals.

    In this cross-sectional study, we assessed the variation in hospital-level POA DNR status across the condition-specific cohorts used in value-based purchasing reimbursement, HRRP penalty, and the CMS Overall Hospital Quality Star Ratings. Our aim was to identify the association between the prevalence of POA DNR status and condition-specific 30-day RSMRs and RSRRs. We also evaluated the implication of POA DNR status prevalence for the HRRP financial penalties.

    Methods

    This cross-sectional study was deemed exempt by the Mayo Clinic Institutional Review Board because the data used have been previously collected and deidentified. Informed consent was not required for this secondary research as contact with participants was not possible. We followed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.

    Data Source and Study Population

    We obtained data from the 2015 to 2018 CMS Limited Data Set Inpatient Standard Analytical File, which includes 100% of Medicare fee-for-service inpatient encounters across 4484 US hospitals. The condition-specific cohorts for mortality were AMI, HF, stroke, pneumonia, and COPD. To form these condition-specific 30-day mortality cohorts, we applied International Classification of Diseases, Ninth Revision (ICD-9) codes (from July 1, 2015, to September 30, 2015) and International Statistical Classification of Diseases and Related Health Problems, Tenth Revision (ICD-10) codes (from October 1, 2015, to October 31, 2018) to the index admissions, adhering to the inclusion criteria specified in the CMS inpatient quality-reporting, condition-specific mortality documentation method,14 and we excluded incoming acute care transfers and patients who were discharged against medical advice. In the AMI, pneumonia, and HF mortality cohorts, we also excluded patients who were discharged alive within 1 day of admission who were not transferred to an acute care facility. We did not exclude patients who arrived from hospice or who were discharged to hospice.

    The condition-specific cohorts for readmission were AMI, HF, pneumonia, and COPD. To form these condition-specific 30-day readmission cohorts, we followed the CMS inpatient quality-reporting 30-day readmission measures method15 and excluded all patients who died before discharge, were discharged against medical advice, or were discharged or transferred to an acute care facility. We did not exclude patients who arrived from hospice or were discharged to hospice. An important distinction is that CMS attributes 30-day readmissions to the discharging hospital, whereas 30-day mortalities are attributed to the admitting hospital. Thus, the transfer exclusion criteria for both RSMRs and RSRRs matched the exclusion criteria of the CMS method.

    The 30-day mortality study population consisted of all consecutive Medicare inpatient encounters from July 1, 2015, through June 30, 2018, with a principal diagnosis of AMI, HF, stroke, pneumonia, or COPD. The 30-day readmission study population included all consecutive Medicare inpatient encounters from July 1, 2015, through June 30, 2018, with a principal diagnosis of AMI, HF, pneumonia, or COPD. This period matched the data period used for the 30-day RSMRs and RSRRs publicly reported on the CMS Hospital Compare website as of August 2019.

    To ascertain the HRRP penalty status (yes or no) during fiscal years 2016 to 2018, we downloaded the Inpatient Pay-for-Performance Final Impact Summary File from the Advisory Board,16 which provided annual HRRP penalty status by hospital. Any hospital that was not penalized at any time during the 3 fiscal years (2016, 2017, or 2018) was considered to have the binary outcome of avoided HRRP penalization.

    Exposure and Outcomes

    We defined DNR status as an ICD-9 diagnosis code of V49.86 (before October 1, 2015) or an ICD-10 diagnosis code of Z66 (beginning October 1, 2015) if the DNR order was POA per the corresponding POA indicator on the inpatient claim. Patients without a POA DNR status but who acquired DNR status after admission were included in the analysis. The outcomes were the hospital-level condition-specific 30-day RSMRs for AMI, HF, COPD, pneumonia, and stroke and the 30-day RSRRs for AMI, HF, COPD, and pneumonia as reported on the Hospital Compare website in August 2019. Briefly, the CMS calculates hospital-specific RSMRs and RSRRs by taking each hospital’s predicted vs expected (P/E) 30-day mortality or readmission rates from hierarchical mixed-effects risk-standardization models and multiplying these P/E ratios by the national mean 30-day mortality or readmission rate. We matched at the hospital level the values on the Hospital Compare website to the Inpatient Standard Analytical File data using the 6-digit CMS provider numbers. Hospitals without reported RSMR or RSRR data were excluded.
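    The P/E calculation described above can be illustrated in a few lines. This is a simplified sketch with made-up numbers; in the actual CMS method, the predicted and expected quantities come from hierarchical mixed-effects risk-standardization models, not raw counts:

```python
def risk_standardized_rate(predicted, expected, national_rate):
    """RSMR or RSRR = (predicted / expected) * national mean rate.

    `predicted` reflects the hospital's own estimated effect for its
    patient mix; `expected` reflects the national average effect for the
    same patient mix. Both are model-based quantities, simplified here.
    """
    return (predicted / expected) * national_rate

# Hypothetical hospital performing slightly worse than expected for its
# case mix: P/E ratio of 1.05 scales the national mean rate upward.
print(risk_standardized_rate(predicted=105, expected=100, national_rate=12.7))
# -> about 13.3, above the hypothetical national mean of 12.7
```

    A hospital with P/E = 1 has, by construction, an RSMR equal to the national mean; ratios above 1 indicate worse-than-expected performance for that hospital's case mix.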

    Risk Adjustment

    Using the Healthcare Cost and Utilization Project Elixhauser Comorbidity Software,17 version 3.7 and beta version, we assigned 29 binary comorbidity categories, known as Elixhauser comorbidities,18 to all patients on the basis of corresponding ICD-9 or ICD-10 diagnosis codes and Medicare severity diagnosis related group. We then used the Healthcare Cost and Utilization Project Comorbidity Index program to calculate the Elixhauser comorbidity index score from the weighted sum of these 29 comorbidity categories.19,20 The Elixhauser comorbidity index score is an integer ranging from –32 (lowest) to 99 (highest) comorbidity burden.
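    The index computation above reduces to a weighted sum over binary comorbidity flags. A minimal sketch, using invented weights for 3 of the 29 categories (the actual AHRQ weights ship with the HCUP Comorbidity Index program and yield scores in the stated –32 to 99 range):

```python
# Hypothetical weights for illustration only; the real AHRQ weights
# differ and cover all 29 Elixhauser comorbidity categories.
WEIGHTS = {"chf": 9, "metastatic_cancer": 23, "obesity": -5}

def elixhauser_index(flags):
    """Weighted sum of binary (0/1) comorbidity indicators."""
    return sum(WEIGHTS[name] for name, present in flags.items() if present)

# A patient flagged for CHF and obesity: 9 + (-5) = 4
print(elixhauser_index({"chf": 1, "metastatic_cancer": 0, "obesity": 1}))  # 4
```

    Negative weights are a real feature of the AHRQ scheme: some coded conditions are associated with lower adjusted mortality risk, which is why the minimum possible score is below zero.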

    Association Between DNR Status and Comorbidity Burden

    We hypothesized that DNR status would be associated with a patient’s comorbidity burden. However, we also wanted to assess the shape of this association to ensure that an unexpected decline in DNR status did not occur at the sickest or healthiest patient extremes, which might suggest systematic bias in documentation. To test this hypothesis, we estimated a logistic regression model by pooling the patient-level data from the 5 condition-specific 30-day mortality cohorts, with DNR status as the dependent variable regressed on Elixhauser comorbidity index and age (both modeled using 5-knot restricted cubic spline functions)21 and with a condition indicator as a third independent variable and an interaction term between the condition and the Elixhauser comorbidity index score. We visually assessed the association between estimated probability of DNR prevalence (y-axis) and comorbidity burden (x-axis) (eFigure 1 in the Supplement).

    Variation in Hospital-Level DNR Prevalence

    From the logistic regression model, we calculated hospital-level observed vs expected DNR ratios and visually assessed the variation in both unadjusted condition-specific hospital-level DNR prevalence and the observed vs expected value using density plots. This visualization allowed us to consider the extent of interhospital differences in DNR documentation.

    Association Between RSMR or RSRR and DNR Prevalence

    Within each condition-specific cohort, we used the hospital-level RSMR or RSRR reported by the CMS as dependent variables in distinct inverse-variance–weighted linear regression models with the following independent variables: (1) quintiles of hospital-level DNR prevalence, with prevalence defined as the percentage of patients with POA DNR status; (2) mean hospital-level age modeled with a 5-knot restricted cubic spline function; and (3) mean hospital-level Elixhauser comorbidity index score modeled with a 5-knot restricted cubic spline function.

    The inverse-variance weights were estimated from the CIs of the RSMR and RSRR in the Hospital Compare data, where SE = (upper CI – lower CI) / (2 × 1.96) and weight = 1/SE².
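    In code, recovering the weight for one hospital from its reported Wald 95% CI is a direct transcription of that formula (the example CI values are illustrative, in the style of Hospital Compare):

```python
def inverse_variance_weight(ci_lower, ci_upper):
    """Back out the SE from a reported Wald 95% CI, then weight = 1/SE**2."""
    se = (ci_upper - ci_lower) / (2 * 1.96)
    return 1.0 / se**2

# Example: an RSMR of 12.9 reported with 95% CI 12.8-13.1
w = inverse_variance_weight(12.8, 13.1)
print(round(w, 1))  # -> 170.7
```

    Narrower CIs (typically larger hospitals with more precise estimates) yield larger weights, so they exert more influence on the weighted regression.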

    The adjusted least-squares mean RSMR and RSRR (and 95% CIs) for each quintile of hospital-level DNR prevalence were reported along with the overall Wald P value for DNR quintiles. We also compared the condition-specific mean RSMR and RSRR of the fifth (highest) quintiles with those of the first (lowest) quintiles using a Bonferroni-corrected P value. We conducted a sensitivity analysis using a more inclusive definition of DNR prevalence, in which patients were considered to have DNR status regardless of whether the DNR order was POA.

    Association Between DNR Prevalence and HRRP Penalty

    To assess the association between hospital-level POA DNR status and HRRP penalty, we pooled patient-level data from the 4 readmission cohorts included in the HRRP penalty calculation and assessed hospital-level POA DNR status. Next, we used logistic regression, adjusting for age and Elixhauser comorbidity index score (using restricted cubic spline functions), to identify the association between POA DNR prevalence (independent variable) and avoidance of HRRP penalty (dependent variable).

    Statistical Analysis

    All analyses were performed with SAS, version 9.4 (SAS Institute Inc). Two-sided P < .05 indicated statistical significance. Data were analyzed from July 2019 to March 2020.

    Results

    This study analyzed 4 884 237 inpatient encounters across the 5 condition-specific 30-day mortality cohorts (patient mean [SD] age, 78.8 [8.5] years; 2 608 182 women [53.4%] and 2 276 055 men [46.6%]) and 4 450 378 inpatient encounters across the 4 condition-specific 30-day readmission cohorts (patient mean [SD] age, 78.6 [8.5] years; 2 349 799 women [52.8%] and 2 100 579 men [47.2%]). Within the mortality cohorts, DNR status was POA in 790 988 encounters (16.2%) and present at any time in 946 391 encounters (19.4%). Within the readmission cohorts, DNR status was POA in 674 030 encounters (15.1%) and present at any time in 776 071 encounters (17.4%). Condition-specific patient-level characteristics are presented in Table 1.

    Hospital-level median (interquartile range [IQR]) prevalence of POA DNR status in the mortality cohorts varied: 11% (7%-16%) for AMI, 13% (7%-23%) for HF, 14% (9%-22%) for stroke, 17% (9%-26%) for pneumonia, and 10% (5%-18%) for COPD. For the readmission cohorts, the hospital-level median (IQR) POA DNR prevalence was 9% (6%-15%) for AMI, 12% (6%-22%) for HF, 16% (8%-24%) for pneumonia, and 9% (4%-17%) for COPD.

    Across all condition-specific mortality cohorts, the Elixhauser comorbidity index score showed a statistically significant and generally nonlinear association with DNR status (eFigure 1 in the Supplement). Visual assessment indicated minimal potential for systematic bias from lack of documentation of DNR status among patients at the extremes (sickest or healthiest).

    Unadjusted hospital-level condition-specific DNR prevalence (Table 2) was highest in the pneumonia mortality cohort (median [IQR], 17% [9%-26%]) and lowest in the COPD readmission cohort (median [IQR], 9% [4%-17%]). Condition-specific hospital-level characteristics are presented in Table 2. For example, in the AMI mortality cohort, the median (IQR) hospital-level Elixhauser comorbidity index score was 5.7 (4.8-6.7) and the median (IQR) age was 77 (76-79) years.

    Across all condition-specific 30-day cohorts, variation was found in observed to expected DNR prevalence ratios (eFigure 2 in the Supplement), with median (IQR) observed vs expected DNR prevalence ratios ranging from 0.79 (0.38-1.36) in the HF readmission cohort to 0.94 (0.61-1.38) in the AMI mortality cohort (Table 2). The median (IQR) observed vs expected DNR prevalence ratios in the mortality cohorts were approximately normally distributed for the AMI (0.94 [0.61-1.38]) and stroke (0.90 [0.56-1.38]) cohorts, whereas they were skewed to the right for HF (0.81 [0.41-1.37]), COPD (0.81 [0.37-1.42]), and pneumonia (0.85 [0.43-1.29]).

    In all condition-specific mortality cohorts, the RSMR was higher in the fifth DNR prevalence quintile vs the first quintile (AMI: 12.9 [95% CI, 12.8-13.1] vs 12.5 [95% CI, 12.4-12.7]; HF: 11.7 [95% CI, 11.5-11.9] vs 10.7 [95% CI, 10.5-10.9]; stroke: 14.0 [95% CI, 13.7-14.1] vs 13.0 [95% CI, 12.8-13.3]; pneumonia: 15.8 [95% CI, 15.6-16.0] vs 15.0 [95% CI, 14.7-15.3]; COPD: 8.5 [95% CI, 8.4-8.6] vs 8.1 [95% CI, 8.0-8.2]; all P < .001), and an association between RSMR and DNR prevalence quintile was found across all conditions (all Wald P < .001) (Figure 1). In all condition-specific readmission cohorts, the RSRR was lower in the fifth DNR prevalence quintile than in the first quintile (AMI: 15.3 [95% CI, 15.1-15.5] vs 15.9 [95% CI, 15.7-16.0]; HF: 20.7 [95% CI, 20.5-20.9] vs 22.1 [95% CI, 21.8-22.3]; pneumonia: 16.2 [95% CI, 16.0-16.3] vs 17.0 [95% CI, 16.9-17.1]; and COPD: 19.2 [95% CI, 19.0-19.3] vs 19.6 [95% CI, 19.4-19.7]; all P < .001), and an association between RSRR and DNR prevalence quintile was found across all conditions (all Wald P < .001) (Figure 2). The sensitivity analysis in which we removed the POA requirement for patients with DNR status showed similar results (eTable in the Supplement).

    Among the 2948 hospitals included in the analysis of the association between hospital-level POA DNR prevalence and HRRP penalty, 200 (6.8%) hospitals avoided an HRRP penalty from fiscal years 2016 to 2018. After adjustment for age and Elixhauser comorbidity index score, POA DNR prevalence was associated with HRRP penalty avoidance (odds ratio per 1% absolute increase in POA DNR prevalence, 1.06; 95% CI, 1.04-1.08; P < .001) (Figure 3).

    Discussion

    To our knowledge, this study is the first to use national Medicare claims data to assess the association between hospital-level prevalence of patients with POA DNR status and 30-day RSMRs and RSRRs across 5 conditions. A higher prevalence of patients with POA DNR status was associated with a higher RSMR. Conversely, a higher prevalence of patients with POA DNR status was associated with a lower RSRR. The 30-day RSMR and RSRR models do not adjust for POA DNR status, which may be a factor in unearned HRRP financial penalties and inaccurate CMS Overall Hospital Quality Star Ratings.

    The association between hospital DNR case mix and quality outcomes has been examined previously, notably by Walkey and colleagues,22,23 who found that hospitals with higher DNR prevalence were more likely to be outliers for pneumonia mortality and who hypothesized that the lack of adjustment for DNR status biased penalization toward hospitals that in reality were providing high-quality, patient-centered care. Results of the present analysis not only support this hypothesis and extend the evidence of this pattern across multiple conditions and quality outcomes linked with CMS reimbursements but also offer the first direct evidence of the implications of this association for HRRP penalties. In addition, these results are consistent with findings from 2 studies of approximately 300 hospitals in the California State Inpatient Database (CA-SID). These studies reported an increase in the C statistic from 0.75 to 0.78 when a CMS-based risk-standardized AMI mortality model was compared with an improved risk model that adjusted for DNR status11 as well as lower readmission rates among patients with pneumonia who had DNR status.12 In the AMI analysis, Bruckel and colleagues11 found that 6 of 25 high-mortality outlier hospitals were reclassified as nonoutliers under the improved DNR risk-standardized model. Likewise, both the hospital-level prevalence and variation of POA DNR status reported in the CA-SID study (unadjusted median [IQR] DNR prevalence: 8% [4%-14%]) were similar to those in the CMS claims data (Inpatient Standard Analytical File). This similarity in DNR rates, together with the 84% agreement between the claims-based DNR status variable and medical record abstraction reported for the CA-SID,24 suggests that POA DNR status is captured in CMS claims data at rates acceptable for its inclusion in risk-standardized models.

    Patients with DNR status do not receive cardiopulmonary resuscitation during an acute event and are expected to have greater risks of 30-day mortality. Conversely, patients with DNR status are more likely to be discharged to hospice or end-of-life care, making 30-day readmission less likely.12 Under the current paradigm, hospitals that treat higher proportions of patients with complex conditions, terminal illness, and/or DNR status may be reported to have higher RSMRs and lower RSRRs than they would if an appropriate DNR adjustment were made.

    The omission of DNR status as a risk-adjustment variable has been attributed to concerns about its availability and documentation in claims data,25 including whether DNR status acquired during hospitalization represents a preventable complication or the natural history of disease. However, the same rationale applies to many of the clinical risk factors in the CMS models, whose reported variation may likewise reflect either true prevalence or the accuracy of documentation. One example from the CMS models is protein-calorie malnutrition, a comorbidity for which reported hospital prevalence may reflect documentation practices rather than case mix.26 Furthermore, no evidence suggests that hospitals could game the system by overdocumenting DNR status. Doing so would be more difficult than for other comorbidities in the CMS models because, unlike clinical risk factors, documentation of DNR status carries the additional burden of obtaining a patient’s legal consent and signature and, in some jurisdictions, even the presence of a witness. Conversely, no signatures or witnesses are needed for a physician to document risk factors such as protein-calorie malnutrition.

    This analysis suggests that there are implications for HRRP penalization when POA DNR status is not controlled for during risk standardization. Specifically, for 2 hospitals with equivalent age and comorbidity case mixes among patients in the readmission cohorts, a hospital with a 10% greater absolute prevalence of POA DNR status (eg, 25% vs 15%) has nearly 2-fold better odds of avoiding HRRP penalization, implying that hospitals with low DNR prevalence could receive either a lower penalty or no penalty if POA DNR status were accounted for in risk standardization. Conversely, hospitals with high DNR prevalence could receive a larger HRRP penalty if POA DNR status were included in risk standardization.
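    The "nearly 2-fold" figure follows from the multiplicative structure of odds in a logistic model: each additional percentage point of POA DNR prevalence multiplies the odds of avoiding penalization by exp(beta). The coefficient below is an assumed value chosen to reproduce the approximately 2-fold odds reported above, not the study's fitted estimate:

```python
import math

# Assumed per-percentage-point log-odds coefficient, chosen so that a
# 10-point prevalence difference yields a 2-fold odds ratio; this is an
# illustrative value, not the study's fitted coefficient.
BETA = math.log(2) / 10

def odds_ratio(delta_prevalence_points: float, beta: float = BETA) -> float:
    """Odds ratio for avoiding HRRP penalization between two hospitals
    whose POA DNR prevalence differs by delta_prevalence_points."""
    return math.exp(beta * delta_prevalence_points)

# eg, 25% vs 15% prevalence -> ~2-fold better odds of avoiding a penalty
print(round(odds_ratio(25 - 15), 2))  # → 2.0
```

    The same multiplicative form means the advantage compounds: under this assumed coefficient, a 20-point prevalence difference would correspond to roughly 4-fold odds.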

    We did not assess the association of lower readmissions and higher mortality with value-based purchasing payments in hospitals with high DNR prevalence. Thus, although hospitals with high DNR prevalence may have an advantage under HRRP owing to the lack of risk adjustment for DNR status, it is possible that the reverse is true under value-based purchasing when the opposing effects of 30-day mortality are considered.

    Strengths and Limitations

    This study has some strengths. It used a large national sample of CMS claims data from more than 4000 US hospitals. It also closely adhered to the CMS condition-specific cohort inclusion and exclusion criteria.

    This study has some limitations. First, it had an ecological design, which makes it possible that patients with POA DNR status were not direct factors in the increased mortality rates at hospitals with high DNR prevalence. However, given that patients with DNR status are known to have greater mortality risks, there is strong reason to suspect that these patients contribute directly to higher mortality rates. Second, some of the variation in DNR status may be associated with interinstitutional coding differences, physician preferences and comfort with discussing DNR status, and patient preferences and values. Given these differences, misclassification of DNR status may have occurred that could have biased the results toward the null, particularly among the several hospitals with 0% reported DNR prevalence. Third, although the hospital-level DNR prevalence rates in this study were similar to those in the CA-SID study,11 which was validated with medical record reviews, we did not validate DNR documentation in this analysis. Furthermore, the inclusion of hospice patients may have introduced bias because these patients are excluded from the CMS metrics; nevertheless, less than 2% of all encounters were hospice exclusions under the CMS method, so it is unlikely that this small percentage of encounters changed the results.14 We also did not assess the association between in-hospital DNR status acquisition and RSMR or RSRR, which could be an important area for future research given the increasing consideration of patient-centered palliative care in the hospital setting. Fourth, socioeconomic and demographic factors were likely associated with patients’ DNR status. We did not adjust for these factors in these analyses, but they may help explain the associations between hospital DNR case mixes and RSMR or RSRR.

    Conclusions

    Under the current CMS risk-standardization models for 30-day mortality and readmission, mortality rates appeared to be worse among hospitals with greater proportions of patients with POA DNR status, whereas readmission rates appeared to be better among hospitals with greater proportions of patients with POA DNR status. Findings from this study suggest that controlling for DNR status would improve estimates for 30-day RSMRs and RSRRs and would have direct implications for HRRP financial penalty and hospital performance rankings.

    Article Information

    Accepted for Publication: May 3, 2020.

    Published: July 14, 2020. doi:10.1001/jamanetworkopen.2020.10383

    Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2020 Pollock BD et al. JAMA Network Open.

    Corresponding Author: Benjamin D. Pollock, PhD, MSPH, Department of Health Sciences Research, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL 32224 (Pollock.Benjamin@Mayo.Edu).

    Author Contributions: Dr Pollock had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: Pollock, Herrin, Neville, Moreno Franco, Ting.

    Acquisition, analysis, or interpretation of data: All authors.

    Drafting of the manuscript: Pollock, Neville, Moreno Franco, Ting.

    Critical revision of the manuscript for important intellectual content: Pollock, Herrin, Neville, Dowdy, Shah, Ting.

    Statistical analysis: Pollock, Neville.

    Administrative, technical, or material support: Dowdy, Moreno Franco.

    Supervision: Pollock, Moreno Franco, Ting.

    Conflict of Interest Disclosures: Dr Herrin reported developing hospital measures under contract to the Centers for Medicare and Medicaid Services (CMS). Dr Shah reported receiving research support through Mayo Clinic from the CMS; grant U01FD004585 from the Food and Drug Administration; grants R01HS025164, R01HS025402, R03HS025517, and U19HS024075 from the Agency for Healthcare Research and Quality; grants R56HL130496 and R01HL131535 from the National Heart, Lung, and Blood Institute of the National Institutes of Health; and research support from the Medical Devices Innovation Consortium of the National Evaluation System for Health Technology, from the National Science Foundation, and from the Patient Centered Outcomes Research Institute. No other disclosures were reported.

    References
    1.
    Castellucci M. CMS, Yale New Haven Health on hot seat over design of quality measures. Modern Healthcare. January 19, 2019. Accessed April 22, 2019. https://www.modernhealthcare.com/article/20190119/NEWS/190119904/cms-yale-new-haven-health-on-hot-seat-over-design-of-quality-measures
    2.
    Silber JH, Satopää VA, Mukherjee N, et al. Improving Medicare’s Hospital Compare mortality model. Health Serv Res. 2016;51(suppl 2):1229-1247. doi:10.1111/1475-6773.12478
    3.
    O’Brien SM, Jacobs JP, Pasquali SK, et al. The Society of Thoracic Surgeons Congenital Heart Surgery Database mortality risk model: part 1-statistical methodology. Ann Thorac Surg. 2015;100(3):1054-1062. doi:10.1016/j.athoracsur.2015.07.014
    4.
    Centers for Medicare and Medicaid Services, US Department of Health and Human Services. Hospital value-based purchasing. September 2017. Accessed April 22, 2019. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/downloads/Hospital_VBPurchasing_Fact_Sheet_ICN907664.pdf
    5.
    Centers for Medicare and Medicaid Services. Hospital Readmissions Reduction Program (HRRP). Accessed November 1, 2019. https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Readmissions-Reduction-Program.html
    6.
    Thompson MP, Waters TM, Kaplan CM, Cao Y, Bazzoli GJ. Most hospitals received annual penalties for excess readmissions, but some fared better than others. Health Aff (Millwood). 2017;36(5):893-901. doi:10.1377/hlthaff.2016.1204
    7.
    Patel K, Sinvani L, Patel V, et al. Do-not-resuscitate orders in older adults during hospitalization: a propensity score-matched analysis. J Am Geriatr Soc. 2018;66(5):924-929. doi:10.1111/jgs.15347
    8.
    Shepardson LB, Youngner SJ, Speroff T, Rosenthal GE. Increased risk of death in patients with do-not-resuscitate orders. Med Care. 1999;37(8):727-737. doi:10.1097/00005650-199908000-00003
    9.
    Dunlay SM, Swetz KM, Redfield MM, Mueller PS, Roger VL. Resuscitation preferences in community patients with heart failure. Circ Cardiovasc Qual Outcomes. 2014;7(3):353-359. doi:10.1161/CIRCOUTCOMES.113.000759
    10.
    Baldwin MR, Narain WR, Wunsch H, et al. A prognostic model for 6-month mortality in elderly survivors of critical illness. Chest. 2013;143(4):910-919. doi:10.1378/chest.12-1668
    11.
    Bruckel J, Nallamothu BK, Ling F, et al. Do-not-resuscitate status and risk-standardized mortality and readmission rates following acute myocardial infarction. Circ Cardiovasc Qual Outcomes. 2019;12(3):e005196. doi:10.1161/CIRCOUTCOMES.118.005196
    12.
    Mehta AB, Cooke CR, Douglas IS, Lindenauer PK, Wiener RS, Walkey AJ. Association of early do-not-resuscitate orders with unplanned readmissions among patients hospitalized for pneumonia. Ann Am Thorac Soc. 2017;14(1):103-109. doi:10.1513/AnnalsATS.201608-617OC
    13.
    Bruckel J, Mehta A, Bradley SM, et al. Variation in do-not-resuscitate orders and implications for heart failure risk-adjusted hospital mortality metrics. JACC Heart Fail. 2017;5(10):743-752. doi:10.1016/j.jchf.2017.07.010
    14.
    Centers for Medicare and Medicaid Services. Condition-Specific Measures, Updates and Specifications Report: Hospital-Level 30-Day Risk-Standardized Mortality Measures. Yale New Haven Health Services Corporation; 2018.
    15.
    Centers for Medicare and Medicaid Services. Condition-Specific Measures, Updates and Specifications Report: Hospital-Level 30-Day Risk-Standardized Readmission Measures. Yale New Haven Health Services Corporation; 2018.
    16.
    Advisory Board. Inpatient pay-for-performance final impact summary file 2013-2020. January 29, 2020. Accessed February 24, 2020. https://www.advisory.com/daily-briefing/2019/10/04/hrrp-map
    17.
    Elixhauser Comorbidity Software. Version 3.7. Healthcare Cost and Utilization Project (HCUP). Accessed April 22, 2019. https://www.hcup-us.ahrq.gov/toolssoftware/comorbidity/comorbidity.jsp
    18.
    Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27. doi:10.1097/00005650-199801000-00004
    19.
    Healthcare Cost and Utilization Project (HCUP). Creation of Elixhauser comorbidity index scores: comorbidity index, version 1.0. Accessed April 22, 2019. https://www.hcup-us.ahrq.gov/toolssoftware/comorbidity/comindex2012-2015.txt
    20.
    Moore BJ, White S, Washington R, Coenen N, Elixhauser A. Identifying increased risk of readmission and in-hospital mortality using hospital administrative data: the AHRQ Elixhauser comorbidity index. Med Care. 2017;55(7):698-705. doi:10.1097/MLR.0000000000000735
    21.
    Filardo G, Pollock BD, Edgerton J. Categorizing body mass index biases assessment of the association with post-coronary artery bypass graft mortality. Eur J Cardiothorac Surg. 2017;52(5):924-929. doi:10.1093/ejcts/ezx138
    22.
    Walkey AJ, Barnato AE, Wiener RS, Nallamothu BK. Accounting for patient preferences regarding life-sustaining treatment in evaluations of medical effectiveness and quality. Am J Respir Crit Care Med. 2017;196(8):958-963. doi:10.1164/rccm.201701-0165CP
    23.
    Walkey AJ, Weinberg J, Wiener RS, Cooke CR, Lindenauer PK. Association of do-not-resuscitate orders and hospital mortality rate among patients with pneumonia. JAMA Intern Med. 2016;176(1):97-104. doi:10.1001/jamainternmed.2015.6324
    24.
    Goldman LE, Chu PW, Osmond D, Bindman A. Accuracy of do not resuscitate (DNR) in administrative data. Med Care Res Rev. 2013;70(1):98-112. doi:10.1177/1077558712458455
    25.
    Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with heart failure. Circulation. 2006;113(13):1693-1701. doi:10.1161/CIRCULATIONAHA.105.611194
    26.
    Giannopoulos GA, Merriman LR, Rumsey A, Zwiebel DS. Malnutrition coding 101: financial impact and more. Nutr Clin Pract. 2013;28(6):698-709. doi:10.1177/0884533613508788