Original Investigation
July 28, 2015

Hospital Characteristics Associated With Penalties in the Centers for Medicare & Medicaid Services Hospital-Acquired Condition Reduction Program

Author Affiliations
  • 1. Surgical Outcomes and Quality Improvement Center (SOQIC), Department of Surgery, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
  • 2. Center for Healthcare Studies in the Institute for Public Health and Medicine, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
  • 3. Department of Surgery, Massachusetts General Hospital, Boston
  • 4. Division of Quality, Northwestern Memorial HealthCare, Chicago, Illinois
  • 5. Department of Surgery, Henry Ford Hospital, Detroit, Michigan
JAMA. 2015;314(4):375-383. doi:10.1001/jama.2015.8609
Abstract

Importance  In fiscal year (FY) 2015, the Centers for Medicare & Medicaid Services (CMS) instituted the Hospital-Acquired Condition (HAC) Reduction Program, which reduces payments to the lowest-performing hospitals. However, it is uncertain whether this program accurately measures quality and fairly penalizes hospitals.

Objective  To examine the characteristics of hospitals penalized by the HAC Reduction Program and to evaluate the association of a summary score of hospital characteristics related to quality with penalization in the HAC program.

Design, Setting, and Participants  Data for hospitals participating in the FY2015 HAC Reduction Program were obtained from CMS’ Hospital Compare and merged with the 2014 American Hospital Association Annual Survey and FY2015 Medicare Impact File. Logistic regression models were developed to examine the association between hospital characteristics and HAC program penalization. An 8-point hospital quality summary score was created using hospital characteristics related to volume, accreditations, and offering of advanced care services. The relationship between the hospital quality summary score and HAC program penalization was examined. Publicly reported process-of-care and outcome measures were examined from 4 clinical areas (surgery, acute myocardial infarction, heart failure, pneumonia), and their association with the hospital quality summary score was evaluated.
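
To illustrate the kind of analysis described above, the following is a minimal sketch in Python (pandas and statsmodels), not the authors' actual code: the file name, column names, and the eight quality-score flags are hypothetical placeholders standing in for the merged Hospital Compare, American Hospital Association, and Medicare Impact File variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical merged hospital-level file; the file name and all column
# names are illustrative, not the study's actual variables.
df = pd.read_csv("hac_hospitals_fy2015.csv")

# Multivariable logistic regression of penalization (0/1) on hospital
# characteristics, in the spirit of the models described above.
model = smf.logit(
    "penalized ~ C(joint_commission) + C(teaching_status)"
    " + C(case_mix_quartile) + C(safety_net)",
    data=df,
).fit()

# Odds ratios and 95% CIs derived from the fitted coefficients.
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1))

# Illustrative 8-point quality summary score: one point per quality-related
# characteristic (volume, accreditations, advanced care services), followed
# by the observed penalization rate at each score level.
quality_flags = [
    "high_volume", "joint_commission", "cancer_program", "nsqip_member",
    "transplant_services", "level1_trauma", "major_teaching", "advanced_imaging",
]
df["quality_score"] = df[quality_flags].astype(int).sum(axis=1)
print(df.groupby("quality_score")["penalized"].mean())
```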

Exposures  Hospital characteristics.

Main Outcomes and Measures  Penalization in the FY2015 HAC Reduction Program.

Results  Of the 3284 hospitals participating in the HAC program, 721 (22.0%) were penalized. Hospitals were more likely to be penalized if they were accredited by the Joint Commission (24.0% accredited, 14.4% not accredited; odds ratio [OR], 1.33; 95% CI, 1.04-1.70); they were major teaching hospitals (42.3%; OR, 1.58; 95% CI, 1.09-2.29) or very major teaching hospitals (62.2%; OR, 2.61; 95% CI, 1.55-4.39; vs nonteaching hospitals, 17.0%); they cared for more complex patient populations based on case mix index (quartile 4 vs quartile 1: 32.8% vs 12.1%; OR, 1.98; 95% CI, 1.44-2.71); or they were safety-net hospitals vs non–safety-net hospitals (28.3% vs 19.9%; OR, 1.36; 95% CI, 1.11-1.68). Hospitals with higher hospital quality summary scores had significantly better performance on 9 of 10 publicly reported process and outcomes measures compared with hospitals that had lower quality scores (all P ≤ .01 for trend). However, hospitals with the highest quality score of 8 were penalized significantly more frequently than hospitals with the lowest quality score of 0 (67.3% [37/55] vs 12.6% [53/422]; P < .001 for trend).
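
For context, a crude odds ratio can be computed directly from the reported penalization rates; the brief Python sketch below uses the safety-net comparison (28.3% vs 19.9%). The OR of 1.36 reported above presumably comes from the multivariable models, which is why it differs from this unadjusted figure.

```python
# Crude (unadjusted) odds ratio implied by the reported penalization rates
# for safety-net vs non-safety-net hospitals; the adjusted OR of 1.36 in
# the abstract is expected to differ from this crude value.
p_safety_net, p_other = 0.283, 0.199
crude_or = (p_safety_net / (1 - p_safety_net)) / (p_other / (1 - p_other))
print(round(crude_or, 2))  # ~1.59
```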

Conclusions and Relevance  Among hospitals participating in the HAC Reduction Program, penalized hospitals were more likely to hold quality accreditations, offer advanced services, be major teaching institutions, and perform better on other publicly reported process and outcome measures. These paradoxical findings suggest that the approach for assessing hospital penalties in the HAC Reduction Program merits reconsideration to ensure it is achieving the intended goals.
