Figure 1. Mean surgical process compliance, 2005-2006. Average hospital compliance with Surgical Care Improvement Project measures reported on the Centers for Medicare and Medicaid Services Hospital Compare Web site.

Figure 2. Risk-adjusted mortality rate, venous thromboembolism, and surgical site infection by Surgical Care Improvement Project process compliance, 2005-2006. Medicare patient risk-adjusted outcome rates at hospitals in the lowest, middle, and highest quintiles of surgical compliance reported annually on the Hospital Compare Web site for 2005 and 2006. Outcomes are for patients undergoing the following 6 high-risk surgical procedures: abdominal aortic aneurysm repair, aortic valve repair, coronary artery bypass graft, esophageal resection, mitral valve repair, and pancreatic resection.

Table 1. Hospital Characteristics and Surgical Process Volume

Table 2. Surgical Patient Characteristics

Table 3. Odds of Risk-Adjusted Surgical Mortality Rate in High and Low SCIP Compliant Hospitals, 2005-2006

Table 4. Odds of Risk-Adjusted Venous Thromboembolism and Surgical Site Infection in High and Low SCIP Compliant Hospitals, 2006
Original Article
October 18, 2010

Hospital Process Compliance and Surgical Outcomes in Medicare Beneficiaries

Author Affiliations: Institute for Social Research (Dr Nicholas) and Department of Surgery, University of Michigan (Drs Osborne, Birkmeyer, and Dimick), Michigan Surgical Collaborative for Outcomes Research and Evaluation, Ann Arbor.

Arch Surg. 2010;145(10):999-1004. doi:10.1001/archsurg.2010.191
Abstract

Objectives  To determine whether high rates of compliance with perioperative processes of care used for public reporting and pay-for-performance are associated with lower rates of risk-adjusted mortality and high-risk surgical complications.

Design  Retrospective analysis of Medicare inpatient claims data (from January 1, 2005, through December 31, 2006). Hierarchical logistic regression models assessed the relationship between adverse outcomes and hospital compliance with the surgical processes of care reported on the Hospital Compare Web site.

Setting  Two thousand US hospitals.

Participants  Beneficiaries who underwent 1 of 6 high-risk operations in 2005 and 2006.

Main Outcome Measures  Thirty-day postoperative mortality rate, venous thromboembolism, and surgical site infection.

Results  Process compliance ranged from 53.7% in low compliance hospitals to 91.4% in high compliance hospitals. Risk-adjusted outcomes did not vary at high compliance hospitals relative to medium compliance hospitals for mortality rate (odds ratio, 0.98; 95% confidence interval, 0.92-1.05), surgical site infection (1.01; 0.90-1.13), or venous thromboembolism (1.04; 0.89-1.20). Outcomes also did not vary at low compliance hospitals. Stratified analyses by operation type confirm these trends for the 6 procedures individually.

Conclusions  Currently available information on the Hospital Compare Web site will not help patients identify hospitals with better outcomes for high-risk surgery. The Centers for Medicare and Medicaid Services needs to identify higher leverage process measures and devote greater attention to profiling hospitals based on outcomes to improve public reporting and pay-for-performance efforts.

As variations in surgical quality are increasingly observed, payers are escalating efforts to reduce them.1-3 The Centers for Medicare and Medicaid Services (CMS), the largest public payer, now mandates public reporting of 2 sets of Surgical Care Improvement Project (SCIP) measures covering infection and venous thromboembolism. To receive annual Medicare payment updates, hospitals must submit these data quarterly; the data are posted on the Hospital Compare Web site (http://www.hospitalcompare.hhs.gov/).4 This reporting is believed to aid patients and payers in choosing high-quality hospitals and to stimulate quality improvement among reporting hospitals.5-7

It is unclear whether these efforts will translate into better outcomes for surgical patients. Although the SCIP measures were selected because of strong evidence linking them to certain outcomes, there is reason to be skeptical that improved compliance will result in significant improvements in the most important outcome, risk-adjusted mortality rate. Namely, SCIP processes are associated with outcomes that are rare (eg, deep venous thrombosis and pulmonary embolism) or considered secondary (eg, superficial surgical site infections).8,9 It is unknown whether measured processes of care are important determinants of surgical outcomes. If there is a weak link between process compliance and surgical outcomes, CMS public reporting and pay-for-performance efforts will be unlikely to stimulate important improvements or to help patients find the safest hospitals.10,11

In this context, we sought to determine whether hospital compliance rates for targeted surgical processes of care reported on the Hospital Compare Web site are related to risk-adjusted mortality rate, venous thromboembolism, and surgical site infection. We used national Medicare claims data to focus on patient outcomes after 6 high-risk surgical procedures.

Methods
Hospital compare data

The Hospital Compare Web site reports of medical care process compliance have been described previously.12 Hospitals began reporting 2 SCIP measures in 2005: the rate of prophylactic antibiotic receipt within 2 hours of surgery and the rate of prophylactic antibiotic therapy discontinuation within 24 hours of surgery. Three additional measures were added in the 2006 reports: rate of correct antibiotic administration to prevent infection, recommended venous thrombosis prophylaxis ordered, and recommended venous thrombosis prophylaxis ordered within 24 hours of surgery.13 Hospitals report the number of patients eligible for each process and the percentage receiving each process. The SCIP measures are collected across a broad range of procedures, including cardiac, orthopedic, vascular, general, and gynecologic surgery.14

We obtained archived Hospital Compare data covering the period from January 1, 2005, through December 31, 2006. Data are posted with a 9-month lag.15 To assess hospital compliance, we calculated an opportunity score for each year of Hospital Compare data: the number of times a hospital delivered a recommended process to an eligible patient, divided by the total number of eligible opportunities, across as many as 5 SCIP measures (2 infection measures introduced in 2005; 3 infection and 2 venous thromboembolism measures collected in 2006). Hospitals were classified into quintiles of the composite process compliance score. In sensitivity analyses, we examined infection and venous thromboembolism scores separately.
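For readers who want to reproduce the composite from the public files, the sketch below shows the arithmetic under stated assumptions. It is a minimal illustration, not the authors' code; the file name and column names (hospital_id, measure, eligible, compliant) are hypothetical.

```python
import pandas as pd

# Hypothetical long-format Hospital Compare extract: one row per hospital-measure,
# with counts of eligible patients and patients who received the process.
scip = pd.read_csv("hospital_compare_scip_2006.csv")  # assumed columns: hospital_id, measure, eligible, compliant

# Composite opportunity score: compliant opportunities / eligible opportunities,
# pooled across the (up to 5) SCIP measures each hospital reports.
by_hosp = scip.groupby("hospital_id")[["eligible", "compliant"]].sum()
by_hosp["opportunity_score"] = by_hosp["compliant"] / by_hosp["eligible"]

# Classify hospitals into quintiles of the composite compliance score
# (1 = lowest compliance, 5 = highest compliance).
by_hosp["compliance_quintile"] = pd.qcut(
    by_hosp["opportunity_score"], q=5, labels=[1, 2, 3, 4, 5]
)
```

The sensitivity analyses that score infection and venous thromboembolism measures separately follow the same pattern, restricting the numerator and denominator to the relevant subset of measures.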

Of the total hospitals, 39.8% (performing 43.3% of operations) reported SCIP compliance in 2005. Nonreporting hospitals had significantly lower annual procedure volume (78 vs 90) and were more likely to be government owned or nonprofit. No difference was found in nonsurgical composite compliance rates across reporters and nonreporters (97.0% of SCIP nonreporters reported other measures). By 2006, virtually all hospitals performing study procedures reported SCIP compliance. Only 15 hospitals performing 142 procedures did not report.

Medicare inpatient data

We identified all Medicare fee-for-service hospitalizations for 6 high-risk surgical procedures in the 100% Medicare Provider Analysis and Review data set from January 1, 2005, through December 31, 2006. Eligible admissions include abdominal aortic aneurysm repair, aortic valve repair, coronary artery bypass graft, esophageal resection, mitral valve repair, and pancreatic resection. These hospitalizations are well suited to our study because they are sufficiently common and high risk to reveal variation in surgical mortality and complication rates across hospitals.

During the study period, 325 052 fee-for-service Medicare beneficiaries aged 65 to 99 underwent 1 of the included procedures at 2189 hospitals nationwide. Contemporaneous SCIP compliance data are available for 229 665 admissions at 2038 hospitals. We identified 3 focal surgical outcomes in the Medicare data: 30-day postoperative mortality rate, postoperative deep venous thrombosis or pulmonary embolism, and postoperative surgical site infection.

Statistical analysis

We estimated risk-adjusted surgical outcomes overall and for each procedure using hierarchical logistic regression models. The models include hospital-level indicators of quality known to relate to patient outcomes (procedure volume and indicators for the highest and lowest SCIP compliance quintiles) and patient characteristics: age, race, sex, severity of comorbid conditions classified using the Charlson comorbidity index,16,17 patient zip code median income from the 2000 US Census, whether the admission was scheduled rather than emergent or urgent, and year of admission. Hospital random effects account for clustering of patients within hospitals.
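As one way to make this specification concrete, the sketch below fits a simplified version of the model in Python. It is a hedged illustration, not the authors' code: it substitutes an ordinary logistic regression with hospital-clustered standard errors for the hierarchical random-effects model, and every file and variable name (scip_analytic_file.csv, died_30day, highest_quintile, lowest_quintile, log_volume, charlson_score, and so on) is an assumption.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file: one row per admission, already merged with the
# hospital's compliance-quintile indicators and annual procedure volume.
df = pd.read_csv("scip_analytic_file.csv")

# Simplified stand-in for the paper's hierarchical model: a plain logistic
# regression with standard errors clustered on hospital rather than
# hospital random intercepts.
fit = smf.logit(
    "died_30day ~ highest_quintile + lowest_quintile + log_volume"
    " + age + female + C(race) + charlson_score + zip_median_income"
    " + elective_admission + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})
print(fit.summary())
```

A closer analogue to the published analysis would add a hospital-level random intercept (eg, via a mixed-effects logistic model); the clustered-error version is shown here only to keep the sketch short.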

The base analysis examines the relationship between contemporaneous surgical process compliance and mortality rate. We also examined lagged SCIP compliance and mortality rate, testing whether the historical information posted on the Hospital Compare Web site gives patients useful information about their risk of an adverse surgical event.

We conducted additional analyses using only the 2006 data, for which a more comprehensive set of SCIP measures was available. In these data, we examined whether the reported measures inform patients' risk of the outcomes the SCIP measures target: venous thromboembolism and postoperative surgical site infection.

Results

Mean surgical compliance rates varied considerably, ranging from 53.7% in hospitals in the lowest compliance quintile to 91.4% in hospitals in the highest compliance quintile (Figure 1). Hospitals in the lowest quintile of process compliance were less likely to be accredited or to have an emergency department (Table 1). These hospitals also had lower rates of nonsurgical process compliance and lower surgical volume.

Unadjusted 30-day mortality varied from 3.9% for abdominal aortic aneurysm repair to 11.3% for mitral valve replacement. Little variation was seen in patient characteristics across levels of SCIP compliance, although hospitals in the lowest compliance quintile consistently served Medicare beneficiaries from lower-income zip codes (Table 2).

We found little evidence of a consistent relationship between hospital compliance with processes of care and operative mortality rate (Figure 2). In univariate analysis, mortality rates in the lowest compliance hospitals were statistically indistinguishable from those in the highest quintile of compliance for all procedures studied except aortic valve replacement, in which the highest compliance hospitals had lower mortality rates. Hospitals that did not report SCIP compliance had rates of risk-adjusted mortality similar to those in the highest quintile of SCIP compliance.

In multivariate analysis, relative to middle compliance hospitals, risk-adjusted mortality rates did not differ at the lowest compliance hospitals (odds ratio [OR], 1.06; 95% confidence interval [CI], 0.97-1.16) or the highest compliance hospitals (0.98; 0.92-1.05) (Table 3). Hospital compliance with the SCIP measures accounted for only 3.3% of the hospital-level variance in mortality. Stratified analyses by operation type also failed to show a significant association between hospital process compliance and mortality rate. Prior-year SCIP compliance quintiles yielded similar inferences with wider CIs, reflecting greater statistical noise from a lagged measure and the smaller number of hospitals reporting 2005 data (Table 3).
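For readers following the code sketch from the Methods section, the odds ratios and confidence intervals reported in Table 3 are simply the exponentiated logit coefficients. A minimal continuation, assuming the hypothetical fit object and variable names from the earlier block:

```python
import numpy as np
import pandas as pd

# Exponentiate coefficients and confidence limits to obtain odds ratios,
# eg, for the highest- and lowest-compliance-quintile indicators.
ci = fit.conf_int()  # columns 0 and 1 hold the lower and upper 95% limits
or_table = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
})
print(or_table.loc[["highest_quintile", "lowest_quintile"]].round(2))
```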

Unadjusted complication rates were lower among hospitals in the lowest quintile of SCIP compliance than among those in the highest compliance quintile, both for deep venous thrombosis and pulmonary embolism (low compliance, 0.43%; high compliance, 0.59%) and for infection (low compliance, 1.1%; high compliance, 1.9%).
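These unadjusted rates can be tabulated directly from the admission-level file. A short sketch, again using hypothetical column names (a compliance_quintile column coded 1-5 plus 0/1 indicator columns for each outcome):

```python
# Unadjusted outcome rates (percent of admissions) by compliance quintile.
# Column names are assumptions: vte, ssi, and died_30day are 0/1 indicators.
unadjusted = (
    df.groupby("compliance_quintile")[["vte", "ssi", "died_30day"]]
      .mean()
      .mul(100)
      .round(2)
)
print(unadjusted)
```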

In multivariate analysis, however, these differences did not persist (Table 4): we found no significant relationship between compliance quintile and risk of venous thromboembolism (highest compliance OR, 1.04; 95% CI, 0.89-1.20; lowest compliance OR, 0.93; 0.73-1.20) or infection (highest compliance OR, 1.01; 0.90-1.13; lowest compliance OR, 0.96; 0.80-1.16).

We conducted several additional analyses to test the robustness of these findings. Results were unchanged when we replaced the SCIP composites with outcome-specific measures: risk of infection did not vary with hospital compliance on the SCIP infection measures, and risk of venous thromboembolism did not vary significantly with use of venous thromboembolism prophylaxis. We also eliminated hospitals in the middle 3 quintiles of compliance and directly compared the highest compliance hospitals with the lowest compliance hospitals. There was no difference in risk-adjusted mortality (OR, 0.88; 95% CI, 0.78-1.01), venous thromboembolism (1.12; 0.85-1.48), or surgical site infection (1.04; 0.85-1.28) at the highest compliance hospitals compared with the lowest compliance hospitals.
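The head-to-head comparison of the extreme quintiles amounts to subsetting the analytic file and refitting the same model. A hedged sketch, reusing the hypothetical names from the earlier blocks:

```python
# Keep only the lowest (1) and highest (5) compliance quintiles and compare
# them directly with the same simplified clustered-SE logistic regression.
extremes = df[df["compliance_quintile"].isin([1, 5])].copy()
extremes["highest_vs_lowest"] = (extremes["compliance_quintile"] == 5).astype(int)

head_to_head = smf.logit(
    "died_30day ~ highest_vs_lowest + log_volume + age + female + C(race)"
    " + charlson_score + zip_median_income + elective_admission + C(year)",
    data=extremes,
).fit(cov_type="cluster", cov_kwds={"groups": extremes["hospital_id"]})
```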

We also considered extended length of stay, which could result from a number of postoperative complications. Extended hospital stays, defined as stays in the highest quintile of procedure-specific inpatient days, were less likely at more compliant hospitals. Patients at the highest compliance hospitals had 12.0% lower odds of an extended stay than patients at middle compliance hospitals (OR, 0.88; 95% CI, 0.81-0.94), although no difference was found between the lowest and middle compliance hospitals (1.05; 0.95-1.17).
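Constructing the extended-stay indicator is a simple per-procedure percentile cut. A minimal sketch under the same hypothetical column names (procedure and length_of_stay are assumptions):

```python
# Flag extended stays: length of stay in the top quintile (above the 80th
# percentile) of inpatient days for that specific procedure.
cutoff = df.groupby("procedure")["length_of_stay"].transform(
    lambda s: s.quantile(0.80)
)
df["extended_stay"] = (df["length_of_stay"] > cutoff).astype(int)

# The same regression specification as above can then be refit with
# extended_stay as the outcome in place of died_30day.
```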

Comment

The risk of patient death and the higher Medicare costs associated with adverse surgical events underscore the importance of providing beneficiaries with information that facilitates the choice of a high-performing hospital. There is a clear business case for steering surgical patients toward high-quality hospitals. Although compliance with surgical process measures varies widely, we find little evidence that SCIP measures reliably correlate with risk-adjusted surgical outcomes. As a result, patients who choose a hospital based on high rates of process compliance will not improve their chances of survival or of avoiding complications. They will, however, reduce their risk of an extended hospital stay.

In contrast to our findings for surgery, previous research by Werner and colleagues18,19 has found mixed evidence of a relationship between process compliance and mortality rate. These authors have shown that medical process compliance rates reported on the Hospital Compare Web site, particularly those for acute myocardial infarction care, are related to inpatient mortality rate and reflect other dimensions of quality. Similar to our own results, however, medical process compliance explains little of the variation in mortality rate.

Our study has several important limitations, primarily related to our use of administrative Medicare data. Hospitals are required to report process measures for all eligible admissions, not just Medicare admissions. Using all-payer data, we have generally found that hospitals' Medicare surgical mortality rates are highly correlated with their overall surgical mortality rates. It is also possible that we failed to find a relationship between process compliance and surgical outcomes because of insufficient sample size, because relatively few of the high-risk procedures we examined are performed at low compliance hospitals. Our findings were, however, robust to alternative categorizations of compliance and across multiple measures of adverse outcomes.

Our study also faces the well-known limitation of risk adjustment with administrative data. However, this will bias our results only if unobserved patient acuity is systematically related to hospital process compliance, which seems unlikely, especially because surgical process compliance rates were not generally known before their reporting on the Hospital Compare Web site during our study period. Furthermore, for process compliance rates to be meaningful indicators of hospital quality, their relationship to patient outcomes should not depend heavily on complex risk adjustment techniques, which are inaccessible to most Hospital Compare Web site users.

There are several reasons why public reporting based on SCIP measures may be inadequate to differentiate quality of inpatient surgical care. The SCIP measures are low leverage because they relate to secondary and relatively less important outcomes. Even when processes are tied to an important outcome such as pulmonary embolism, these events are rare and offer insufficient variation to differentiate between high- and low-quality hospitals.

Unlike other surgical quality measures, such as procedure volume and inpatient mortality rate, which can be calculated from existing administrative data, process measures impose additional reporting and compliance costs on hospitals. It is important to determine whether tracking these measures provides useful information relative to the cost of data collection. If not, the CMS and other payers and policy makers targeting surgical compliance measures may want to consider other reporting options. Direct reporting of surgical outcomes would be an alternative. The Leapfrog Group, a large coalition of health care purchasers, encourages use of high-quality hospitals based on a composite measure of mortality rate and volume for several high-risk surgical procedures.

Despite the intentions of the CMS to provide patients with information that will facilitate patient choice of high-quality hospitals, currently available information on the Hospital Compare Web site will not help patients identify hospitals with better outcomes for high-risk surgery. The CMS needs to identify higher leverage process measures and devote greater attention to profiling hospitals based on outcomes for improved public reporting and pay-for-performance programs. Future research should ascertain whether process measures become more useful as indicators of surgical quality as public reporting programs mature.

Article Information

Correspondence: Lauren H. Nicholas, PhD, Institute for Social Research, 426 Thompson St, Ann Arbor, MI 48104 (lnichola@umich.edu).

Accepted for Publication: September 2, 2009.

Author Contributions: Study concept and design: Nicholas, Osborne, Birkmeyer, and Dimick. Acquisition of data: Osborne and Dimick. Analysis and interpretation of data: Nicholas, Osborne, Birkmeyer, and Dimick. Drafting of the manuscript: Nicholas, Birkmeyer, and Dimick. Critical revision of the manuscript for important intellectual content: Osborne, Birkmeyer, and Dimick. Statistical analysis: Nicholas and Osborne. Administrative, technical, and material support: Birkmeyer. Study supervision: Birkmeyer and Dimick.

Funding/Support: This study was supported by training grant AG000221-17 from the National Institute on Aging to Dr Nicholas, a training grant from the Robert Wood Johnson Foundation to Dr Osborne, grant R01 CA098481 from the National Cancer Institute to Dr Birkmeyer, and career development award K08 HS017765 from the Agency for Healthcare Research and Quality to Dr Dimick.

References
1. Khuri SF, Daley J, Henderson W, et al. Risk adjustment of the postoperative mortality rate for the comparative assessment of the quality of surgical care: results of the National Veterans Affairs Surgical Risk Study. J Am Coll Surg. 1997;185(4):315-327.
2. Birkmeyer JD, Siewers AE, Finlayson EVA, et al. Hospital volume and surgical mortality in the United States. N Engl J Med. 2002;346(15):1128-1137.
3. Birkmeyer JD, Stukel TA, Siewers AE, Goodney PP, Wennberg DE, Lucas FL. Surgeon volume and operative mortality in the United States. N Engl J Med. 2003;349(22):2117-2127.
4. Centers for Medicare and Medicaid Services. Reporting Hospital Quality Data for Annual Payment Update. US Department of Health and Human Services Web site. http://www.cms.hhs.gov/HospitalQualityInits/08_HospitalRHQDAPU.asp#TopOfPage. Accessed January 9, 2009.
5. Jha AK, Li Z, Orav EJ, Epstein AM. Care in U.S. hospitals–the Hospital Quality Alliance program. N Engl J Med. 2005;353(3):265-274.
6. Marshall MN, Shekelle PG, Leatherman S, Brook RH. The public release of performance data: what do we expect to gain? A review of the evidence. JAMA. 2000;283(14):1866-1874.
7. Chassin MR. Achieving and sustaining improved quality: lessons from New York State and cardiac surgery. Health Aff (Millwood). 2002;21(4):40-51.
8. Bratzler DW, Houck PM, Richards C, et al. Use of antimicrobial prophylaxis for major surgery: baseline results from the National Surgical Infection Prevention Project. Arch Surg. 2005;140(2):174-182.
9. Bratzler DW, Houck PM; Surgical Infection Prevention Guideline Writers Workgroup. Antimicrobial prophylaxis for surgery: an advisory statement from the National Surgical Infection Prevention Project. Am J Surg. 2005;189(4):395-404.
10. Rubin HR, Pronovost P, Diette GB. The advantages and disadvantages of process-based measures of health care quality. Int J Qual Health Care. 2001;13(6):469-474.
11. Pronovost PJ, Miller M, Wachter RM. The GAAP in quality measurement and reporting. JAMA. 2007;298(15):1800-1802.
12. Werner RM, Bradlow ET, Orav EJ, Epstein AM. Relationship between Medicare's hospital compare performance measures and mortality rates. JAMA. 2006;296(22):2694-2702.
13. Hospital Quality Alliance. 2004-2007 Measure Build Out Table. US Department of Health and Human Services Web site. http://www.cms.hhs.gov/HospitalQualityInits/Downloads/HospitalHQA2004_2007200512.pdf. Accessed January 13, 2009.
14. Centers for Medicare & Medicaid Services and the Joint Commission. Specifications Manual for National Hospital Inpatient Quality Measures (Specifications Manual). http://www.qualitynet.org/dcs/ContentServer?c=Page&pagename=QnetPublic%2FPage%2FQnetTier2&cid=1141662756099. Accessed February 6, 2009.
15. Centers for Medicare & Medicaid Services. Hospital Compare downloadable database: years 2005-2008 archive. 2006. US Department of Health and Human Services Web site. http://www.cms.hhs.gov/HospitalQualityInits/11_HospitalCompare.asp#TopOfPage. Accessed November 23, 2008.
16. Charlson ME, Pompei P, Ales KL, MacKenzie CR. A new method of classifying prognostic comorbidity in longitudinal studies: development and validation. J Chronic Dis. 1987;40(5):373-383.
17. Quan H, Sundararajan V, Halfon P, et al. Coding algorithms for defining comorbidities in ICD-9-CM and ICD-10 administrative data. Med Care. 2005;43(11):1130-1139.
18. Werner RM, Bradlow ET, Asch DA. Does hospital performance on process measures directly measure high quality care or is it a marker of unmeasured care? Health Serv Res. 2008;43(5, pt 1):1464-1484.
19. Werner RM, Bradlow ET. Relationship between Medicare's hospital compare performance measures and mortality rates. JAMA. 2006;296(22):2694-2702.