Figure 1. RSMRs for fee-for-service Medicare beneficiaries admitted for AMI, HF, and PNE, stratified by hospital location in the US states or the US territories. The upper boundaries of the boxes represent the 75th percentile; the black horizontal line within each box, the median or 50th percentile; and the lower boundaries of the boxes, the 25th percentile. AMI indicates acute myocardial infarction; HF, heart failure; PNE, pneumonia; and RSMRs, 30-day risk-standardized rates for all-cause mortality.

Figure 2. RSRRs for fee-for-service Medicare beneficiaries admitted for AMI, HF, and PNE, stratified by hospital location in the US states or the US territories. The upper boundaries of the boxes represent the 75th percentile; the black horizontal line within each box, the median or 50th percentile; and the lower boundaries of the boxes, the 25th percentile. AMI indicates acute myocardial infarction; HF, heart failure; PNE, pneumonia; and RSRRs, 30-day risk-standardized rates for all-cause readmission.

Figure 3. State-level and territorial-level mean RSMRs and RSRRs for fee-for-service Medicare beneficiaries admitted for AMI, presented by performance quintile. Quintiles were determined for each outcome measure. The fifth quintile represents the poorest performing states and/or territories on average; the first quintile, the best performing states and/or territories on average. AMI indicates acute myocardial infarction; RSMRs, 30-day risk-standardized rates for all-cause mortality; and RSRRs, 30-day risk-standardized rates for all-cause readmission.

Figure 4. State-level and territorial-level mean RSMRs and RSRRs for fee-for-service Medicare beneficiaries admitted for HF, presented by performance quintile. Quintiles were determined for each outcome measure. The fifth quintile represents the poorest performing states and/or territories on average; the first quintile, the best performing states and/or territories on average. HF indicates heart failure; RSMRs, 30-day risk-standardized rates for all-cause mortality; and RSRRs, 30-day risk-standardized rates for all-cause readmission.

Figure 5. State-level and territorial-level mean RSMRs and RSRRs for fee-for-service Medicare beneficiaries admitted for PNE, presented by performance quintile. Quintiles were determined for each outcome measure. The fifth quintile represents the poorest performing states and/or territories on average; the first quintile, the best performing states and/or territories on average. PNE indicates pneumonia; RSMRs, 30-day risk-standardized rates for all-cause mortality; and RSRRs, 30-day risk-standardized rates for all-cause readmission.

Table 1. Patient Demographic Characteristics, Cardiovascular Medical History, and Comorbid Conditions for Fee-for-Service Medicare Beneficiaries Admitted for AMI, HF, and PNE (July 2005–June 2008), Stratified by Condition and Hospital Location in the US States or in the US Territoriesᵃ
Table 2. Hospital Characteristics for Hospitals That Admitted Fee-for-Service Medicare Beneficiaries for AMI, HF, and PNE (July 2005–June 2008), Stratified by Hospital Location in the US States or in the US Territories
Table 3. Performance on Outcome Measures and Core Process Measures for Hospitals That Admitted Fee-for-Service Medicare Beneficiaries for AMI, HF, and PNE (July 2005–June 2008), Stratified by Hospital Location in the US States or in the US Territories
Table 4. Adjustedᵃ RSMR and 30-Day RSRR by Condition for Hospitals That Admitted Fee-for-Service Medicare Beneficiaries for AMI, HF, and PNE (July 2005–June 2008) in the US States and in the US Territories
Table 5. Percentage of Hospitals in Each Decile of Performanceᵃ on RSMR and RSRR Outcome Measures for Hospitals That Admitted Fee-for-Service Medicare Beneficiaries for AMI, HF, and PNE (July 2005–June 2008), Stratified by Hospital Location in the US States or in the US Territories
Original Investigation
Sep 26, 2011

Quality of Care in the US Territories

Author Affiliations: Section of General Internal Medicine (Dr Nunez-Smith), Robert Wood Johnson Clinical Scholars Program (Drs Nunez-Smith, Curry, and Krumholz), Section of Cardiovascular Medicine (Drs Herrin and Krumholz), Department of Medicine, Yale University School of Medicine, New Haven, Connecticut; Section of Health Policy and Administration, Yale University School of Public Health, New Haven (Drs Bradley and Curry); Health Research and Educational Trust, Chicago, Illinois (Dr Herrin); Division of General Internal Medicine, Department of Medicine, Montefiore Medical Center, Bronx, New York (Dr Santana); Department of Health Care Policy, Harvard Medical School, Boston, Massachusetts (Dr Normand); Department of Biostatistics, Harvard School of Public Health, Boston (Dr Normand); and Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven (Dr Krumholz).

Arch Intern Med. 2011;171(17):1528-1540. doi:10.1001/archinternmed.2011.284
Abstract

Background Health care quality in the US territories is poorly characterized. We compared the performance of hospitals in the US territories and in the US states using outcome and core process measures.

Methods Our sample included nonfederal hospitals located in the United States and its territories discharging Medicare fee-for-service (FFS) patients with a principal discharge diagnosis of acute myocardial infarction (AMI), heart failure (HF), or pneumonia (PNE) (July 2005–June 2008). We compared risk-standardized 30-day mortality and readmission rates between territorial and stateside hospitals, adjusting for performance on core process measures and hospital characteristics.

Results In 57 territorial hospitals and 4799 stateside hospitals, hospital mean 30-day risk-standardized mortality rates were significantly higher in the US territories (P < .001) for AMI (18.8% vs 16.0%), HF (12.3% vs 10.8%), and PNE (14.9% vs 11.4%). Hospital mean 30-day risk-standardized readmission rates (RSRRs) were also significantly higher in the US territories for AMI (20.6% vs 19.8%; P = .04) and PNE (19.4% vs 18.4%; P = .01) but not for HF (25.5% vs 24.5%; P = .07). The higher risk-standardized mortality rates in the US territories remained statistically significant after adjusting for hospital characteristics and core process measure performance. Hospitals in the US territories had lower performance on all core process measures (P < .05).

Conclusions Compared with hospitals in the US states, hospitals in the US territories have significantly higher 30-day mortality rates and lower performance on every core process measure for patients discharged after AMI, HF, and PNE. Eliminating the substantial quality gap in the US territories should be a national priority.

The United States has jurisdiction over several unincorporated territories, including the Commonwealth of Puerto Rico, Guam, American Samoa, the Commonwealth of the Northern Mariana Islands, and the US Virgin Islands. These US territories are home to almost 5 million residents, almost all of whom self-identify as racial/ethnic minorities.1

Despite a national commitment to eliminate health disparities, the territories are largely absent from national reports on health care equity and quality.2-4 Studies of hospital quality of care in the US typically exclude hospitals in the US territories or combine them with other US regional areas, masking potential differences in quality of care between the territories and the states.5 The recent initiatives by the Centers for Medicare & Medicaid Services (CMS) to measure hospital outcomes (ie, short-term mortality and readmission rates) and hospital processes of care (ie, performance on a set of core process measures) for 3 acute conditions—acute myocardial infarction (AMI), heart failure (HF), and pneumonia (PNE)—provide an opportunity to assess the overall quality of care in the territories.6 We sought to compare performance on the outcome and core process measures between hospitals in the US territories and those located in the US states. We also investigated whether hospital characteristics and core process measures accounted for differences in performance on outcome measures.

Methods
Study cohort

The study cohort included hospitals in the US territories and in the US states, inclusive of the District of Columbia, that discharged at least 1 Medicare fee-for-service (FFS) adult patient with a primary diagnosis of AMI, HF, or PNE between July 2005 and June 2008. Additional patient inclusion criteria included at least 12 months of continuous Medicare FFS coverage prior to the index admission in order to accurately capture patient comorbidity. We randomly selected 1 admission per year for patients with multiple admissions for the same diagnosis within any study year. Patients transferred between hospitals were assigned to the referring hospitals. Diagnosis was based on International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes (eTable).

Performance measures

We included 2 types of performance measures, outcome measures and core process measures. The outcome measures included the hospital-specific 30-day risk-standardized rates for all-cause mortality (RSMR) and for all-cause readmission (RSRR), which were developed for CMS and endorsed by the National Quality Forum.7,8 These rates are derived from CMS administrative data and have previously been demonstrated to produce estimates that are reliable approximations of models based on medical records; the detailed method for the derivation and validation of risk-standardized mortality and readmission rates included territorial data and is published elsewhere.9-14 Briefly, the risk-standardized rates are determined for each hospital and condition (AMI, HF, and PNE) by dividing the predicted number of deaths or readmissions by the expected number and multiplying this ratio by the national unadjusted 30-day mortality or 30-day readmission rate for the particular condition.
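
As a rough illustration of this final standardization step only (the CMS measures themselves use hierarchical regression models specified in the cited publications; the function and figures below are hypothetical), the ratio of predicted to expected events is scaled by the national unadjusted rate:

```python
def risk_standardized_rate(predicted_events, expected_events, national_rate):
    """Hospital risk-standardized rate = (predicted / expected) x national unadjusted rate.

    predicted_events: model-predicted 30-day deaths (or readmissions) for the hospital's
        own patients, incorporating the hospital-specific effect.
    expected_events:  deaths (or readmissions) expected for the same patients if treated
        at an average hospital.
    national_rate:    national unadjusted 30-day rate for the condition (as a fraction).
    """
    return (predicted_events / expected_events) * national_rate

# Hypothetical AMI example: 22 predicted vs 18.5 expected deaths, 16% national rate.
print(round(100 * risk_standardized_rate(22.0, 18.5, 0.16), 1))  # 19.0 (% RSMR)
```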

We also examined core process measures, which are publicly reported, evidence-based standards of care specific to each condition. Hospitals receive a financial incentive to voluntarily submit these data to the Hospital Quality Alliance; these data are reported publicly in the CMS Hospital Compare database. We used the CMS Hospital Compare database, based on medical records and administrative databases covering July 1, 2005, to June 30, 2008, for information about performance on AMI, HF, and PNE core process measures.15 Hospital performance for each outcome and process measure is determined using varying numbers of admissions within each hospital and, therefore, is based on different condition-specific patient populations. The method for the calculation of each core process measure is detailed elsewhere.16 Core process measures for AMI comprised appropriate angiotensin-converting enzyme inhibitor or angiotensin-receptor blocker prescription at discharge, smoking cessation advice, aspirin at admission, aspirin at discharge, β-blocker at discharge, percutaneous coronary intervention within 90 minutes, and fibrinolytic therapy within 30 minutes. Quality measures for HF comprised appropriate angiotensin-converting enzyme inhibitor or angiotensin-receptor blocker prescription at discharge, discharge instructions, smoking cessation advice, and assessment of left ventricular ejection fraction. Quality measures for PNE comprised influenza and pneumococcal vaccination, initial antibiotics within 6 hours, oxygenation assessment at admission, smoking cessation advice, most appropriate initial antibiotic, and blood culture in the emergency department.

Territories

The US territories included the 4 organized insular areas of the United States: Puerto Rico, Guam, the Northern Mariana Islands, and the US Virgin Islands. An organized territory differs from a state in that it is allowed relatively limited self-government under Organic Acts (formal congressional legislation to organize local government); ultimate authority is held by the US Congress, not the territorial government.17 Residents in these 4 territories are US citizens by birth. We also included American Samoa, an unorganized insular area of the United States. American Samoa is technically unorganized because it does not have an Organic Act despite having limited self-government; its residents are US nationals by birth and must apply for US citizenship. We excluded territories that do not have civilian hospitals because they are uninhabited or inhabited only by military personnel or a small number of people with caretaking responsibilities for the island.

The US territories are largely viewed as 1 collective for the purpose of federal government policies but differ in population size and composition.18 American Samoa is the smallest territory by population with approximately 57 000 residents, most of whom (92%) self-identify as native Hawaiian and other Pacific Islander. Residents of the Northern Mariana Islands (population, n = 69 000) primarily identify as Asian (55%), specifically Chinese or Filipino, or native Hawaiian and other Pacific Islander (32%). Guam (population, n = 155 000) residents, similarly, identify as Asian (33%) or native Hawaiian and other Pacific Islander (45%). The US Virgin Islands has a population of 108 000, and 76% of residents identify as black or African American. Puerto Rico is the largest territory, with a population of 3.9 million; most people who reside in Puerto Rico identify as white (81%) and Hispanic (99%).

Hospital characteristics

Hospital characteristics were derived from the 2007 American Hospital Association (AHA) Survey and selected based on findings from earlier work.19-22 We assessed the number of total staffed beds (≤50, 51-100, 101-200, 201-300, and >300), ownership type (government owned, private not-for-profit, and private for-profit), cardiac facilities (no catheterization laboratory, catheterization laboratory but no open heart surgery, open heart surgery, unknown cardiac facilities), and The Joint Commission accreditation status (yes or no). We used CMS data to calculate the Medicare diagnosis-specific 3-year volume using the number of patients reported for each of the 3 mortality measures, categorized as quintiles of discharges (≤15, 16-45, 46-150, 151-480, and >480).
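
For illustration only, the volume categorization described above can be expressed as a simple binning step; the data frame and column names below are hypothetical and do not come from the study data set.

```python
import pandas as pd

# Hypothetical hospital-level table of 3-year condition-specific Medicare volume.
hospitals = pd.DataFrame({"hospital_id": [1, 2, 3, 4, 5],
                          "ami_volume_3yr": [12, 40, 120, 300, 900]})

# Discharge-volume categories reported in the text: <=15, 16-45, 46-150, 151-480, >480.
bins = [0, 15, 45, 150, 480, float("inf")]
labels = ["<=15", "16-45", "46-150", "151-480", ">480"]
hospitals["volume_category"] = pd.cut(hospitals["ami_volume_3yr"], bins=bins, labels=labels)
print(hospitals)
```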

Statistical analysis

We described the distribution of patient-level characteristics including medical history and co-existing conditions across the cohort stratified by condition and location in the US states or in the US territories. In contrast, the hospital was the unit of analysis for all subsequent analyses. We compared hospital characteristics between hospitals located in the territories and those in the states using χ2 tests among the subset of hospitals we could match with the AHA survey data. We compared core process measures and 30-day RSMRs and 30-day RSRRs for all reporting hospitals in the territories and in the states using linear regression, weighted for the hospital 3-year volume of the corresponding outcome measure (RSMR or RSRR), including an indicator for either all territories combined or separate indicators for Puerto Rico and the other territories; we report the corresponding Wald test statistic. Our early analyses demonstrated we could combine these 3 years of data because differences by territorial or stateside location did not vary significantly by year. We used box and whisker plots to illustrate the differences in RSMRs and RSRRs between hospitals in the territories and hospitals in the states.
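
The analyses were run in SAS and Stata (see the software statement below); purely as an illustrative sketch of the volume-weighted comparison, an analogous model in Python with simulated data might look like the following, regressing RSMR on a territory indicator with weights equal to 3-year volume and testing that indicator with a Wald test.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated hospital-level data: RSMR (%), a territory indicator, and 3-year volume.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"territory": rng.binomial(1, 0.05, n),
                   "volume": rng.integers(20, 1000, n)})
df["rsmr"] = 16.0 + 2.8 * df["territory"] + rng.normal(0, 2.0, n)

# Linear regression of RSMR on territory status, weighted by 3-year volume.
model = smf.wls("rsmr ~ territory", data=df, weights=df["volume"]).fit()
print(model.params)

# Wald test for the territory coefficient (the test statistic reported in the text).
print(model.wald_test("territory = 0", use_f=True))
```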

We next used this AHA-matched group in multivariable analyses. We estimated similarly weighted linear regression models using mean RSMRs and RSRRs for hospitals in the AHA-matched group as our dependent variables and territory status as the independent variable; we included hospital core process measures and hospital structural characteristics as covariates. We included hospital core process measures in the final regression models to assess whether differences in process measure performance could account for any observed differences in the outcomes of interest; we included hospital structural characteristics based on previously published work.19-23 We excluded percutaneous coronary intervention within 90 minutes from multivariable analyses because fewer than 10% of hospitals in the US territories reported this measure. We chose 2 different strategies to handle missing data. In multivariable models, indicator variables were included for missing hospital characteristics (ie, ownership and cardiac facilities) and multiple imputation was used to account for missing performance measures.24 We used 2 approaches because the missing hospital characteristics data were typically missing for the same set of hospitals (ie, not missing at random). For multiple imputation, we generated 20 imputed data sets for each outcome measure using linear regression of performance measures against the outcome measure and hospital characteristics; we then estimated the model on each of the 20 data sets and combined the results to produce a single set of estimates for the coefficients and standard errors.25-27 We repeated these analyses with data from the hospitals in Puerto Rico separated from the other territories to assess potential interterritorial differences.
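
As a minimal sketch of the final pooling step across imputed data sets (Rubin's rules, per references 25-27; the coefficient values below are made up, and this is not the authors' SAS/Stata code), the per-imputation estimates and standard errors can be combined as follows:

```python
import numpy as np

def pool_rubins_rules(estimates, std_errors):
    """Pool one coefficient across M imputed data sets using Rubin's rules.

    Returns the pooled point estimate and pooled standard error.
    """
    est = np.asarray(estimates, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    m = len(est)
    q_bar = est.mean()                        # pooled point estimate
    within = np.mean(se ** 2)                 # within-imputation variance
    between = est.var(ddof=1)                 # between-imputation variance
    total_var = within + (1.0 + 1.0 / m) * between
    return q_bar, np.sqrt(total_var)

# Hypothetical territory coefficients and standard errors from 20 imputed data sets.
coefs = [2.8, 2.7, 2.9, 2.8, 2.6, 2.9, 2.7, 2.8, 2.8, 2.9,
         2.7, 2.8, 2.6, 2.9, 2.8, 2.7, 2.8, 2.9, 2.7, 2.8]
ses = [0.5] * 20
print(pool_rubins_rules(coefs, ses))
```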

All analyses were performed using SAS version 9.1 (SAS Institute Inc, Cary, North Carolina) and Stata 11 (StataCorp, College Station, Texas, 2009) statistical software.

Results
Hospital sample

The total sample included 4856 hospitals, 57 in the territories and 4799 in the states, reporting at least 1 admission for 1 of the 3 diagnoses over the study period. The total patient numbers for each outcome measure and geographic area are reported along with characteristics of the corresponding patient samples (Table 1). For the multivariable analyses, we excluded 204 hospitals in the US states and 4 hospitals in Puerto Rico because they could not be matched to the AHA survey data; therefore, these models included 53 hospitals in the territories and 4595 hospitals in the states.

Hospital structural characteristics in US territories and US states

The hospitals in the US territories differed significantly from hospitals in the US states on several structural characteristics (Table 2). Compared with the hospitals in the states, hospitals in the territories had fewer staffed beds, lower condition-specific volume over the study period, more for-profit ownership (relative to government or private not-for-profit ownership), and were more often missing data on cardiac intervention facilities (vs having open heart surgery facilities, having interventional catheterization laboratory facilities only, or having no catheterization facilities) (P < .001). Although fewer hospitals in the territories were accredited by The Joint Commission, the difference was not statistically significant (P = .08).

Hospital core process measure performance in the US territories and the US states

Hospitals in the US territories demonstrated significantly worse performance compared with the US states on all core process measures (P < .05) (Table 3). Hospitals in Puerto Rico and hospitals in the other territories performed similarly on most core process measures (Table 3).

Hospital outcome measure performance in US territories and US states

The hospital mean 30-day RSMR was significantly higher in the US territories compared with the US states for AMI (18.8% [range, 16.1%-24.5%] vs 16.0% [range, 10.9%-24.9%]; P < .001), HF (12.3% [range, 10.3%-15.7%] vs 10.8% [range, 6.6%-19.8%]; P < .001), and PNE (14.9% [range, 9.2%-21.6%] vs 11.4% [range, 6.4%-20.1%]; P < .001) (Table 3). After adjusting for condition-specific core process measures and hospital characteristics, mortality in territorial compared with stateside hospitals remained significant for all 3 conditions (Figure 1). After adjusting for hospital characteristics and core measure performance, 30-day RSMRs in the territories remained significantly worse for patients with AMI (19.1% vs 17.3%; P < .001), HF (12.3% vs 11.3%; P < .001), and PNE (15.3% vs 12.0%; P < .001) (Table 4).

The unadjusted hospital mean 30-day RSRR was significantly higher in the territories for 2 of the 3 conditions: AMI (20.6% [range, 18.7%-24.3%] vs 19.8% [range, 15.3%-29.4%]; P = .04) and PNE (19.4% [range, 15.7%-22.5%] vs 18.4% [range, 13.1%-27.6%]; P = .01) (Table 3). Differences in RSRR for the US territories compared with the US states were not statistically significant for HF (25.5% [range, 22.6%-29.1%] vs 24.5% [range, 15.9%-34.4%]; P = .07). However, 30-day RSRRs in the US territories were not significantly different from readmission rates in the US states for any of the 3 conditions after adjustment for hospital structural characteristics and core process measure performance (Table 4 and Figure 2). Hospitals in Puerto Rico performed similarly to hospitals in the other US territories across all performance measures (Table 3).

The percentage of hospitals in each decile of outcome measure performance (30-day RSMR and RSRR for the 3 conditions) differed greatly between hospitals in the US territories and hospitals in the US states across all 6 outcome measures (Table 5). Using RSMR after AMI as an example, we found that 10.3% of stateside hospitals fell within the first decile (lowest mortality rate) compared with 0% of territorial hospitals. Similarly, 9.5% of stateside hospitals fell within the top decile (highest mortality rate) compared with 37.7% of territorial hospitals. Overall, mean state-level and territorial-level performance on each of the 6 outcome measures varied across the country (Figures 3, 4, and 5).
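
The decile tabulation summarized above (Table 5) can be outlined as follows; this is an illustrative sketch with simulated data, in which decile cut points are taken from the full national distribution of hospitals and performance is then tabulated separately by location.

```python
import numpy as np
import pandas as pd

# Simulated hospital-level AMI RSMRs (%) with a territory flag.
rng = np.random.default_rng(1)
df = pd.DataFrame({"rsmr": rng.normal(16.0, 1.5, 5000),
                   "territory": rng.binomial(1, 0.01, 5000)})

# Decile cut points are defined on all hospitals combined (first decile = lowest RSMR).
df["decile"] = pd.qcut(df["rsmr"], 10, labels=list(range(1, 11)))

# Percentage of hospitals falling in each decile, for stateside (0) and territorial (1) groups.
pct = (pd.crosstab(df["decile"], df["territory"], normalize="columns") * 100).round(1)
print(pct)
```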

Comment

Our findings reveal a marked geographic disparity that affects a subset of racial/ethnic minority populations in the United States. Hospitals in the US territories, on average, have significantly higher RSMRs than hospitals in the US states. The magnitude of differences across these rates raises concerns about differences in the quality of care. In comparison with the states, for every 100 AMI admissions in the US territories there are approximately 2 additional deaths, for every 100 HF admissions there is 1 additional death, and for every 100 pneumonia admissions there are 3 additional deaths. The higher mortality rates are not explained by the types of hospitals included or their lower use of guideline-recommended therapies. Furthermore, the higher mortality rates observed in the US territories are not the result of a few outlier institutions; virtually all of the territorial hospitals performed below the US national averages.

Notably, the US territories have lower federal insurance reimbursement rates compared with all of the US states.28,29 In 2003, the General Accounting Office found that Medicare spending averaged $6300 per enrollee in the US states compared with $2800 in the US territories.29 This study did not directly assess whether low reimbursement rates in the US territories contribute to low hospital performance in these regions. Still, it is important to consider the context of differential federal reimbursement policies. Specifically, the federal government has limited its Medicaid contribution to 50%, the lowest allowable percentage, for the US territories. In contrast to reimbursement policies for the US states, the federal government does not make any additional adjustments for lower per capita income in the US territories. The federal government also limits its contribution to a specific dollar amount in the US territories; there are no comparable “cap” policies in any US state or in the District of Columbia. Both of these discrepancies severely limit health care funding streams in the US territories, with consequences such as narrow Medicaid eligibility criteria and the elimination of Medicaid services that are commonly covered in many US states. Medicaid policies are particularly relevant to the Medicare population, given the growth of dual eligible residents in the territories.30 Puerto Rico faces additional policy challenges when compared with the US states and other territories because of Medicare policies that reimburse in-patient hospitalizations at rates lower than anywhere else in the nation.31,32 In addition, the territories have a limited ability to shape the policies that may ultimately influence health care quality; the US territories lack voting representation in the US Congress and residents cannot vote in national elections.33

We also found that risk-standardized readmission rates were higher in the US territories for AMI and PNE prior to adjustment. Again, almost all of the hospitals in the territories performed worse than the average in the US states, although these associations were not significant after adjusting for hospital characteristics and core process measure performance. Still, readmission rates for all the hospitals were high, and although the disparity was not as prominent as with the mortality measure, the need for improvement is clear.

Lastly, we found marked disparities in performance on the core process measures. These publicly reported measures assess compliance with a set of guideline-recommended therapies and actions that are associated with improved patient outcomes. They demonstrate lower quality in the care of patients in the territories for each of the 3 conditions, representing substantial opportunities for improvement. As observed in prior work done in the United States, these differences in performance on core process measures explained only a small amount of the variation in mortality, indicating that many other factors play a role.34-37 Still, we included core process measures in our multivariable analysis because the association between processes of care and outcomes may have been different in the US territories and we could have missed important and potentially intervention-sensitive levers for change if they were not assessed. However, the fact that performance on these measures does not explain the higher mortality rates suggests that, beyond these processes, there are other aspects of care that are likely contributing to these differences.

Our study is one of the first to examine quality of care for hospitals located in the US territories; however, there are some limitations to consider when interpreting these findings. First, being located in a US territory may be a marker for geographic location on an island or unmeasured characteristics such as patient socioeconomic status; poverty is much more common in the territories.1,6 Although there is evidence that hospitals disproportionately providing care for lower socioeconomic status populations have similar mortality rates to hospitals providing care to higher socioeconomic status populations, this evidence does not include US territories and their corresponding low reimbursements for Medicare.8 Second, we examined AMI, HF, and PNE and our results may not be generalizable to other conditions. Still, the existence of high-quality CMS data in these clinical areas represents an opportunity to investigate hospital performance in the territories and establishes the foundation for future work in this area. Third, our measures were based on the experience of patients in Medicare FFS and our results may not extend to younger populations. However, this is an appropriate group to investigate, given the expanding proportion of patients older than 65 years and associated increasing health care costs. Fourth, our outcomes measures were based on models using administrative claims data. We did not have extensive patient-level data for patients in the US states or in the US territories and therefore could not take into account health behaviors, health literacy, or adherence across these populations. However, we assessed acute care processes and short-term outcomes, and comorbid conditions were well captured in our administrative claims data. Although there may be unmeasured patient characteristics in the territorial populations for which we do not account, the statistical models used in the outcome measures produce estimates that are good surrogates for estimates from a medical record model.9,11,37 In addition, the mortality measure, which is approved by the National Quality Forum, is designed to convey information about hospital performance and already adjusts for hospital case-mix.7,9 We also conducted several secondary analyses to assess whether our findings primarily reflected the experience of Puerto Rico, since it has the largest population of the territories; we found the disparities were consistent across all US territories.

Despite the national effort to address health care disparities through increased public reporting and standardized measurement of hospital performance, hospitals in the US territories have been largely neglected. Improving health care outcomes in the US territories should be included in any comprehensive effort to tackle national racial/ethnic and other health care disparities. The striking disparity revealed in this study demonstrates that people living in the US territories are at a notable disadvantage compared with those in the US states. Importantly, these US possessions are legally restricted from full participation in the shaping of relevant US health care policy. The nation has a great responsibility to guarantee that residents on these islands have access to care that is at least of the same quality as care in the US states.

Article Information

Correspondence: Marcella Nunez-Smith, MD, MHS, Yale School of Medicine, PO Box 208088, IE-61 SHM, New Haven, CT 06520 (marcella.nunez-smith@yale.edu).

Accepted for Publication: April 12, 2011.

Published Online: June 27, 2011. doi:10.1001/archinternmed.2011.284

Author Contributions: Dr Nunez-Smith had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: Nunez-Smith, Bradley, Herrin, Santana, Normand, and Krumholz. Acquisition of data: Krumholz. Analysis and interpretation of data: Nunez-Smith, Bradley, Herrin, Santana, Curry, Normand, and Krumholz. Drafting of the manuscript: Nunez-Smith. Critical revision of the manuscript for important intellectual content: Nunez-Smith, Bradley, Herrin, Santana, Curry, Normand, and Krumholz. Statistical analysis: Nunez-Smith, Herrin, and Normand. Administrative, technical, and material support: Nunez-Smith. Study supervision: Nunez-Smith.

Financial Disclosure: Drs Normand and Krumholz report they developed RSMRs and RSRRs for AMI, HF, and PNE under contract with the Colorado Foundation for Medical Care. Dr Krumholz reports he chairs a scientific advisory board for United Healthcare.

Funding/Support: The analyses on which this publication is based were supported by the Agency for Healthcare Quality and Research (grant RO1-HS0-16929-1), the United Health Fund (grant 20090565), and the Commonwealth Fund. Dr Nunez-Smith was supported by the Association of American Medical Colleges' Nickens Faculty Fellowship at the time this project was developed. Dr Krumholz was supported by grant U01 HL105270 (Center for Cardiovascular Outcomes Research at Yale University) from the National Heart, Lung, and Blood Institute.

Disclaimer: The content of this publication does not necessarily reflect the views or policies of any of the sponsors.

Additional Contributions: Yongfei Wang, MS, Section of Cardiovascular Medicine, Department of Medicine, New Haven, Connecticut, assisted with database creation, and Stacy D. Maples, MSc, Map Department, Yale University Library, New Haven, created the maps included in the article figures. No compensation was received by anyone named in this acknowledgment.

References
1.
US Census Bureau. US Census 2000. http://www.census.gov/main/www/cen2000.html. Accessed April 4, 2011.
2.
Agency for Healthcare Research and Quality. National Healthcare Quality Report 2008. Rockville, MD: Agency for Healthcare Research and Quality, US Dept of Health and Human Services; 2009. AHRQ publication 09-0001.
3.
Agency for Healthcare Research and Quality. National Healthcare Disparities Report 2008. Rockville, MD: Agency for Healthcare Research and Quality, US Dept of Health and Human Services; 2009. AHRQ publication 09-0002.
4.
Smedley BD, Stith AY, Nelson AR. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: National Academies Press; 2003.
5.
The Dartmouth Institute for Health Policy and Clinical Practice. The Dartmouth Atlas of Health Care. http://www.dartmouthatlas.org/. Accessed April 4, 2011.
6.
Centers for Medicare and Medicaid Services. CMS Announces Guidelines for Reporting Hospital Quality Data. Baltimore, MD: Centers for Medicare and Medicaid Services; 2004.
7.
National Quality Forum. Measuring performance. Washington, DC: National Quality Forum; 2010. http://www.qualityforum.org/Measuring_Performance/Measuring_Performance.aspx. Accessed January 4, 2011.
8.
National Quality Forum. National Voluntary Consensus Standards for Hospital Care: Additional Priority Areas 2005-2006: Pneumonia Mortality Supplement. Washington, DC: National Quality Forum; April 2007.
9.
Keenan PS, Normand SL, Lin Z, et al. An administrative claims measure suitable for profiling hospital performance on the basis of 30-day all-cause readmission rates among patients with heart failure. Circ Cardiovasc Qual Outcomes. 2008;1(1):29-37.
10.
Krumholz HM, Merrill AR, Schone EM, et al. Patterns of hospital performance in acute myocardial infarction and heart failure 30-day mortality and readmission. Circ Cardiovasc Qual Outcomes. 2009;2(5):407-413.
11.
Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with an acute myocardial infarction. Circulation. 2006;113(13):1683-1692.
12.
Bratzler DW, Normand SL, Wang Y, et al. An administrative claims model for profiling hospital 30-day mortality rates for pneumonia patients. PLoS One. 2011;6(4):e17401.
13.
Krumholz HM, Lin Z, Drye EE, et al. An administrative claims measure suitable for profiling hospital performance based on 30-day all-cause readmission rates among patients with acute myocardial infarction. Circ Cardiovasc Qual Outcomes. 2011;4(2):243-252.
14.
Lindenauer PK, Normand SL, Drye EE, et al. Development, validation, and results of a measure of 30-day readmission following hospitalization for pneumonia. J Hosp Med. 2011;6(3):142-150.
15.
US Department of Health and Human Services. Medical Hospital Quality Compare. Washington, DC: US Department of Health and Human Services; 2009. http://www.hospitalcompare.hhs.gov/hospital-search.aspx?AspxAutoDetectCookieSupport=1. Accessed April 4, 2011.
16.
Joint Commission on Accreditation of Healthcare Organizations. Performance measurement initiatives: current specification manual for national hospital quality measures. 2006. http://www.cms.gov/HospitalQualityInits/downloads/HospitalHQA2004_2007200512.pdf. Accessed April 4, 2011.
17.
US Department of the Interior. Definitions of insular area political organizations. Washington, DC: US Department of the Interior; 2007. http://www.doi.gov/oia/Islandpages/political_types.htm. Accessed April 4, 2011.
18.
US Census Bureau. US Census 2000: The island areas. http://www.census.gov/population/www/cen2000/islandareas/index.html. Accessed April 4, 2011.
19.
Ross JS, Cha SS, Epstein AJ, et al. Quality of care for acute myocardial infarction at urban safety-net hospitals. Health Aff (Millwood). 2007;26(1):238-248.
20.
Ross JS, Normand SL, Wang Y, Nallamothu BK, Lichtman JH, Krumholz HM. Hospital remoteness and thirty-day mortality from three serious conditions. Health Aff (Millwood). 2008;27(6):1707-1717.
21.
Schultz MA, van Servellen G, Litwin MS, McLaughlin EJ, Uman GC. Can hospital structural and financial characteristics explain variations in mortality caused by acute myocardial infarction? Appl Nurs Res. 1999;12(4):210-214.
22.
Thiemann DR, Coresh J, Oetgen WJ, Powe NR. The association between hospital volume and survival after acute myocardial infarction in elderly patients. N Engl J Med. 1999;340(21):1640-1648.
23.
Bradley EH, Herrin J, Elbel B, et al. Hospital quality for acute myocardial infarction: correlation among process measures and relationship with short-term mortality. JAMA. 2006;296(1):72-78.
24.
Rubin DB. A non-iterative algorithm for least squares estimation of missing values in any analysis of variance design. J R Stat Soc Ser C Appl Stat. 1972;21(2):136-141.
25.
Rubin DB. Multiple Imputation for Nonresponse in Surveys. New York, NY: Wiley; 1987.
26.
Barnard J, Rubin DB. Small-sample degrees of freedom with multiple imputation. Biometrika. 1999;86(4):948-955.
27.
Schafer JL. Analysis of Incomplete Multivariate Data. Boca Raton, FL: Chapman & Hall/CRC; 1997.
28.
Scott C. Federal Medical Assistance Percentage (FMAP) for Medicaid: CRS Report for Congress. Washington, DC: Congressional Research Service. Publication RS21262.
29.
US Insular Areas: Multiple Factors Affect Health Care Funding: Report to Congressional Requesters. Washington, DC: US Government Accountability Office; 2005. GAO publication GAO-06-075.
30.
Scanlon WJ. Medicare and Medicaid: Meeting Needs of Dual Eligibles Raises Difficult Cost and Care Issues: Testimony Before the Special Committee on Aging, US Senate. Washington, DC: US General Accounting Office; 1997. GAO publication GAO/T-HEHS-97-119.
31.
Social Security Administration. Annual statistical supplement, 2006. Baltimore, MD: Social Security Administration; 2006. http://www.ssa.gov/policy/docs/statcomps/supplement/2006/oasdi.html. Accessed April 4, 2011.
32.
Evans M. Healthcare's place in the sun. Execs, GAO find sandy beaches, salty Medicaid funding in US territories. Mod Healthc. 2006;36(29):28-29.
33.
US Electoral College. Frequently asked questions. Washington, DC: US National Archives and Records Administration. http://www.archives.gov/federal-register/electoral-college/faq.html#territories. Accessed September 23, 2010.
34.
Bratzler DW, Nsa W, Houck PM. Performance measures for pneumonia: are they valuable, and are process measures adequate? Curr Opin Infect Dis. 2007;20(2):182-189.
35.
Fonarow GC, Peterson ED. Heart failure performance measures and outcomes: real or illusory gains. JAMA. 2009;302(7):792-794.
36.
Werner RM, Bradlow ET. Relationship between Medicare's hospital compare performance measures and mortality rates. JAMA. 2006;296(22):2694-2702.
37.
Normand ST, Wang Y, Krumholz HM. Assessing surrogacy of data sources for institutional comparisons. Health Serv Outcomes Res Methodol. 2007;7(1-2):79-96.