Importance To improve the quality of health care, many researchers have suggested that health care institutions adopt management approaches that have been successful in the manufacturing and technology sectors. However, relatively little information exists about how these practices are disseminated in hospitals and whether they are associated with better performance.
Objectives To describe the variation in management practices among a large sample of hospital cardiac care units; assess association of these practices with processes of care, readmissions, and mortality for patients with acute myocardial infarction (AMI); and suggest specific directions for the testing and dissemination of health care management approaches.
Design We adapted an approach used to measure management and organizational practices in manufacturing to collect management data on cardiac units. We scored performance in 18 practices using the following 4 dimensions: standardizing care, tracking of key performance indicators, setting targets, and incentivizing employees. We used multivariate analyses to assess the relationship of management practices with process-of-care measures, 30-day risk-adjusted mortality, and 30-day readmissions for AMI.
Setting Cardiac units in US hospitals.
Participants Five hundred ninety-seven cardiac units, representing 51.5% of hospitals with interventional cardiac catheterization laboratories and at least 25 annual AMI discharges.
Main Outcome Measures Process-of-care measures, 30-day risk-adjusted mortality, and 30-day readmissions for AMI.
Results We found a wide distribution in management practices, with fewer than 20% of hospitals scoring a 4 or a 5 (best practice) on more than 9 measures. In multivariate analyses, management practices were significantly correlated with mortality (P = .01) and 6 of 6 process measures (P < .05). No statistically significant association was found between management and 30-day readmissions.
Conclusions and Relevance The use of management practices adopted from manufacturing sectors is associated with higher process-of-care measures and lower 30-day AMI mortality. Given the wide differences in management practices across hospitals, dissemination of these practices may be beneficial in achieving high-quality outcomes.
Interest in quality improvement in health care during the past 10 years has been associated with a handful of important successes.1-3 However, improvements in the quality of care have been slower than many would have hoped for,4-8 and quality is still highly variable across organizations.9 Although significant effort has been focused on the use of evidence-based medicine—clinical practices that lead to better care—an interest in organizational strategies and management practices that enable and incentivize high-quality health care is emerging.10-15
One of the most active areas of interest is in the use of management practices with origins in manufacturing, including, for example, “Lean” methodologies developed at Toyota16 or the use of balanced scorecard approaches that originated in the technology sector.17 These management approaches can be characterized as a set of formalized tools, the use of which is intended to improve quality through multiple pathways, such as eliminating inefficient and variable practices; engaging providers in a collaborative, team-based approach; and structuring mechanisms for setting targets and tracking progress. However, the evidence on the potential effectiveness of these approaches in health care is relatively weak13,18 and consists primarily of single-site studies.19-21
To address this gap in knowledge, we present a new framework and instrument for defining key management dimensions and for measuring them on a large scale in health care organizations. We describe the variation in management practices among a large sample of hospitals; assess its association with processes of care, readmissions, and mortality for patients with acute myocardial infarction (AMI); and suggest specific directions for the testing and dissemination of health care management approaches.
We took an approach originally developed by economists to measure management practices in manufacturing and adapted it to the cardiac inpatient setting.22,23 This management framework has been used to measure organizational practices in more than 6000 firms across more than 15 countries and serves as the basis for the newly introduced Management and Organizational Practices Survey component of the US Census.24 The management survey approach had been validated previously in selected health care settings, including 147 substance abuse treatment programs in the United States25 and 100 hospitals in the United Kingdom.26
Our survey tool queried about 18 management practices grouped into the following 4 primary dimensions: standardizing care (Lean methods; 6 practices), performance monitoring (5 practices), setting targets (3 practices), and incentivizing employees and managers (4 practices). Table 1 provides a brief description of these 4 dimensions and 18 practices. The section on standardizing care focused on processes and systems that minimize variations. The monitoring section focused on strategies for collecting and tracking key performance indicators. Targets examined the clarity and ambition of unit targets (eg, “Was the unit engaged in a drive toward a 0% bloodstream infection rate?”).
Following the design of previous work,22,23 we scored unit performance on the 18 practices, with trained interviewers asking open-ended questions designed to elicit information on whether the unit is a poor, average, or high performer for that particular practice. The response was scored on a scale from 1 to 5, with a higher score indicating better performance. Surveys were conducted via telephone interview. Table 2 provides the scoring grid and example responses for 4 of our 18 questions, along with the percentage of hospitals receiving scores of 1, 3, or 5. Additional details of the survey questions are provided in eAppendix 1. Technical aspects of the survey implementation are provided in eAppendix 2 and the eTable.
We converted our management scores from the original 1- to 5-point scale to z scores (mean, 0; SD, 1) because scaling may vary across the 18 measured practices (eg, interviewers might consistently give higher scores on question 1 compared with question 2). We took an additional step to mitigate potential bias by regressing, without an intercept, the mean of the management z scores on a set of prespecified indicator variables for interviewer, interviewee job position (eg, nurse manager vs unit director), interviewee location (eg, intensive care unit vs telemetry), and the duration, day, and week of the interview.22 The predicted values of this regression were then subtracted from the mean management score to create an adjusted mean management score. This adjusted management score was the primary measure of overall managerial practice.
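The two adjustment steps above can be sketched in a few lines. Everything here is illustrative: the scores are simulated, only interviewer identity is used as a noise indicator (the study also adjusted for respondent role, location, and interview timing), and the number of interviewers is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw 1-5 scores for 597 units x 18 practices (simulated, not study data).
n_units, n_practices = 597, 18
raw = rng.integers(1, 6, size=(n_units, n_practices)).astype(float)

# Step 1: convert each practice's scores to z scores (mean 0, SD 1) so that
# practices scored on systematically different scales become comparable.
z = (raw - raw.mean(axis=0)) / raw.std(axis=0)
mean_z = z.mean(axis=1)                         # one overall score per unit

# Step 2: regress the mean z score, without an intercept, on indicator
# variables for interview "noise" (here only interviewer identity), then
# subtract the fitted values to obtain the adjusted management score.
interviewer = rng.integers(0, 4, size=n_units)  # 4 interviewers (assumed)
X = np.eye(4)[interviewer]                      # indicator (dummy) matrix
beta, *_ = np.linalg.lstsq(X, mean_z, rcond=None)
adjusted = mean_z - X @ beta                    # residual = adjusted score
```

With a full set of indicators and no intercept, the fitted values are simply the interviewer-specific means, so the adjusted score is centered within each interviewer's set of interviews.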
Hospital data collection and sample
The survey was conducted during 2010. All research interviewers were trained on the interview guide and scoring grid for 1 week. We used the American Hospital Association Guide27 to identify hospitals with interventional cardiac catheterization laboratories and to determine hospital contact information. We excluded federal (Veterans Administration) hospitals and hospitals with fewer than 25 annual Medicare discharges with a primary diagnosis of AMI. Interviewers made contact with a nurse manager in a cardiac unit, confirmed that the unit performed interventional cardiology, and confirmed consent to conduct the interview. Interviews were conducted using a standard interview guide and generally were scored by 2 members of the interview team, with one member asking questions and scoring responses and the second listening and scoring responses in parallel. At the conclusion of each interview, interviewers discussed discrepancies between scores and made changes where appropriate. Interobserver agreement was assessed using a subset of 58 interviews in which the 2 individuals scoring the interview were not permitted to change their score. The correlation coefficient in the mean management score for these interviews was 0.89 (P < .001).
We obtained hospital administrative data (ie, profit status, number of beds, teaching status, and presence of open heart surgery facilities) from the American Hospital Association Guide27 and Medicare's Provider of Service file.
We obtained publicly available data from the Centers for Medicare & Medicaid Services on 6 AMI process measures included in the Hospital Compare evaluation for 2010.28 These measures include aspirin use within 24 hours of arrival, angiotensin-converting enzyme inhibitor use for left ventricular dysfunction, provision of percutaneous coronary intervention within 90 minutes of arrival, aspirin prescribed at discharge, β-blocker prescribed at discharge, and provision of smoking cessation counseling.
Mortality and readmissions risk adjustment and sample
Analyses of mortality and readmissions were based on the 2010 Medicare Provider Analysis and Review file and used risk adjustment variables described by Krumholz and colleagues.29,30 We calculated hospital risk-adjusted mortality using the Dimick and Staiger method, a Bayesian “shrinkage” estimator that accounts for some of the random variation associated with mortality rates and has been shown to have the best predictive accuracy among potential estimators.31 Readmissions were calculated as any readmission within 30 days of discharge from the index admission, excluding transfers or admissions into a skilled nursing facility or a long-term acute care hospital and admissions for rehabilitation (diagnosis related group code 462 or admission diagnosis code V57.xx).
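The shrinkage idea can be illustrated with a simple empirical Bayes sketch. The actual Dimick-Staiger estimator is more sophisticated, and all values below (rates, volumes, the variance decomposition) are simulated for illustration only.

```python
import numpy as np

# Empirical Bayes shrinkage sketch: each hospital's observed mortality rate is
# pulled toward the overall mean, with low-volume (noisy) hospitals pulled
# furthest. The Dimick-Staiger estimator is more elaborate; this shows the idea.
rng = np.random.default_rng(1)
n_hosp = 200
volume = rng.integers(25, 500, size=n_hosp)                  # AMI discharges (assumed)
true_rate = rng.normal(0.15, 0.02, size=n_hosp).clip(0.02, 0.4)
deaths = rng.binomial(volume, true_rate)
observed = deaths / volume

grand_mean = observed.mean()
within_var = grand_mean * (1 - grand_mean) / volume          # binomial sampling noise
between_var = max(observed.var() - within_var.mean(), 1e-6)  # true hospital variation
reliability = between_var / (between_var + within_var)       # weight on a hospital's own data
shrunk = reliability * observed + (1 - reliability) * grand_mean
```

A hospital with 25 discharges gets a low reliability weight and an estimate close to the overall mean; a 500-discharge hospital keeps most of its own signal, which is how shrinkage removes some of the random variation in observed rates.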
We present univariate unadjusted values for quality measures, displayed by hospitals at the top and bottom quartiles of the management score. To test for trends by quartile, we calculated the Pearson correlation coefficient.
In multivariate models assessing the association of management with risk-adjusted 30-day mortality, we estimated a weighted linear least squares model weighted by the number of AMI discharges. We controlled for a set of independent variables that have a previously demonstrated association with AMI mortality,32-38 including AMI volume (25-75, 76-125, 126-250, and >250 discharges annually), region, ownership, number of licensed beds (<151, 151-374, and >374), location (rural vs urban), teaching status, open heart surgery capability, and hospital system membership. To assess the association with each process-of-care measure, we used a binomial regression weighted by the number of patients and including the same set of independent variables used in the mortality regression.39
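A minimal numerical sketch of the weighted least squares step, with simulated data and a deliberately reduced covariate set (the study's full specification includes region, ownership, and the other variables listed above):

```python
import numpy as np

# Weighted linear least squares of 30-day mortality on the adjusted management
# score plus (a subset of) hospital covariates, weighted by AMI volume.
# All values are simulated for illustration.
rng = np.random.default_rng(2)
n = 597
mgmt = rng.normal(0, 1, n)                        # adjusted management score
beds = rng.integers(0, 3, n)                      # bed-size category (assumed coding)
teaching = rng.integers(0, 2, n)
volume = rng.integers(25, 500, n).astype(float)   # AMI discharges = weights
y = 0.15 - 0.002 * mgmt + rng.normal(0, 0.02, n)  # simulated mortality rates

X = np.column_stack([np.ones(n), mgmt, beds, teaching])
# WLS normal equations: (X'WX) beta = X'Wy, with W = diag(volume)
beta = np.linalg.solve(X.T @ (volume[:, None] * X), X.T @ (volume * y))
```

Weighting by discharge volume gives high-volume hospitals, whose observed rates are less noisy, more influence on the fit.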
To provide results that are interpretable across quality measures, we estimated the change in mortality or process measures associated with moving a typical hospital (defined as a hospital with the median values for all independent variables except the adjusted management score) from the 25th to the 75th percentile of the adjusted management score. We used bootstrapping to generate 95% confidence intervals.
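The percentile contrast and its bootstrap confidence interval can be sketched as follows, here for a simple unadjusted slope on simulated data (the study bootstrapped the full regression model):

```python
import numpy as np

# Bootstrap the 25th-to-75th-percentile management contrast: resample
# hospitals with replacement, refit, recompute. Simulated data only.
rng = np.random.default_rng(3)
n = 597
mgmt = rng.normal(0, 1, n)
y = 0.15 - 0.002 * mgmt + rng.normal(0, 0.02, n)

def effect(x, yy):
    # change in predicted outcome moving x from its 25th to its 75th percentile
    slope = np.cov(x, yy)[0, 1] / np.var(x, ddof=1)
    q25, q75 = np.percentile(x, [25, 75])
    return slope * (q75 - q25)

boot = np.empty(2000)
for b in range(2000):
    idx = rng.integers(0, n, n)               # resample hospitals with replacement
    boot[b] = effect(mgmt[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])     # percentile 95% CI
```

Resampling whole hospitals (rather than patients) keeps the hospital as the unit of inference, matching the hospital-level analyses.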
Analyses of mortality and process-of-care measures were conducted at the hospital level. In sensitivity analyses, we ran patient-level models of 30-day AMI mortality using mixed-effects logistic models with a hospital-level random effect. In additional analyses, we included a composite measure of performance on AMI process-of-care measures (based on a sum of the z score of each process measure40) as an additional covariate in our hospital-level analyses of management on mortality.
To examine the relationship between management practice scores and 30-day readmission, we used competing-risks survival regressions, which control for the fact that patients who die are no longer at risk for readmission. Models were adjusted for the described individual and hospital factors, with standard errors adjusted for hospital-level clustering.29 Mortality and readmission models also adjusted for patient comorbidities, age, sex, and emergency admission. In these analyses, we tested the proportionality assumption that the effect of management on readmission is constant over time. We used a significance level of .05 and 2-sided tests for all hypotheses.
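To see why death must be treated as a competing risk rather than as censoring, here is a minimal sketch of the cumulative incidence function, the estimand behind competing-risks methods, on simulated data. The study itself fit competing-risks regressions with covariates, which this sketch does not attempt.

```python
import numpy as np

# Patients who die within 30 days can no longer be readmitted. A naive
# 1 - Kaplan-Meier estimate treats deaths as censoring and overstates
# readmission risk; the cumulative incidence function (CIF) does not.
# Aalen-Johansen-style estimate on simulated, tie-free data.
rng = np.random.default_rng(4)
n = 5000
t_readmit = rng.exponential(60, n)            # days to readmission (simulated)
t_death = rng.exponential(200, n)             # days to death (simulated)
time = np.minimum.reduce([t_readmit, t_death, np.full(n, 30.0)])
event = np.where(t_readmit <= time, 1, np.where(t_death <= time, 2, 0))

order = np.argsort(time)
time, event = time[order], event[order]
at_risk = n - np.arange(n)                    # risk set just before each time
surv = np.cumprod(1 - (event > 0) / at_risk)  # overall event-free survival
surv_before = np.concatenate([[1.0], surv[:-1]])
# CIF: sum over event times of P(event-free just before t) * readmission hazard at t
cif_readmit = np.cumsum(surv_before * (event == 1) / at_risk)
print(round(cif_readmit[-1], 3))              # 30-day readmission incidence
```

By construction, the readmission CIF, the death CIF, and the event-free survival probability sum to 1 at any time point, which is the accounting that naive censoring of deaths breaks.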
The study protocol was approved by the institutional review board of Oregon Health & Science University. Additional details on modeling choices and survey approach are available in eAppendix 2.
From the administrative data, we identified 1358 nonfederal hospitals with interventional cardiac catheterization laboratories and with at least 25 annual AMI discharges. Of those hospitals, 199 indicated verbally that they did not conduct interventional catheterization.
We completed interviews and scored management practices in 597 hospitals, capturing detailed management data for 51.5% of 1159 units with interventional cardiology and at least 25 annual AMI discharges. Table 2 provides an indication of the spread of management processes for practices 2, 8, 14, and 15. Although only a small percentage (1.9% and 1.7%) of units were scored a 1 (indicating little or no adoption of modern management processes) for practices 2 (standardization of protocols) and 8 (monitoring errors), the percentages scoring a 5 (indicating high adoption and fidelity to best practices) were also relatively small (10.6% and 12.7%, respectively). A similar spread was observed for all 18 practices, with only 23.1% of hospitals scoring a 4 or a 5 on more than half.
The Figure displays the distribution of overall management scores across our 597 hospitals. We found a wide distribution in management practices, with 38.2% of hospitals scoring a mean of less than 3 across the 18 practices.
Table 3 compares surveyed and nonsurveyed hospitals. Surveyed hospitals were slightly more likely to be located in the western United States, be not-for-profit hospitals, offer cardiac surgery, and exhibit slightly lower mortality.
Table 4 displays unadjusted, unweighted quality measures for hospitals in the top, bottom, and middle 2 quartiles of the management practice score. In comparison with hospitals in the bottom quartile of management, hospitals in the top quartile had better performance on all process-of-care measures, except for the provision of smoking cessation counseling.
Table 5 displays results for regression models that adjust for all hospital-level covariates described previously. To provide results that are interpretable across process and mortality measures, we estimated the effect of increasing the adjusted management score from the 25th to the 75th percentile. The overall management score was associated with statistically significant improvements in 30-day risk-adjusted mortality (P = .01) and the process-of-care measures (P = .03 for aspirin at discharge; P = .02 for smoking cessation; and P < .01 for all other process-of-care measures).
Table 5 also displays the hazard ratio for our competing risk regression of risk-adjusted 30-day readmission. The proportionality assumption was met for the hospital-level exposure of interest (χ2 = 1.4; P = .24). The overall management score was not associated with a reduction in readmissions.
In sensitivity analyses, patient-level models of 30-day AMI mortality using a mixed-effects logistic model demonstrated similar results (odds ratio, 0.93 [95% CI, 0.88-0.99]). In hospital-level models of mortality that included a composite measure of AMI process-of-care measures as an additional covariate, the overall management score was still significantly associated with mortality (P = .02).
In our survey of more than half the US hospitals with interventional cardiac services, we found a wide distribution in management practices. Higher management practice scores were correlated with lower mortality and better performance on AMI process-of-care measures. Models that included a composite measure of AMI process-of-care measures also demonstrated a strong association between management practices and mortality, suggesting that the benefits of management were not solely attributable to better performance on process-of-care measures. Although strongly associated with mortality and process-of-care measures, management practices were not associated with lower readmission rates, a finding that may be consistent with evidence suggesting that 30-day readmission rates are driven primarily not by hospital practice but by a hospital's patient population and the resources of the community in which it is located.41,42
The practices that we measured have been promoted by business schools, researchers, and industry leaders as mechanisms for reducing variations in practice, increasing motivation and accountability of employees, and identifying errors or subpar performance. In short, these practices can be seen as concrete examples of a system for improving care. Our findings are consistent with the empirical research in manufacturing and reports of individual organizational successes that have been attributed to the adoption of Lean management and related approaches.21,43-47
Our findings parallel additional studies of management in health care settings. A survey of 537 hospitals identified 5 key strategies that were significantly associated with lower AMI mortality and noted that a small proportion of hospitals used all 5 strategies.10 A study of management in 42 intensive care units found that attributes such as coordination, communication, and conflict management abilities were associated with better quality.48 Qualitative studies of AMI care also provide support for many of the practices defined in Table 1.12,49,50
In our study, a movement from the 25th to the 75th percentile in management scores was associated with a 0.17% reduction in mortality, a potentially important although modest improvement. A number of studies have indicated that process-of-care measures are correlated with lower AMI mortality, although the magnitude of effect has also been small.39,51-53 Our estimates may underestimate the true effect of management for several reasons. First, the noise inherent in our scoring method, coupled with the shrinkage approach of the Dimick-Staiger estimator, may introduce attenuation bias, leading to an underestimate of the true effect of better management.54 Second, our study measures association, not causation. Experimental and survey evidence from manufacturing studies suggest that cross-sectional studies may underestimate substantially the improvements that can be realized through the adoption of modern management practices.22,55 The small effect size may also reflect a plateau in the widespread improvements in the quality of AMI treatment that have occurred during the past 10 years.2 The management practices that we tested—many of which are not specific to the care of AMI patients—may have significant potential in clinical areas that have not experienced similar improvements in quality.
Our study has additional limitations. Process-of-care measures depend on systems that are in place in several locations in the hospital, and good performance on these measures is not solely the domain of the cardiac unit, where we measured management. However, some of our questions reflect a systems perspective, and “good management” in the cardiac unit may in part be reflected by an overall hospital approach.
Our study used only 1 respondent at each site. In their work on manufacturing, Bloom and Van Reenen22 ran a second interview with a different manager on a subset of firms and found a strong correlation between the first and second interviews (ρ = 0.734; P < .001). Unfortunately, the pool of managers in cardiac units who could provide reliable answers to our questions was relatively small, restricting our ability to conduct a second interview with a different manager. However, because we used the same approach, training team, and materials as Bloom and Van Reenen,22,23 it is likely, although uncertain, that our scores would have similar accuracy.
Finally, our study was based on data collected from approximately 50% of cardiac units, and the surveyed hospitals differed in some ways from the nonrespondents (eg, surveyed hospitals were slightly more likely to be located in rural areas). However, the surveyed hospitals also had small but statistically significant differences in mortality rates, with lower mortality among respondents, providing some indication that management scores might be worse in the nonsurveyed group. In other words, if our survey of management does not reflect accurately the full distribution of practices across all hospitals, it should be relatively close, although perhaps biased toward better-managed hospitals. The study's strengths included the use of a method for measuring management that has been validated in large-scale studies of manufacturing, a large sample size, and an empirical test of management's association with widely accepted quality metrics.
Our results suggest future directions for hospital management practices and quality of care. We find wide variation in the dissemination of modern management practices, with better management associated with higher performance in process-of-care measures and lower risk-adjusted mortality. Many of these practices are relatively moderate in scope and do not require substantial capital investment. The identification of essential aspects of management can help administrators, clinicians, and policymakers understand the types of organizational changes that are feasible and currently in place in some hospitals and may speed the adoption of practices that are relatively new to health care but have the potential to improve patient care.
Correspondence: K. John McConnell, PhD, Oregon Health & Science University, 3181 Sam Jackson Park Rd, Mail Code CR-114, Portland, OR 97239 (mcconnjo@ohsu.edu).
Accepted for Publication: November 19, 2012.
Published Online: March 18, 2013. doi:10.1001/jamainternmed.2013.3577
Author Contributions: Study concept and design: All authors. Acquisition of data: McConnell. Analysis and interpretation of data: McConnell, Lindrooth, Wholey, and Maddox. Drafting of the manuscript: McConnell, Lindrooth, and Maddox. Critical revision of the manuscript for important intellectual content: All authors. Statistical analysis: McConnell, Lindrooth, and Bloom. Obtained funding: McConnell and Lindrooth. Administrative, technical, and material support: McConnell. Study supervision: McConnell.
Conflict of Interest Disclosures: None reported.
Funding/Support: This study was supported by grant 1R01HS018466 from the Agency for Healthcare Research and Quality.
Previous Presentations: This study was presented in part at the 2011 and 2012 AcademyHealth Annual Research Meetings (June 14, 2011; Seattle, Washington; and June 18, 2012; Orlando, Florida); the 2011 American Heart Association Quality of Care and Outcomes Research Scientific Sessions (May 13, 2011; Washington DC); and the 2012 Academy for Health Care Improvement Meeting (May 7, 2012; Arlington, Virginia).
References
1. Joint Commission. Improving America's Hospitals: The Joint Commission's Annual Report on Quality and Safety, 2010. Oakbrook Terrace, IL: Joint Commission; 2010.
2. Krumholz HM, Wang Y, Chen J, et al. Reduction in acute myocardial infarction mortality in the United States: risk-standardized mortality rates from 1995-2006. JAMA. 2009;302(7):767-773.
3. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355(26):2725-2732.
4. Brennan TA, Gawande A, Thomas E, Studdert D. Accidental deaths, saved lives, and improved quality. N Engl J Med. 2005;353(13):1405-1409.
5. Classen DC, Resar R, Griffin F, et al. “Global trigger tool” shows that adverse events in hospitals may be ten times greater than previously measured [published correction appears in Health Aff (Millwood). 2011;30(6):1217]. Health Aff (Millwood). 2011;30(4):581-589.
8. Wachter RM. The end of the beginning: patient safety five years after “To Err Is Human.” Health Aff (Millwood). 2004;23(suppl Web exclusives):W534-W545.
9. Krumholz HM, Merrill AR, Schone EM, et al. Patterns of hospital performance in acute myocardial infarction and heart failure 30-day mortality and readmission. Circ Cardiovasc Qual Outcomes. 2009;2(5):407-413.
10. Bradley EH, Curry LA, Spatz ES, et al. Hospital strategies for reducing risk-standardized mortality rates in acute myocardial infarction. Ann Intern Med. 2012;156(9):618-626.
11. Bradley EH, Herrin J, Wang Y, et al. Strategies for reducing the door-to-balloon time in acute myocardial infarction. N Engl J Med. 2006;355(22):2308-2320.
12. Curry LA, Spatz E, Cherlin E, et al. What distinguishes top-performing hospitals in acute myocardial infarction mortality rates? A qualitative study. Ann Intern Med. 2011;154(6):384-390.
13. Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Aff (Millwood). 2005;24(1):138-150.
14. Shortell SM, Rundall TG, Hsu J. Improving patient care by linking evidence-based medicine and evidence-based management. JAMA. 2007;298(6):673-676.
16. Liker J. The Toyota Way. New York, NY: McGraw-Hill; 2003.
17. Kaplan RS, Norton DP. The balanced scorecard: measures that drive performance. Harv Bus Rev. 1992;70(1):71-79.
18. Hoff T, Jameson L, Hannan E, Flink E. A review of the literature examining linkages between organizational factors, medical errors, and patient safety. Med Care Res Rev. 2004;61(1):3-37.
19. Meyer H. Life in the “Lean” lane: performance improvement at Denver Health. Health Aff (Millwood). 2010;29(11):2054-2060.
21. Toussant J. Writing the new playbook for US health care: lessons from Wisconsin. Health Aff (Millwood). 2009;28(5):1343-1350.
22. Bloom N, Van Reenen J. Measuring and explaining management practices across firms and countries. Q J Econ. 2007;122(4):1351-1408.
23. Bloom N, Van Reenen J. Why do management practices differ across firms and countries? J Econ Perspect. 2010;24(1):203-224.
25. McConnell KJ, Hoffman KA, Quanbeck A, McCarty D. Management practices in substance abuse treatment programs. J Subst Abuse Treat. 2009;37(1):79-89.
26. Bloom N, Propper C, Seiler S, Van Reenen J. The impact of competition on management quality: evidence from public hospitals. NBER Working Paper 16032. 2010. http://www.nber.org/papers/w16032. Accessed March 14, 2012.
27. American Hospital Association. American Hospital Association Guide: 2010 Edition. Washington, DC: American Hospital Association; 2010.
29. Krumholz HM, Lin Z, Drye EE, et al. An administrative claims measure suitable for profiling hospital performance based on 30-day all-cause readmission rates among patients with acute myocardial infarction. Circ Cardiovasc Qual Outcomes. 2011;4(2):243-252.
30. Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with an acute myocardial infarction. Circulation. 2006;113(13):1683-1692.
31. Ryan AM, Burgess JF, Strawderman R, Dimick JB. What is the best way to estimate hospital quality outcomes? A simulation approach. Health Serv Res. 2012;47(4):1699-1718.
32. Allison JJ, Kiefe CI, Weissman NW, et al. Relationship of hospital teaching status with quality of care and mortality for Medicare patients with acute MI. JAMA. 2000;284(10):1256-1262.
33. Baldwin LM, MacLehose RF, Hart LG, Beaver SK, Every N, Chan L. Quality of care for acute myocardial infarction in rural and urban US hospitals. J Rural Health. 2004;20(2):99-108.
34. Bradley EH, Herrin J, Curry LA, et al. Variation in hospital mortality rates for patients with acute myocardial infarction. Am J Cardiol. 2010;106(8):1108-1112.
35. Krumholz HM, Chen J, Rathore SS, Wang Y, Radford MJ. Regional variation in the treatment and outcomes of myocardial infarction: investigating New England's advantage. Am Heart J. 2003;146(2):242-249.
36. Popescu I, Werner RM, Vaughan-Sarrazin MS, Cram P. Characteristics and outcomes of America's lowest-performing hospitals [published correction appears in Circ Cardiovasc Qual Outcomes. 2011;4(3):e2]. Circ Cardiovasc Qual Outcomes. 2009;2(3):221-227.
37. Ross JS, Normand S-LT, Wang Y, et al. Hospital volume and 30-day mortality for three common medical conditions. N Engl J Med. 2010;362(12):1110-1118.
38. Thiemann DR, Coresh J, Oetgen WJ, Powe NR. The association between hospital volume and survival after acute myocardial infarction in elderly patients. N Engl J Med. 1999;340(21):1640-1648.
39. Bradley EH, Herrin J, Elbel B, et al. Hospital quality for acute myocardial infarction: correlation among process measures and relationship with short-term mortality. JAMA. 2006;296(1):72-78.
40. Ryan AM, Burgess JF Jr, Tompkins CP, Wallack SS. The relationship between Medicare's process of care quality measures and mortality. Inquiry. 2009;46(3):274-290.
42. Joynt KE, Orav EJ, Jha AK. Thirty-day readmission rates for Medicare beneficiaries by race and site of care. JAMA. 2011;305(7):675-681.
44. Gabow PA, Mehler PS. A broad and structured approach to improving patient safety and quality: lessons from Denver Health. Health Aff (Millwood). 2011;30(4):612-618.
45. Jimmerson C, Weber D, Sobek DK II. Reducing waste and errors: piloting Lean principles at Intermountain Healthcare. Jt Comm J Qual Patient Saf. 2005;31(5):249-257.
46. Pham HH, Ginsburg PB, McKenzie K, Milstein A. Redesigning care delivery in response to a high-performance network: the Virginia Mason Medical Center. Health Aff (Millwood). 2007;26(4):w532-w544.
47. Pryor D, Hendrich A, Henkel RJ, Beckmann JK, Tersigni AR. The quality “journey” at Ascension Health: how we’ve prevented at least 1,500 avoidable deaths a year—and aim to do even better. Health Aff (Millwood). 2011;30(4):604-611.
48. Shortell SM, Zimmerman JE, Rousseau DM, et al. The performance of intensive care units: does good management make a difference? Med Care. 1994;32(5):508-525.
49. Bradley EH, Holmboe ES, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM. A qualitative study of increasing beta-blocker use after myocardial infarction: why do some hospitals succeed? JAMA. 2001;285(20):2604-2611.
50. Keroack MA, Youngberg BJ, Cerese JL, Krsek C, Prellwitz LW, Trevelyan EW. Organizational factors associated with high performance in quality and safety in academic medical centers. Acad Med. 2007;82(12):1178-1186.
51. Jha AK, Orav EJ, Li Z, Epstein AM. The inverse relationship between mortality rates and performance in the Hospital Quality Alliance measures. Health Aff (Millwood). 2007;26(4):1104-1110.
52. Werner RM, Bradlow ET. Relationship between Medicare's hospital compare performance measures and mortality rates. JAMA. 2006;296(22):2694-2702.
53. Werner RM, Bradlow ET. Public reporting on hospital process improvements is linked to better patient outcomes. Health Aff (Millwood). 2010;29(7):1319-1324.
54. Wooldridge JM. Econometric Analysis of Cross Section and Panel Data. Cambridge, MA: MIT Press; 2002.
55. Bloom N, Eifert B, Mahajan A, McKenzie D, Roberts J. Does management matter? Evidence from India. NBER Working Paper 16658. http://www.nber.org/papers/w16658. Accessed January 16, 2012.