Figure 1. Sample Report Card
Figure 2. Flow Diagram of Study Hospitals and Patients With Acute Myocardial Infarction (AMI)
Figure 3. Monthly Use of β-Blockers Within 30 Days After Discharge Among AMI Patients Admitted to Rapid- vs Delayed-Feedback Hospitals (January 1999 to March 2003)
Table 1. Characteristics of Study Hospitals and Patients With AMI at Baseline (1999-2000)
Table 2. Quality of Care for Patients With AMI at Baseline
Table 3. Characteristics of Patients With AMI Admitted 4 to 10 Months After Report Card Dissemination at Rapid-Feedback Hospitals (October 2002–March 2003)
Table 4. Quality of Care for Patients With AMI Admitted 4 to 10 Months After Report Card Dissemination at Rapid-Feedback Hospitals
Table 5. Mean Change in Quality Indicators Between Follow-up and Baseline Among Individual Study Hospitals
Original Contribution
July 20, 2005

Administrative Data Feedback for Effective Cardiac Treatment: AFFECT, A Cluster Randomized Trial

Author Affiliations: Division of Clinical Epidemiology, McGill University Health Centre, Montreal, Quebec (Ms Beck, Mr Richard, and Dr Pilote); Institute for Clinical Evaluative Sciences and Division of General Internal Medicine, Sunnybrook and Women’s College Health Sciences Centre, University of Toronto, Toronto, Ontario (Dr Tu).

JAMA. 2005;294(3):309-317. doi:10.1001/jama.294.3.309

Context Hospital report cards are increasingly being implemented for quality improvement despite a lack of strong evidence to support their use.

Objective To determine whether hospital report cards constructed using linked hospital and prescription administrative databases are effective for improving quality of care for acute myocardial infarction (AMI).

Design The Administrative Data Feedback for Effective Cardiac Treatment (AFFECT) study, a cluster randomized trial.

Setting and Patients Patients with AMI who were admitted to 76 acute care hospitals in Quebec that treated at least 30 AMI patients per year between April 1, 1999, and March 31, 2003.

Intervention Hospitals were randomly assigned to receive rapid (immediate; n = 38 hospitals and 2533 patients) or delayed (14 months; n = 38 hospitals and 3142 patients) confidential feedback on quality indicators constructed using administrative data.

Main Outcome Measures Quality indicators pertaining to processes of care and outcomes of patients admitted between 4 and 10 months after randomization. The primary indicator was the proportion of elderly survivors of AMI at each study hospital who filled a prescription for a β-blocker within 30 days after discharge.

Results At follow-up, adjusted prescription rates within 30 days after discharge were similar in the rapid- vs delayed-feedback groups (for β-blockers, odds ratio [OR], 1.06; 95% confidence interval [CI], 0.82-1.37; for angiotensin-converting enzyme inhibitors, OR, 1.17; 95% CI, 0.90-1.52; for lipid-lowering drugs, OR, 1.14; 95% CI, 0.86-1.50; and for aspirin, OR, 1.05; 95% CI, 0.84-1.33). In addition, adjusted mortality was similar in both groups, as were length of in-hospital stay, physician visits after discharge, waiting times for invasive cardiac procedures, and readmissions for cardiac complications.

Conclusions Feedback based on one-time, confidential report cards constructed using administrative data is not an effective strategy for quality improvement regarding care of patients with AMI. A need exists for further studies to rigorously evaluate the effectiveness of more intensive report card interventions.

Despite widespread dissemination of evidence-based guidelines for management of acute myocardial infarction (AMI), many patients are not receiving recommended treatments.1,2 For example, from 1997 to 2000, rates of prescription for β-blockers within 30 days of discharge for elderly patients with AMI were as low as 43% in certain Canadian regions.3 In the United States in 1998, less than 70% of Medicare patients were prescribed β-blockers at discharge from certain Michigan hospitals.4 There is therefore increasing interest in implementing quality improvement strategies for AMI care.5

One quality improvement strategy that has been suggested is provision of feedback on “quality indicators” to hospitals and clinicians treating AMI patients.6 Quality indicators are defined as a summary of clinical performance over a specified time.7 It is suggested that “report cards” presenting a summary of quality indicators relevant to care provided by individual hospitals can catalyze quality improvement at these hospitals.5 Ideally, hospital report cards provide clinicians with an accurate picture of the care they deliver and provide benchmarks for comparison, such as the care delivered at other hospitals or recommended target rates. Although public reporting strategies have been used, some have argued that confidential data feedback is sufficient to stimulate quality improvement.8

Hospital report cards are increasingly being implemented in the United States and some parts of Canada as a strategy for quality improvement in many areas of health care.9-11 However, there has been limited implementation of hospital report cards specific to AMI care. Some critics are skeptical of risk-adjustment methods and the accuracy of data coding.12-14 Others are concerned that report cards will yield only small gains in return for the substantial health care resources they consume.15 In Canada, the unique, population-based administrative databases available to describe the care and outcomes of AMI present an opportunity to address these concerns. Hospital discharge databases can be linked to physician claims and outpatient prescription databases, providing a comprehensive summary of care and outcomes. Constructing report cards from these linked databases requires considerably fewer resources than constructing them from data abstracted from hospital charts. In addition, the accuracy of these data and the validity of risk-adjustment methods using these databases have been extensively evaluated and confirmed.16-18 Of note, a Medicare drug benefit may eventually permit construction of similar linked databases in the United States.19

We used a controlled experiment to determine whether hospital report cards constructed using linked administrative databases are effective for improving AMI care. We conducted this study in the Canadian province of Quebec, a report card–naive region, where most health care practitioners had no prior experience with either publicized or confidential report cards. Quebec acute care hospitals were randomized to receive rapid (immediately after randomization) or delayed (14 months after randomization) confidential feedback on quality indicators constructed using administrative data. We used this cluster randomization approach to minimize the potential for contamination among individual physicians and because our observations were aimed at the hospital level. Confidential data reporting minimized the potential for contamination between study groups. Confidential reporting also permitted an initial evaluation of effectiveness in a report card–naive region without marked potential for antagonizing the medical community, as sometimes occurs with public feedback. To the best of our knowledge, this is the first randomized trial to evaluate the effectiveness of administrative data report cards, including those specific to AMI care as well as other areas of health care.

METHODS
Data Sources

We used encrypted Medicare numbers to link the Quebec hospital discharge summary database (Maintenance et Exploitation des Données pour l’Étude de la Clientèle Hospitalière [Med-Echo]) with provincial physician and drug claims databases (la Régie de l’Assurance Maladie du Québec [RAMQ]). The Med-Echo database was used to identify AMI patients for inclusion in the study cohort as well as to obtain patient demographic and comorbid disease characteristics. The RAMQ physician claims database was used to obtain data on inpatient and outpatient cardiac procedures and physician visits. The RAMQ drug claims database was used to obtain data on outpatient prescriptions filled for all patients aged 65 years or older who were enrolled in the provincial drug plan (approximately 96% of this age group). A previous study demonstrated the validity and accuracy of these data.16 Survival data were obtained for close to 100% of the AMI cohort from the RAMQ database.20

For the creation of hospital report cards, we obtained data on all AMI patients admitted during the 1999-2000 fiscal year (April 1, 1999, to March 31, 2000). Complete follow-up data were available from the date of admission to March 31, 2000. For the analyses of report card effectiveness, we obtained data on all AMI patients admitted between October 1, 2002, and March 31, 2003. Complete Med-Echo and RAMQ follow-up data were available from the date of admission to March 31, 2003, and March 31, 2004, respectively.

Patients

The inclusion and exclusion criteria were established by a Canadian consensus panel.21 Briefly, the inclusion criterion was a most responsible diagnosis of AMI (International Classification of Diseases, Ninth Revision code 410.x). Exclusion criteria were (1) not admitted to an acute care hospital; (2) admission to noncardiac surgical service; (3) transfer from another acute care facility; (4) AMI coded as an in-hospital complication; (5) discharge alive with total length of stay of 2 days or less; (6) previous AMI within the past year; (7) age younger than 20 years or older than 105 years; and (8) invalid health card number.
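To make the cohort definition concrete, the sketch below shows how these inclusion and exclusion criteria might be applied to a linked administrative data set. All column names are hypothetical assumptions for illustration, not the study's actual field names.

```python
# Hypothetical application of the consensus panel's inclusion/exclusion
# criteria; all column names are illustrative assumptions.
import pandas as pd

def build_ami_cohort(df: pd.DataFrame) -> pd.DataFrame:
    """Keep admissions with a most responsible diagnosis of AMI (ICD-9 410.x)
    and drop records meeting any of the eight exclusion criteria."""
    is_ami = df["most_responsible_dx"].astype(str).str.startswith("410")
    keep = (
        is_ami
        & df["acute_care_admission"]                                    # (1)
        & ~df["noncardiac_surgical_service"]                            # (2)
        & ~df["transfer_from_acute_care"]                               # (3)
        & ~df["ami_coded_as_inhospital_complication"]                   # (4)
        & ~(df["discharged_alive"] & (df["length_of_stay_days"] <= 2))  # (5)
        & ~df["previous_ami_within_1y"]                                 # (6)
        & df["age"].between(20, 105)                                    # (7)
        & df["valid_health_card"]                                       # (8)
    )
    return df.loc[keep]
```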

Hospitals

All acute care hospitals in Quebec admitting at least 30 AMI patients per year were eligible to participate in this study (n = 77). The cutoff of 30 patients was used to attempt to ensure an adequate sample size for statistically stable estimates.

Intervention

The trial intervention included provision of confidential feedback to coronary care unit directors, chief executive officers, and directors of professional services of the study hospitals. Feedback was provided in the form of a hospital report card presenting information on 12 quality indicators for AMI care that were developed by a Canadian consensus panel21 (Figure 1). Most indicators summarized processes of care because previous surveys of physicians have indicated that such “actionable” indicators have greater utility than indicators related to patient outcomes.22 The content and format of the report cards were developed partly based on recommendations from previous studies that the data be benchmarked against a reasonable comparison group,4 as well as limited and graphically displayed.23 Quality indicators reflecting patient outcomes were risk adjusted according to validated methods,18 as previous studies have reported that health care practitioners are skeptical of the comparability of these outcomes among different hospitals.12 The suggestions of 2 cardiologists working in McGill University hospitals were also taken into account when designing report cards.

Attempts were made to encourage dissemination of report card data. Each contact person received a package containing (1) a cover letter; (2) an information sheet; (3) 10 paper copies of the report card; (4) an electronic copy of the report card and a PowerPoint presentation (Microsoft Inc, Redmond, Wash) summarizing the report card data; (5) acetate copies of the PowerPoint presentation; and (6) a stamped, self-addressed postcard to indicate that the report card had been received. After 2 months, a reminder was sent to each contact person to encourage report card dissemination.

The report card package was delivered to all hospitals in the rapid feedback group in May 2002 and to the delayed feedback group in July 2003.

Outcomes

The primary outcome variable was the proportion of elderly survivors of AMI at each study hospital who filled a prescription for a β-blocker within 30 days after discharge. The secondary outcomes were 12 additional quality indicators.

We chose the primary outcome variable for a number of reasons. First, previous studies have suggested that the provision of data feedback is most likely to improve prescribing practice rather than improve other processes and outcomes of care.9,23 Second, this indicator received one of the highest ratings from a Canadian consensus panel in terms of potential for improvement, meaningfulness, usefulness, and impact.21 Third, it was feasible to create this indicator since we had extensive experience creating, validating, and using this variable.20 Fourth, unlike several of the other Canadian indicators, this indicator had been used to describe quality of care and effectiveness of data feedback in other study populations.5 Therefore, using this indicator permitted comparison with other studies. Finally, β-blocker use is almost universally recommended after AMI, while the recommended target rates for other processes and outcomes of AMI care are less certain.
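As an illustration of how the primary indicator could be derived from the linked databases, the following sketch computes, per hospital, the proportion of elderly AMI survivors with a β-blocker claim filled within 30 days of discharge. The column names and drug-class flag are assumptions for illustration, not the study's actual variables.

```python
# Minimal sketch of the primary quality indicator; assumes discharge_date and
# fill_date are already parsed as datetime columns.
import pandas as pd

def beta_blocker_within_30d(discharges: pd.DataFrame,
                            drug_claims: pd.DataFrame) -> pd.Series:
    """Per-hospital proportion of elderly AMI survivors filling a beta-blocker
    prescription within 30 days after discharge."""
    survivors = discharges.query("age >= 65 and discharged_alive")
    bb_claims = drug_claims[drug_claims["drug_class"] == "beta_blocker"]
    merged = survivors.merge(bb_claims, on="encrypted_id", how="left")
    days_to_fill = (merged["fill_date"] - merged["discharge_date"]).dt.days
    merged["filled_within_30d"] = days_to_fill.between(0, 30)
    # Any qualifying fill counts once per patient; then average within hospitals.
    per_patient = merged.groupby(["hospital_id", "encrypted_id"])["filled_within_30d"].any()
    return per_patient.groupby(level="hospital_id").mean()
```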

Sample Size

Our power calculations were based on the formula outlined by Donner and Klar24 for cluster randomized trials with a binary study outcome. We judged a difference in prescription rates of 5% between intervention and control hospitals to be the minimum clinically important difference. This estimate was based on our clinical judgment and on a nonrandomized study that found an increase in β-blocker prescription rates of 5.6% (95% confidence interval, 2.7%-8.6%) after 6 months at hospitals that received data feedback.25 Based on data for patients admitted in 1999, the analysis of variance estimator of the intracluster correlation coefficient was calculated as 0.015. With 38 hospitals in each group, we had 80% power to detect the 5% difference in β-blocker prescription rates at the 5% level of significance, assuming that the number of patients per hospital (m) equaled 79. The estimate of m was derived using the data for patients admitted in 1999 and the formula for the case of varying cluster sizes.24
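As a rough check on this calculation (not the authors' exact derivation), the sketch below applies the standard cluster-trial logic described by Donner and Klar: compute the sample size for an individually randomized comparison of two proportions, inflate it by the design effect 1 + (m - 1) * ICC, and convert to clusters per arm. The 70% baseline prescription rate is an assumed value for illustration.

```python
# Rough sketch of the cluster-randomized sample size logic (Donner & Klar):
# an individually randomized two-proportion sample size, inflated by the
# design effect 1 + (m - 1) * icc, then converted to clusters per arm.
from scipy.stats import norm

def clusters_per_arm(p0: float, p1: float, icc: float, m: float,
                     alpha: float = 0.05, power: float = 0.80) -> float:
    z_alpha, z_beta = norm.ppf(1 - alpha / 2), norm.ppf(power)
    p_bar = (p0 + p1) / 2
    # Simple pooled-variance formula for two independent proportions.
    n_per_arm = (z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar) / (p0 - p1) ** 2
    design_effect = 1 + (m - 1) * icc
    return n_per_arm * design_effect / m

# A 5-percentage-point difference with ICC = 0.015 and m = 79, as in the text
# (baseline rate of 70% is assumed for illustration):
print(round(clusters_per_arm(0.70, 0.75, icc=0.015, m=79), 1))
```

This simplified version yields roughly 34 clusters per arm, broadly consistent with the 38 hospitals per group reported above; the published calculation may use a slightly different variance formula or adjustment for varying cluster sizes.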

Intervention Allocation Procedures

The study hospital was the unit of randomization. Intervention allocation was based on a stratified randomization procedure with a blocking size of 4. Hospitals were stratified by volume of AMI admissions during 1999-2000 (high or low volume, defined by the 50th percentile values across all study hospitals) and by availability of on-site cardiac catheterization facilities. These variables have been shown to be associated with differences in aspects of AMI care, such as prescription rates for β-blockers,26,27 use of and waiting times for cardiac procedures,28 the specialty of the treating physician, and hospital teaching status.28 It has also been reported that health care practitioners do not believe data from high-volume hospitals to be relevant to care at smaller-volume hospitals.14

A research assistant used computer-generated randomization procedures. To minimize the potential for selection bias, the research assistant was blinded to the names of the study hospitals until randomization was complete.
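The allocation procedure could be reproduced along the following lines: within each stratum defined by AMI volume and on-site catheterization availability, assignments are drawn in permuted blocks of 4 (two rapid, two delayed). This is a hypothetical sketch of the procedure described above, not the study's actual randomization code.

```python
# Hypothetical sketch of stratified randomization with permuted blocks of 4.
import random

def allocate(hospitals_by_stratum: dict, seed: int = 0) -> dict:
    """Assign each hospital to 'rapid' or 'delayed' feedback within its stratum."""
    rng = random.Random(seed)
    assignment = {}
    for stratum, hospitals in hospitals_by_stratum.items():
        arms = []
        while len(arms) < len(hospitals):
            block = ["rapid", "rapid", "delayed", "delayed"]  # blocking size of 4
            rng.shuffle(block)
            arms.extend(block)
        assignment.update(zip(hospitals, arms))
    return assignment

# Example with coded hospital identifiers (strata: AMI volume x on-site catheterization).
strata = {
    ("high_volume", "cath"): ["H01", "H02", "H03", "H04"],
    ("low_volume", "no_cath"): ["H05", "H06", "H07", "H08"],
}
print(allocate(strata))
```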

Additional efforts were made to minimize information bias. The study investigators and data handlers were blinded to the intervention status of the study hospitals. One study investigator (L.P.), however, was available to answer any questions from the contacts at the study hospitals.

Statistical Methods

Inference in this trial was directed at the hospital, or cluster, level. Cluster-level analyses were appropriate in this case because the primary research questions focused more on the randomized unit as a whole than on individual patients.24 An intention-to-treat analysis strategy was applied, comparing outcomes at all hospitals that were randomly allocated to receive rapid vs delayed administrative data feedback. Adjusted odds ratios were calculated using generalized estimating equation extensions of logistic regression procedures for cluster randomized trials.24 The variables used in these adjustments were age, sex, comorbidities, hospital volume of AMI admissions, hospital teaching status, and presence of on-site catheterization facilities. In a further set of generalized estimating equation models, we also adjusted for the baseline measures of the quality indicator corresponding to the outcome variable of interest. We explored time trends in quality indicators following rapid vs delayed administrative data feedback through subgroup analyses according to month of admission for AMI. Our final set of analyses was at the hospital level. We measured the mean change in quality indicator values between baseline and follow-up among individual hospitals in the rapid and delayed feedback groups. We then compared the unadjusted difference in mean rates of change between the 2 hospital groups.

Statistical analyses were performed using Stata, version 6.0 (Stata Corp, College Station, Tex) and SAS, version 8.1 (SAS Institute Inc, Cary, NC) statistical software.
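For readers who want a concrete template, the following sketch fits a comparable cluster-adjusted logistic model with generalized estimating equations in Python's statsmodels rather than Stata or SAS. The data frame, variable names, and the exchangeable working correlation structure are assumptions for illustration, not the study's actual code.

```python
# Sketch of a GEE extension of logistic regression with hospitals as clusters;
# exponentiated coefficients give adjusted odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_or(df: pd.DataFrame) -> pd.Series:
    """Adjusted odds ratio (and 95% CI) for rapid vs delayed feedback on the
    primary indicator, accounting for clustering by hospital."""
    model = smf.gee(
        "beta_blocker_30d ~ rapid_feedback + age + sex + comorbidity_count"
        " + high_volume + teaching_hospital + onsite_cath",
        groups="hospital_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    res = model.fit()
    ci_low, ci_high = np.exp(res.conf_int().loc["rapid_feedback"])
    return pd.Series({"OR": np.exp(res.params["rapid_feedback"]),
                      "CI_low": ci_low, "CI_high": ci_high})
```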

Ethics Approval

The McGill University Health Centre Ethics Board provided approval for the design and conduct of this study.

RESULTS
Evidence of Receipt of Report Cards

A large proportion of hospitals in the rapid feedback group (82%) sent back their completed postcards acknowledging receipt of the study intervention materials. Several contact individuals also expressed interest in the study by e-mail or telephone, requesting further information or additional copies of study materials.

Hospital and Patient Characteristics

A total of 76 eligible hospitals were randomized (Figure 2). Baseline patient characteristics were generally similar in each group (Table 1). However, despite stratified randomization by hospital volume and on-site catheterization status, a smaller proportion of patients in the rapid feedback group were admitted to hospitals with a high volume of AMI admissions and/or on-site catheterization facilities. There was room for improvement in most quality indicators at baseline, but the indicators were similar in both groups (Table 2).

Effectiveness of Report Cards

At follow-up, patient characteristics were also similar in each group (Table 3). In general, quality of care improved from baseline in each group (Table 4). For example, rates of prescription for β-blockers increased by approximately 10 percentage points between 1999-2000 and 2002-2003. However, the overall quality of care remained similar in each group. The percentages of patients prescribed β-blockers in the rapid and delayed feedback groups were 74% and 76%, respectively (adjusted odds ratio, 1.1; 95% confidence interval, 0.8-1.4; P = .67). Adjusted mortality was similar in both groups, as were length of in-hospital stay, physician visits after discharge, waiting times for invasive cardiac procedures, and readmissions for cardiac complications (Table 4). In a further set of multivariable models, which adjusted for baseline values of the corresponding quality indicators, all quality indicator values remained similar in each group. There were no obvious time trend differences in rates of prescription for β-blockers in any 1-month period between baseline and the end of the follow-up period (Figure 3).

The average difference between follow-up and baseline in rates of prescription of β-blockers at the hospital level was an increase of 9.6% in the rapid feedback group and an increase of 5.4% in the delayed feedback group (Table 5). The difference between these 2 groups was not statistically significant but showed a trend toward a modest clinically significant benefit for the rapid feedback group (4.1%; 95% confidence interval, −2.9% to 11.2%). Among the 38 hospitals randomized to rapid feedback, 24 hospitals improved rates of β-blocker prescription by at least 5% and 20 hospitals improved by at least 10%. Among the hospitals randomized to delayed feedback, 20 hospitals improved by at least 5% and 14 hospitals by at least 10%. A small number of hospitals decreased their overall rates of prescription for β-blockers. There were no significant differences between the 2 groups for all additional quality indicators.

COMMENT

In this cluster randomized controlled trial, confidential feedback provided to hospitals in the form of report cards constructed using linked administrative data was not effective in improving quality of AMI care. Our results suggest that even if the United States eventually acquires these types of administrative data through the Medicare program, confidential feedback based on these data is unlikely to be a sufficient strategy for health care quality improvement.

The lack of previous studies of the effectiveness of hospital report cards constructed using administrative data limits interpretation of the generalizability of our findings to other regions. However, our findings are consistent with recent observational evidence. In Canada, prior to 2003, the release of hospital-level administrative data on quality indicators for AMI care had been limited to the publication of an atlas of cardiovascular care in Ontario in 1999.29 Follow-up observational studies comparing quality indicators between Ontario and other provinces did not detect a systematic impact of this atlas on AMI care or outcomes.3,30,31

Although we received anecdotal evidence that our report card intervention was well received at the study hospitals, a detailed exploration of the AMI care providers’ perceptions and use of the intervention was beyond the scope of this study. Nonetheless, when interpreted in the context of results from previous studies, our results point toward several potential reasons for the lack of effectiveness of the study intervention. One potential reason is that the administrative data were perceived as invalid or irrelevant to practice.32 Report cards constructed using chart review data may be more effective than those constructed using administrative data because physicians are less skeptical of their data quality. The presence of chart abstractors in hospitals could also increase physicians’ awareness of performance monitoring and affect their practice. Evidence from an observational study in the United States supports this hypothesis.25 However, in the Canadian context, quality indicator rates obtained from administrative data have been found to be similar to those obtained from chart reviews. A large randomized controlled trial of the effectiveness of report cards constructed using chart review data, currently under way in Ontario, will provide further evidence.33

Related to the perception of data validity is the fact that performance data take time to develop credibility within a hospital.32 Because of the practical lag time between AMI admissions and data availability, our intervention represented the first and only introduction of performance measures to the study hospitals. It is possible that the physicians at the study hospitals were not supportive of the concept of hospital report cards because it was new to them, and that they would have been more supportive had the report card intervention been introduced repeatedly.

Another potential reason is that our intervention was not intensive enough to have an impact on quality of AMI care. For example, several cluster randomized trials have provided evidence of the effectiveness of more intensive or multimodal quality improvement interventions in non-AMI patient populations.34,35 One effective intervention consisted of a combination of chart reviews and physician-specific and benchmark feedback.36 In cardiac populations, it has been suggested that an intensive intervention involving quarterly, interactive multidisciplinary team workshops among health care practitioners, as well as a Web-based performance feedback tool, is effective for quality improvement.37 Another study suggests that the use of practice guideline–based tools, such as standard orders, is more effective for quality improvement than are interventions involving feedback on quality indicators.38 Unfortunately, the amount of resources necessary to provide chart review–based report cards and more intensive interventions on a continuing basis is likely prohibitive in many regions.

One remaining possibility is that administrative data feedback would have been effective had it been publicized. Perhaps public awareness of deficiencies in quality of care is a major and necessary incentive for quality improvement. Some argue that the coronary artery bypass graft surgery report cards based on administrative data have had a positive impact on quality of care in the United States.13 The fact that public reporting may be required even in the context of Canada’s universal health care system is an interesting finding. Canadian hospitals are funded by global budgets and, thus, there are no major market or government incentives stimulating quality, only professional pride. This finding warrants further exploration of motivators for health care quality improvement in public vs market economies.

In summary, our results suggest that one-time provision of confidential hospital report cards constructed using administrative data does not appear to be sufficient for quality improvement in AMI care. More intensive interventions, which could include chart review and continuous and/or public data feedback accompanied by other multimodal interventions, such as team workshops and standard orders, may be effective, but a need remains to study these interventions and their cost-benefit ratios in well-controlled randomized trials.

Article Information

Corresponding Author: Louise Pilote, MD, MPH, PhD, Division of Clinical Epidemiology, McGill University Health Centre, 1650 Cedar Ave, Suite L10-421, Montreal, Quebec, Canada H3G 1A4 (louise.pilote@mcgill.ca).

Author Contributions: Dr Pilote had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Beck, Tu, Pilote.

Acquisition of data: Richard, Pilote.

Analysis and interpretation of data: Beck, Richard, Pilote.

Drafting of the manuscript: Beck, Pilote.

Critical revision of the manuscript for important intellectual content: Beck, Richard, Tu, Pilote.

Statistical analysis: Beck, Richard, Pilote.

Obtained funding: Tu, Pilote.

Administrative, technical, or material support: Beck, Pilote.

Study supervision: Pilote.

Financial Disclosures: None reported.

Funding/Support: Ms Beck was supported by a PhD fellowship from the Canadian Cardiovascular Outcomes Research Team, jointly funded by the Canadian Institutes for Health Research and the Heart and Stroke Foundation of Canada. Dr Tu is supported by a Canada Research Chair in Health Services Research. Dr Pilote is a research scholar of the Canadian Institutes for Health Research and a William Dawson professor of Medicine at McGill University. This project was jointly supported by operating grants to the Canadian Cardiovascular Outcomes Research Team from the Canadian Institutes for Health Research and the Heart and Stroke Foundation of Canada.

Role of the Sponsors: The funding organizations had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; or preparation, review, or approval of the manuscript.

Acknowledgment: From the Division of Clinical Epidemiology, McGill University Health Center, we acknowledge Lawrence Joseph, PhD, for his advice concerning the statistical analyses used in this study, as well as Hassan Behouli, PhD, for his assistance with data analyses. We also acknowledge the physicians and directors at Quebec hospitals.

REFERENCES
1. Krumholz H, Radford MJ, Wang Y, Chen J, Heiat A, Marciniak TA. National use and effectiveness of beta-blockers for the treatment of elderly patients after acute myocardial infarction: National Cooperative Cardiovascular Project. JAMA. 1998;280:623-629.
2. Barron HV, Michaels AD, Maynard C, Every NR. Use of angiotensin-converting enzyme inhibitors at discharge in patients with acute myocardial infarction in the United States: data from the National Registry of Myocardial Infarction 2. J Am Coll Cardiol. 1998;32:360-367.
3. Pilote L, Beck C, Karp I, et al. Secondary prevention after acute myocardial infarction in four Canadian provinces, 1997-2000. Can J Cardiol. 2004;20:61-67.
4. Mehta R, Montoye C, Gallogly M, et al. Improving quality of care for acute myocardial infarction: the guidelines applied in practice (GAP) initiative. JAMA. 2002;287:1269-1276.
5. Bradley EH, Holmboe ES, Mattera JA, Roumanis SA, Radford MJ, Krumholz HM. A qualitative study of increasing beta-blocker use after myocardial infarction: why do some hospitals succeed? JAMA. 2001;285:2604-2611.
6. Cooperative Cardiovascular Project Best Practices Working Group. Improving care for acute myocardial infarction: experience from the Cooperative Cardiovascular Project. Jt Comm J Qual Improv. 1998;24:480-490.
7. Jamtvedt G, Young JM, Kristoffersen DT, Thomson O’Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2003;(3):CD000259.
8. Ghali W, Ash A, Hall R, Moskowitz M. Statewide quality improvement initiatives and mortality after cardiac surgery. JAMA. 1997;277:379-382.
9. Tu JV, Schull MJ, Ferris LE, Hux JE, Redelmeier DA. Problems for clinical judgement, 4: surviving in the report card era. CMAJ. 2001;164:1709-1712.
10. Topol EJ, Califf RM. Scorecard cardiovascular medicine: its impact and future directions. Ann Intern Med. 1994;120:65-70.
11. Ramsay S. Cardiac surgeons in England face publication of outcomes data. Lancet. 2002;359:329.
12. Romano PS, Rainwater JA, Antonius D. Grading the graders: how hospitals in California and New York perceive and interpret their report cards. Med Care. 1999;37:295-305.
13. Bentley JM, Nash DB. How Pennsylvania hospitals have responded to publicly released reports on coronary artery bypass graft surgery. Jt Comm J Qual Improv. 1998;24:40-49.
14. Dans PE. Caveat doctor: how to analyze claims-based report cards. Jt Comm J Qual Improv. 1998;24:21-30.
15. Marshall MN, Shekelle PG, Leatherman S, Brook RH. The public release of performance data: what do we expect to gain? A review of the evidence. JAMA. 2000;283:1866-1874.
16. Tamblyn R, Lavoie G, Petrella L, Monette J. The use of prescription claims databases in pharmacoepidemiologic research: the accuracy and comprehensiveness of the prescription claims databases in Quebec. J Clin Epidemiol. 1995;48:999-1009.
17. Levy A, Tamblyn RM, Fitchett D, McLeod PJ, Hanley JA. Coding accuracy of hospital discharge data for elderly survivors of myocardial infarction. Can J Cardiol. 1999;15:1277-1282.
18. Tu JV, Austin PC, Walld R, Roos L, Agras J, McDonald KM. Development and validation of the Ontario acute myocardial infarction mortality prediction rules. J Am Coll Cardiol. 2001;37:992-997.
19. Centers for Medicare & Medicaid Services (CMS). Medicare program; Medicare prescription drug benefit; interpretation. Final rule; interpretation. 70 Federal Register 13397-13400 (2005).
20. Pilote L, Lavoie F, Ho V, Eisenberg M. Changes in the treatment and outcomes of acute myocardial infarction in Quebec, 1988-1995. CMAJ. 2000;163:31-36.
21. Tran C, Lee DS, Flintoft VF, et al. CCORT/CCS Canadian quality indicators for acute myocardial infarction care. Can J Cardiol. 2003;19:38-45.
22. Thomas JW. Report cards—useful to whom and for what? Jt Comm J Qual Improv. 1998;24:50-51.
23. Rainwater JA, Romano PS, Antonius DM. The California hospital outcomes project: how useful is California’s report card for quality improvement? Jt Comm J Qual Improv. 1998;24:31-39.
24. Donner A, Klar N. Design and Analysis of Cluster Randomization Trials in Health Research. New York, NY: Oxford University Press; 2000.
25. Marciniak TA, Ellerbeck EF, Radford MJ, et al. Improving the quality of care for Medicare patients with acute myocardial infarction: results from the Cooperative Cardiovascular Project. JAMA. 1998;279:1351-1357.
26. Allison JJ, Kiefe CI, Weissman NW, et al. Relationship of hospital teaching status with quality of care and mortality for Medicare patients with acute MI. JAMA. 2000;284:1256-1262.
27. Ayanian J, Hauptman P, Guadagnoli E, Antman E, Pashos C, McNeil B. Knowledge and practices of generalist and specialist physicians regarding drug therapy for acute myocardial infarction. N Engl J Med. 1994;331:1136-1142.
28. Halabi AR, Eisenberg MJ, Richard H, Beck CA, Pilote L. Impact of on-site availability of cardiac catheterization on cardiac procedure use and quality of life after acute myocardial infarction. Can J Cardiol. 2001;17:152C.
29. Tu JV, Austin P, Rochon P, Zhang H. Secondary prevention after acute myocardial infarction, congestive heart failure and coronary artery bypass graft surgery in Ontario. In: Naylor CD, Slaughter PM, eds. Cardiovascular Health and Services in Ontario: An ICES Atlas. Toronto, Ontario: Institute for Clinical Evaluative Sciences; 1999:199-238.
30. Pilote L, Merrett P, Karp I, et al. Cardiac procedures after an acute myocardial infarction across nine Canadian provinces. Can J Cardiol. 2004;20:491-500.
31. Tu J, Austin P, Filate W, et al. Outcomes of acute myocardial infarction in Canada. Can J Cardiol. 2003;19:893-901.
32. Bradley E, Holmboe E, Mattera J, Roumanis S, Radford M, Krumholz H. Data feedback efforts in quality improvement: lessons learned from US hospitals. Qual Saf Health Care. 2004;13:26-31.
33. Institute for Clinical Evaluative Sciences. Quality of Cardiac Care in Ontario. Toronto, Ontario: Institute for Clinical Evaluative Sciences; 2004.
34. Horbar J, Carpenter J, Buzas J, et al. Collaborative quality improvement to promote evidence based surfactant for preterm infants: a cluster randomized trial. BMJ. 2004;329:1004.
35. Ornstein S, Jenkins R, Nietert P, et al. A multimethod quality improvement intervention to improve preventive cardiovascular care: a cluster randomized trial. Ann Intern Med. 2004;141:523-532.
36. Kiefe C, Allison J, Williams O, Person S, Weaver M, Weissman N. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA. 2001;285:2871-2879.
37. LaBresh K, Ellrodt A, Gliklich R, Liljestrand J, Peto R. Get with the guidelines for cardiovascular secondary prevention: pilot results. Arch Intern Med. 2004;164:203-209.
38. Mehta RH, Montoye CK, Faul J, et al. Enhancing quality of care for acute myocardial infarction: shifting the focus of improvement from key indicators to process of care and tool use: the American College of Cardiology Acute Myocardial Infarction Guidelines Applied in Practice Project in Michigan: Flint and Saginaw Expansion. J Am Coll Cardiol. 2004;43:2166-2173.