Statewide cardiac surgery outcome reporting is intended to inform patient and clinician decisions. However, on the day of release, these reports often describe performance from several years prior.1,2 Past performance may inform quality,3-5 and a shorter time lag improves the relevance of the report,4 yet publication continues to lag by several years. We used California and New York data to compare performance as reported at the time of report card release with contemporary performance at that time, as established by subsequent public reports.
We used data on isolated coronary artery bypass graft surgery (CABG) from publicly available statewide cardiac surgery outcome reports in New York and California.1,2 We analyzed cases performed between 2013 and 2016 (2016 data are the latest published for both states). Of 164 centers, we included those performing at least 1 isolated CABG per year between 2013 and 2016. Because the data were publicly available, the Yale institutional review board waived approval and the need for patient consent.
We used the observed-to-expected (O-E) operative mortality ratio as the standardized metric of risk-adjusted outcome.1,2 The O-E ratio is defined as the ratio of the observed to the expected mortality rate for each center. Expected mortality is calculated from state-specific risk models that account for patient factors.
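To make the metric concrete, a minimal sketch in Python (the analysis language reported below) computes a center's O-E ratio from observed deaths and hypothetical model-predicted mortality probabilities; the predicted probabilities stand in for the output of the state-specific risk models, which are not reproduced here.

    # Illustrative sketch: O-E ratio from observed deaths and
    # model-predicted mortality probabilities (hypothetical inputs;
    # not the states' actual risk models).
    def oe_ratio(observed_deaths, predicted_probs):
        expected_deaths = sum(predicted_probs)  # expected mortality count
        return observed_deaths / expected_deaths

    # Example: 4 observed deaths among 100 hypothetical cases
    predicted = [0.02, 0.05, 0.01, 0.03, 0.10] * 20
    print(oe_ratio(4, predicted))  # 4 / 4.2 = 0.95, near-expected mortality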
The 2013 performance was made available in 2016, and we plotted O-E ratios in 2013 and 2016 to visualize the differences. We categorized centers as having extreme outcomes (O-E ratio >2 or <0.5) or nonextreme outcomes (O-E ratio 0.5-2), because extreme outcomes are the most likely to influence patient choice. We defined the O-E ratio thresholds a priori to facilitate interpretation of the frequency of O-E ratio changes. Values are reported as mean (SD). Analyses were conducted using Python, version 3.6 (Python Software Foundation).
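An illustrative sketch of the a priori categorization, using hypothetical center values rather than study data:

    # Illustrative sketch of the a priori O-E categories.
    def categorize(oe):
        if oe > 2:
            return "extreme-high"  # more than twice expected mortality
        if oe < 0.5:
            return "extreme-low"   # less than half expected mortality
        return "nonextreme"        # O-E ratio 0.5-2

    # Hypothetical center-level O-E ratios:
    for center, oe in {"Center A": 2.3, "Center B": 0.9, "Center C": 0.4}.items():
        print(center, categorize(oe))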
We included 119 California centers and 36 New York centers. Mean (SD) center-level annual case volume was 127 (107) (Table). In 2013, 22 centers (14%) had O-E ratios greater than 2, 92 centers (59%) had O-E ratios between 0.5 and 2, and 41 centers (26%) had O-E ratios less than 0.5.
During the 3-year time lag, center O-E ratios changed by a mean (SD) of 1.0 (1.7). Of the 41 centers that had less than half the expected mortality in 2013, only 12 centers (29%) continued to have less than half the expected mortality, and 9 centers (22%) had more than twice the expected mortality in 2016. Of the 22 centers that had more than twice the expected mortality in 2013, only 6 centers (27%) continued to have more than twice the expected mortality, and 5 centers (23%) had less than half the expected mortality in 2016 (Figure).
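A sketch of how such category transitions could be tabulated between report years, again using hypothetical values rather than the study dataset:

    # Illustrative sketch: tabulate O-E category transitions across
    # report years (hypothetical inputs, not the study dataset).
    from collections import Counter

    def categorize(oe):  # same a priori thresholds as above
        return "extreme-high" if oe > 2 else "extreme-low" if oe < 0.5 else "nonextreme"

    oe_2013 = {"Center A": 0.4, "Center B": 2.5, "Center C": 1.1}  # hypothetical
    oe_2016 = {"Center A": 2.1, "Center B": 0.3, "Center C": 1.0}  # hypothetical

    transitions = Counter(
        (categorize(oe_2013[c]), categorize(oe_2016[c])) for c in oe_2013
    )
    for (cat13, cat16), n in transitions.items():
        print(f"{cat13} -> {cat16}: {n}")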
Patients who rely on the point estimates in these public reports, based on 3-year-old data, could draw incorrect inferences about the current quality of their prospective clinicians. Hospital-level risk-adjusted surgical outcomes changed for many programs between the year of outcome measurement and the year the report was published. Although a prior study4 showed that using 2-year rather than 3-year-old data better predicted current quality, we showed that discrepancies between reported and contemporary outcomes persist. The change was more prominent among centers with extreme outcomes in the year of outcome measurement, suggesting instability of estimates at the extremes and regression to the mean.6 Consistent with our findings, only 1 of 155 centers had 2013 CABG mortality that was statistically significantly different from expected.1,2
States reported that validating the data delayed the publication of reports.1 The process may be shortened with currently available digital platforms that facilitate data centralization and analysis, as is done in the Scientific Registry of Transplant Recipients, for example. Bayesian methods may aid interpretation of statistical noise related to low volume. For public reporting to remain relevant to decision-making and quality improvement, we must collect, analyze, and disseminate information in near-real time.
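One possible approach, sketched here under an assumed gamma-Poisson model (this is not the study's method): empirical-Bayes shrinkage of the O-E ratio, which pulls noisy low-volume estimates toward 1 in proportion to their uncertainty.

    # Illustrative empirical-Bayes sketch (assumed model, not the
    # study's method): observed deaths O ~ Poisson(theta * E); with a
    # Gamma(a, b) prior on theta, the posterior mean is (a + O) / (b + E).
    # Setting a = b centers the prior on O-E = 1, so small centers
    # shrink toward 1 while large centers barely move.
    def shrunken_oe(observed, expected, prior_strength=5.0):
        a = b = prior_strength  # hypothetical prior; tune to the data
        return (a + observed) / (b + expected)

    # A small center: 2 deaths vs 0.8 expected -> raw O-E = 2.5
    print(2 / 0.8)              # 2.5 (noisy raw estimate)
    print(shrunken_oe(2, 0.8))  # ~1.21, pulled toward 1
    # A large center with the same raw ratio barely moves:
    print(shrunken_oe(50, 20.0))  # raw 2.5; ~2.2 shrunken

The prior strength (5 expected deaths here) is a hypothetical tuning choice; in practice it would be estimated from the observed distribution of center outcomes.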
Corresponding Author: Harlan M. Krumholz, MD, SM, Center for Outcomes Research and Evaluation, One Church Street, Ste 200, New Haven, CT 06510 (harlan.krumholz@yale.edu).
Accepted for Publication: December 26, 2019.
Published Online: March 4, 2020. doi:10.1001/jamasurg.2019.6367
Author Contributions: Dr Mori had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: Mori, Krumholz.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Mori.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Mori, Lin.
Administrative, technical, or material support: Suter.
Supervision: Shahian, Suter, Krumholz.
Conflict of Interest Disclosures: Dr Krumholz was a recipient of a research grant, through Yale, from Medtronic and the US Food and Drug Administration to develop methods for postmarket surveillance of medical devices; is a recipient of research agreements with Medtronic and Johnson & Johnson (Janssen) through Yale to develop methods of clinical trial data sharing; works under contract with the Centers for Medicare & Medicaid Services through Yale to develop and maintain performance measures that are publicly reported; was a recipient of a research agreement through Yale from the Shenzhen Center for Health Information for work to advance intelligent disease prevention and health promotion, and collaborates with the National Center for Cardiovascular Diseases in Beijing; received payment from the Arnold and Porter Law Firm for work related to the Sanofi clopidogrel litigation and from the Ben C. Martin Law Firm for work related to the Cook inferior vena cava filter litigation; chairs a Cardiac Scientific Advisory Board for UnitedHealth; is a participant/participant representative of the IBM Watson Health Life Sciences Board; is a member of the Advisory Boards for Element Science, Facebook, and the Physician Advisory Board for Aetna; and is the founder of Hugo, a personal health information platform. Dr Mori received funding support from the Yale Clinical and Translational Science Award (grant UL1TR001863) from the National Center for Advancing Translational Science, a component of the National Institutes of Health. No other disclosures were reported.