Figure.  Surgical Supply Costs in the Intervention vs Control Groups

A, Shown is the percentage change in the median surgical supply direct cost (in US dollars) during the study period (2015) in the intervention vs control groups. B, Shown is the difference between observed and expected total surgical supply spending between 2013 and 2015. Positive values indicate that the group was more costly than expected, and negative values indicate that the group was less costly than expected.

Table 1.  Multivariable Model for the Intervention Effect on Surgical Supply Costs
Table 2.  Surgical Supply Spending per Department During the Study Period
Table 3.  Change in the Case Mix Index–Adjusted Surgical Supply Cost by Department
Table 4.  Intervention Effect on Patient Outcomes
Original Investigation
March 2017

Association Between Surgeon Scorecard Use and Operating Room Costs

Author Affiliations
  • 1Department of Neurological Surgery, University of California, San Francisco
  • 2UCSF Center for Healthcare Value, University of California, San Francisco
  • 3Department of Internal Medicine, Dell Medical School at The University of Texas at Austin
  • 4Department of Medicine, University of California, San Francisco
  • 5Department of Medicine, University Hospital Zurich, Zurich, Switzerland
  • 6Department of Surgery and Perioperative Care, Dell Medical School at The University of Texas at Austin
  • 7Department of Orthopedic Surgery, University of California, San Francisco
  • 8Healthcare Technology Assessment Program, University of California, San Francisco
  • 9Department of Otolaryngology–Head and Neck Surgery, University of California, San Francisco
  • 10Philip R. Lee Institute for Health Policy Studies, University of California, San Francisco
  • 11Continuous Process Improvement, UCSF Health, University of California, San Francisco
JAMA Surg. 2017;152(3):284-291. doi:10.1001/jamasurg.2016.4674
Key Points

Question  What is the association between providing surgeons with individualized cost feedback and surgical supply costs?

Findings  In this prospective controlled study, surgeons in the intervention group received cost feedback scorecards during the study period, while those in the control group did not. The median surgical supply direct costs per case decreased 6.54% in the intervention group compared with a 7.42% increase in the control group.

Meaning  Cost feedback to surgeons was associated with significantly reduced surgical supply costs.

Abstract

Importance  Despite the significant contribution of surgical spending to health care costs, most surgeons are unaware of their operating room costs.

Objective  To examine the association between providing surgeons with individualized cost feedback and surgical supply costs in the operating room.

Design, Setting, and Participants  The OR Surgical Cost Reduction (OR SCORE) project was a single–health system, multihospital, multidepartmental prospective controlled study in an urban academic setting. Intervention participants were attending surgeons in orthopedic surgery, otolaryngology–head and neck surgery, and neurological surgery (n = 63). Control participants were attending surgeons in cardiothoracic surgery, general surgery, vascular surgery, pediatric surgery, obstetrics/gynecology, ophthalmology, and urology (n = 186).

Interventions  From January 1 to December 31, 2015, each surgeon in the intervention group received standardized monthly scorecards showing the median surgical supply direct cost for each procedure type performed in the prior month compared with the surgeon’s baseline (July 1, 2012, to November 30, 2014) and compared with all surgeons at the institution performing the same procedure at baseline. All surgical departments were eligible for a financial incentive if they met a 5% cost reduction goal.

Main Outcomes and Measures  The primary outcome was each group’s median surgical supply cost per case. Secondary outcome measures included total departmental surgical supply costs, case mix index–adjusted median surgical supply costs, patient outcomes (30-day readmission, 30-day mortality, and discharge status), and surgeon responses to a postintervention study-specific health care value survey.

Results  The median surgical supply direct costs per case decreased 6.54% in the intervention group, from $1398 (interquartile range [IQR], $316-$5181) (10 637 cases) in 2014 to $1307 (IQR, $319-$5037) (11 820 cases) in 2015. In contrast, the median surgical supply direct cost increased 7.42% in the control group, from $712 (IQR, $202-$1602) (16 441 cases) in 2014 to $765 (IQR, $233-$1719) (17 227 cases) in 2015. This decrease represents a total savings of $836 147 in the intervention group during the 1-year study. After controlling for surgeon, department, patient demographics, and clinical indicators in a mixed-effects model, there was a 9.95% (95% CI, 3.55%-15.93%; P = .003) surgical supply cost decrease in the intervention group over 1 year. Patient outcomes were equivalent or improved after the intervention, and surgeons who received scorecards reported higher levels of cost awareness on the health care value survey compared with controls.

Conclusions and Relevance  Cost feedback to surgeons, combined with a small departmental financial incentive, was associated with significantly reduced surgical supply costs, without negatively affecting patient outcomes.

Introduction

More than 50 million inpatient surgical procedures were performed in the United States in 2010, costing approximately $175 billion.1 Operating room (OR) costs can account for more than 40% of hospitalization costs for surgical patients,2 with disposable supplies (implantable and nonimplantable items, such as spinal hardware, hemostatic agents, and sutures) representing a large portion of overall OR costs. Whereas clinicians cannot easily address some drivers of high surgical costs (eg, labor or hospital indirect costs), individual surgeons can directly control the supplies they use for a particular operation either through their preference card (a list of supplies and equipment needed for a specific case) or requests made in the OR. This system results in significant variability in surgical supply use by different surgeons, leading to large cost differences for similar procedures at a single institution, often without proven effect on patient outcomes.3 Despite their major influence on OR surgical supply choice, most surgeons have little knowledge of their OR costs. A recent national survey of orthopedic surgeons found that they correctly estimated the cost of a commonly used implant only 21% of the time, and their guesses ranged from 0.02 to 24.6 times the actual cost of the item.4

This finding is not surprising given that health care professionals, who drive more than 80% of health care use,5 often do not receive the information they need to include cost as a factor in their decision making. However, growing evidence shows that physicians are increasingly concerned about the rising cost of health care, and consumer-facing health care cost transparency is gaining momentum nationwide.6-10 Although early studies11-13 found limited or no success in changing physician behavior through cost transparency efforts, more recent work suggests that surgeons may choose a lower-cost disposable surgical supply in the OR when presented with cost feedback14 or forgo an expensive postoperative computed tomographic scan in a neurologically intact patient.15

These prior investigations were retrospective and small, limited to a single specialty or procedure type.14,16,17 Therefore, we sought to perform a prospective controlled study across multiple surgical departments to examine the association between providing surgeons with individualized cost feedback and surgical supply costs in the operating room.

Methods
Study Design

A single–health system, multihospital, multidepartmental prospective controlled study, the OR Surgical Cost Reduction (OR SCORE) project, was conducted from January 1 to December 31, 2015, and was sponsored by the University of California, San Francisco (UCSF) Center for Healthcare Value Caring Wisely program and UCSF Health. The study was conducted as part of UCSF Health hospital’s quality improvement initiatives and was exempt from human participants review. Publication was approved by the UCSF Committee on Human Research.

Intervention

Data from electronic medical records (Epic; Epic Systems Corporation) were aggregated and summarized in a software program (R, version 3.1.3; R Development Core Team). We then used macros (Visual Basic; Microsoft Excel) to create individualized surgeon scorecards (eFigure 1 in the Supplement). These scorecards showed the median surgical supply direct cost for each procedure type (eg, total knee replacement) that the surgeon performed in the prior month compared with the surgeon’s baseline (July 1, 2012, to November 30, 2014) and compared with all UCSF surgeons performing the same procedure at baseline.

Surgical supplies included all disposable and implantable items (eg, spinal hardware, hemostatic agents, and sutures) and excluded instrument sets or multiuse equipment, such as microscopes. For each procedure type, scorecards showed the top 10 most expensive items (by unit cost), the top 10 most frequently used items, and the top 10 “bang for your buck” items, which were ranked by the frequency of use multiplied by the unit cost and represented the most significant source of potential cost savings.
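As a rough illustration of the aggregation behind these scorecards, the sketch below computes, for one surgeon and one reporting month, the median supply cost per procedure type against the surgeon’s own baseline and the institution-wide baseline, and ranks items by frequency of use multiplied by unit cost. It is written in R (the software named above), but the package (dplyr) and the data frame and column names (cases, items, surgeon, procedure, supply_cost, and so on) are assumptions for the example, not the actual OR SCORE data model.

library(dplyr)

# cases: one row per case (surgeon, procedure, date, supply_cost)
# items: one row per supply item used (surgeon, procedure, date, item, quantity, unit_cost)
make_scorecard <- function(cases, items, surgeon_id, report_month) {
  baseline <- filter(cases, date >= as.Date("2012-07-01"), date <= as.Date("2014-11-30"))
  current  <- filter(cases, surgeon == surgeon_id, format(date, "%Y-%m") == report_month)

  # Median supply cost per procedure type: prior month vs own baseline vs all-surgeon baseline
  costs <- current %>%
    group_by(procedure) %>%
    summarise(month_median = median(supply_cost), .groups = "drop") %>%
    left_join(baseline %>% filter(surgeon == surgeon_id) %>% group_by(procedure) %>%
                summarise(own_baseline_median = median(supply_cost), .groups = "drop"),
              by = "procedure") %>%
    left_join(baseline %>% group_by(procedure) %>%
                summarise(all_baseline_median = median(supply_cost), .groups = "drop"),
              by = "procedure")

  # "Bang for your buck" items: frequency of use x unit cost, top 10 per procedure type
  top_items <- items %>%
    filter(surgeon == surgeon_id, format(date, "%Y-%m") == report_month) %>%
    group_by(procedure, item) %>%
    summarise(uses = sum(quantity), unit_cost = first(unit_cost), .groups = "drop") %>%
    mutate(total_spend = uses * unit_cost) %>%
    group_by(procedure) %>%
    slice_max(total_spend, n = 10)

  list(costs = costs, top_items = top_items)
}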

All attending surgeons in the orthopedic surgery, otolaryngology–head and neck surgery (OHNS), and neurological surgery departments operating at the main UCSF Health hospital (n = 63) received monthly scorecards (eFigure 1 in the Supplement) via email from January 1 to December 31, 2015, and were categorized as the intervention group. Attending surgeons in cardiothoracic surgery, general surgery, vascular surgery, pediatric surgery, obstetrics/gynecology, ophthalmology, and urology (n = 186) did not receive scorecards and served as the control group.

Physician champions in the intervention departments delivered educational presentations to prepare surgeons for receiving scorecards and to encourage cost reduction at their departmental meetings. All surgical departments (intervention and control) were eligible for a $50 000 financial incentive from the UCSF Health hospital’s administration to be used for academic or research purposes if they met a 5% cost reduction goal. The financial incentive was approved by the chief medical officer and publicized at the surgical chairs’ meeting. Departments received the incentive if their median surgical supply costs per case in 2015 decreased 5% compared with 2014 after adjusting for the case mix index (CMI), a measure of case complexity that represents the weighted mean of all diagnosis-related groups associated with a given patient.

Primary Outcome

The primary outcome was the median surgical supply direct costs per case calculated across each group (intervention vs control) from January 1 to December 31, 2015, compared with January 1 to December 31, 2014. Interquartile ranges (IQRs) are given where appropriate.

Secondary Outcomes

There were 4 secondary outcomes. These outcome measures included (1) total spending on surgical supply costs by intervention group and by department, (2) CMI-adjusted median surgical supply costs per case for each department, (3) patient outcomes (30-day readmission, 30-day mortality, and discharge status), and (4) surgeon responses to a postintervention study-specific health care value survey.

We determined the observed total spending (in US dollars) for each group (intervention vs control) by obtaining the mean observed cost of surgical supplies per case for each department in each year from 2012 to 2015 multiplied by the number of cases performed. We calculated the difference between the observed and expected cost for each case, with the expected cost defined as the previous year’s departmental mean cost adjusted by the appropriate rate of inflation from the US Consumer Price Index,18 which was 1.5% from 2012 to 2013, 1.6% from 2013 to 2014, and 0.1% from 2014 to 2015. We then calculated the mean difference in observed vs expected spending per case (by department and group) and reported 95% CIs.
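For concreteness, the expected-cost comparison described above amounts to the following per-case arithmetic (a minimal sketch in R; the function and argument names are ours, and the inflation figures are those cited from the Consumer Price Index18):

# CPI inflation used to project the prior year's departmental mean forward (ref 18)
inflation <- c("2013" = 0.015, "2014" = 0.016, "2015" = 0.001)

# Observed minus expected cost for one case; positive = more costly than expected
obs_vs_exp <- function(case_cost, dept_mean_prior_year, year) {
  expected <- dept_mean_prior_year * (1 + inflation[[as.character(year)]])
  case_cost - expected
}

obs_vs_exp(1500, 1400, 2015)  # a 2015 case costing $1500 in a department averaging $1400 in 2014: 98.6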

We aimed to examine whether changes in case costs were due to more complex cases in one year compared with the other. Therefore, we adjusted all inpatient case costs for each group (intervention vs control) and each surgical department by the CMI.19
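The article does not spell out the adjustment formula; a common convention, and the one assumed in this one-line sketch, is to divide each inpatient case cost by the CMI:

# CMI-adjusted cost per inpatient case (assumed formula: raw supply cost divided by the case mix index)
df$cmi_adjusted_cost <- df$supply_cost / df$cmi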

To confirm that quality of care was not compromised during our intervention, we obtained several outcomes for all UCSF Health hospital’s surgical patients from January 1, 2014, to December 31, 2015. These measures included readmission within 30 days of discharge, mortality within 30 days of discharge, and discharge to a location other than home (eg, skilled nursing facility or rehabilitation center).

Finally, we distributed an anonymous, study-specific survey (online via Qualtrics [https://www.qualtrics.com/]) to surgical attendings after the intervention to assess the efficacy of the OR SCORE study and individual attitudes toward OR costs and health care value (eTable 1 in the Supplement). The survey was completed by 91 of 249 attending surgeons, representing an overall response rate of 36.5% and a response rate of 47.6% (30 of 63) for intervention surgeons who received scorecards.

Statistical Analysis

To determine if the changes in the primary outcome (surgical supply cost) were statistically significant, we built a model to account for various procedure types performed by different surgeons, patient demographics, and clinical indicators. This time series mixed-effects model tested the hypothesis that the trend in surgical supply cost per case decreased significantly in the intervention group in 2015 compared with 2014 after controlling for these factors. We first performed a log transformation of the surgical supply cost per case because of the skewed nature of the data. The main effect of the intervention was captured by the interaction term between intervention group and days after intervention, allowing for the change in cost associated with the intervention period in 2015 in this time series analysis. Logarithmic estimates were converted back into percentage changes in cost. To account for the differential effect of departments and the fact that surgeons tend to perform different types of procedures, we included department and surgeon in the model as random effects. Additional covariates (patient sex, patient age, payer [commercial, Medicaid, Medicare, or other], and American Society of Anesthesiologists [ASA] classification20) were added to the model as fixed effects to control for other potential sources of cost variation between the preintervention and postintervention periods and between the intervention and control groups. The CMI could not be included in this model because it is available for only inpatient cases. Therefore, we included the ASA classification as an indicator of patient severity of illness owing to its known correlation with postoperative resource use and surgical morbidity and mortality.21,22
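A minimal sketch of this primary model follows, using the lme4 package (the article reports R but does not name the mixed-effects package, and the data frame and variable names here are assumptions). The group × days-after-intervention interaction carries the intervention effect on the log-cost scale, and the final lines convert the daily log-scale estimate back to a percentage change over 1 year, mirroring the reported conversion.

library(lme4)

df$log_cost <- log(df$supply_cost)  # log-transform the skewed cost data

fit <- lmer(
  log_cost ~ group * days_after_intervention +   # intervention x time interaction
    sex + age + payer + asa_class +              # fixed-effect covariates
    (1 | department) + (1 | surgeon),            # random intercepts for department and surgeon
  data = df)

# Daily log-scale coefficient for the interaction, converted to a 1-year percentage change
beta_daily <- fixef(fit)["groupintervention:days_after_intervention"]
100 * (exp(beta_daily * 365) - 1)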

For our secondary patient outcomes, we built a time series mixed-effects model similar to the one described above that included only inpatient cases and added the CMI as a fixed effect. We also constructed mixed-effects models to determine if there was a significant difference in any of the 3 patient outcomes (30-day readmission, 30-day mortality, and discharge to a location other than home) after implementing the OR SCORE study in our intervention group in 2015. In these models, we included department and surgeon as random effects and included patient sex, patient age, payer, ASA classification, intervention vs control group, 2015 vs 2014, and the interaction term of group (intervention vs control) × year (2015 vs 2014) as fixed effects. Finally, for the secondary outcome of survey responses, we used 2-sample t tests to compare results between intervention and control surgeons.
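For the binary patient outcomes, a corresponding sketch (again with assumed package and variable names) fits a logistic mixed-effects model with glmer; the exponentiated group × year interaction coefficient corresponds to the kind of odds ratio reported in Table 4.

library(lme4)

# 30-day readmission shown; the same form applies to mortality and non-home discharge
readmit_fit <- glmer(
  readmit_30d ~ group * year_2015 + sex + age + payer + asa_class +
    (1 | department) + (1 | surgeon),
  data = df, family = binomial)

# Odds ratio for the group x year interaction (the intervention effect on 30-day readmission)
exp(fixef(readmit_fit)["groupintervention:year_2015"])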

All statistical analyses were performed using R version 3.1.3 (R Development Core Team) and JMP Pro 12.01 (SAS Institute Inc).

Results
Primary Outcome

The median surgical supply direct costs per case decreased 6.54% in the intervention group, from $1398 (IQR, $316-$5181) (10 637 cases) in 2014 to $1307 (IQR, $319-$5037) (11 820 cases) in 2015. In contrast, the median surgical supply direct cost increased 7.42% in the control group, from $712 (IQR, $202-$1602) (16 441 cases) in 2014 to $765 (IQR, $233-$1719) (17 227 cases) in 2015 (Figure). The time series mixed-effects model showed that this difference was statistically significant after adjusting for surgeon, department, patient demographics, and clinical indicators. The intervention group had a mean savings of 0.03% (95% CI, 0.01%-0.05%) per day, or 9.95% (95% CI, 3.55%-15.93%; P = .003) over 1 year after the initiation of the OR SCORE study compared with the control group (Table 1).

Secondary Outcomes
Total Spending on Surgical Supply Costs

The observed total spending on surgical supply costs was lower than expected for the intervention group in 2015. The intervention group saved $836 147 on surgical supplies in 2015 (−$71; 95% CI, −$195 to $54 per case), whereas the control group’s spending increased by $3 073 647 in 2015 ($178; 95% CI, $126-$231 per case) (Table 2). Looking separately at each department’s total spending, the OHNS and orthopedic surgery intervention departments had much greater savings ($215 173 and $2 140 923, respectively, in 2015) than the only 2 control departments with savings (obstetrics/gynecology saved $21 367, and ophthalmology saved $137 909 in 2015).

The Figure, B, shows that the intervention group had higher than expected spending in 2013 and 2014 and that the decreased spending occurred specifically in the 2015 intervention year. In contrast, the control group increased its spending significantly in 2015.

CMI-Adjusted Median Surgical Supply Costs

The CMI-adjusted median surgical supply cost per case decreased in the intervention group (−3.95%) and increased in the control group (5.07%) in 2015 vs 2014 (Table 3). Another mixed-effects model showed that this difference was statistically significant after adjusting for surgeon, department, patient demographics, and clinical indicators, including the ASA classification and CMI in this subset of inpatient cases (eTable 2 in the Supplement).

Two intervention departments (OHNS and orthopedic surgery) and 1 control department (ophthalmology) had more than 5% decreases in the CMI-adjusted median surgical supply cost per case (−8.22%, −5.63%, and −34.0%, respectively) (Table 3). As a result, these 3 departments received the financial incentive for achieving their cost reduction targets. Ophthalmology had a particularly large percentage change in the CMI-adjusted cost but a small number of inpatient cases (40 of 2846 total ophthalmology cases in 2015).

Patient Outcomes

After controlling for surgeon, department, patient demographics, and clinical indicators in mixed-effects models, there was no significant difference in 30-day readmission in the intervention vs control groups in 2015 vs 2014 (odds ratio, 1.10; 95% CI, 0.94-1.30; P = .25) (Table 4). Thirty-day mortality (odds ratio, 0.27; 95% CI, 0.11-0.67; P = .005) and discharge to a location other than home (odds ratio, 0.78; 95% CI, 0.70-0.88; P < .001) were significantly improved in the intervention group vs the control group in 2015 vs 2014.

Survey Findings

Thirty attending surgeons who completed the postintervention survey reported receiving scorecards (response rate, 47.6% [30 of 63]). Twenty-five of 29 (86.2%) surgeons who received scorecards and answered this survey question stated that they always, often, or sometimes looked at their cost scorecards, and 22 of 29 (75.9%) reported that they always, often, or sometimes used the scorecard data to influence OR surgical supply use (eFigure 2 in the Supplement).

On a 5-point Likert-type scale, surgeons who received scorecards and those who did not had similar responses on general statements about health care value: “I believe that surgeons have the capacity to help control OR costs” (mean [SD] score, 4.47 [0.83] vs 4.30 [0.78]; P = .32), and “I am partially responsible to help control OR costs” (mean [SD] score, 4.37 [0.72] vs 4.50 [0.50]; P = .21) (eFigure 3 and eTable 3 in the Supplement). However, surgeons who received cost feedback had significantly higher scores on questions that addressed knowledge about cost reduction: “I know how much my procedures cost in comparison to my peers” (mean [SD] score, 3.33 [0.99] vs 2.31 [1.19]; P < .001), and “I know which items contribute the most to high cost” (mean [SD] score, 3.83 [1.02] vs 2.63 [1.07]; P < .001).

Twenty-three surgeons (76.7% [23 of 30]) who received scorecards and completed the survey stated that they strongly agreed or agreed with the statement that “The OR SCORE project helped me learn more about cost and efficiency in the OR.” Twenty-four surgeons (80.0% [24 of 30]) strongly agreed or agreed with the statement that “The OR SCORE project should be continued.”

Discussion

Our study showed that providing cost scorecards to surgeons during 1 year, combined with a small financial departmental incentive and identification of physician cost-saving champions, was associated with a significant reduction in surgical supply costs compared with surgeons who did not receive scorecards but were still eligible for the same financial incentive. There was a 6.54% decrease in the median surgical supply direct costs per case in the intervention group compared with a 7.42% increase in the control group, a trend that persisted after adjusting for the CMI in inpatient cases. Although the percentage decrease was small, it led to substantial savings of $836 147 in 2015 in our intervention group. Savings were significant even after controlling for surgeon, department, patient demographics, and clinical indicators in a time series mixed-effects model. Three basic patient outcomes were equivalent or improved after the intervention, suggesting that our intervention did not negatively influence the quality of patient care.

Our intervention effect magnitude is similar to that reported in several smaller studies. One group found that cost feedback to a single urologist led to a 17% cost decrease for robot-assisted partial nephrectomies and laparoscopic donor nephrectomies,14 and another group reported that giving cost report cards to 4 general surgeons resulted in a 10% cost decrease in gastric bypass procedures.17 A third study16 described a 10% cost decrease for laparoscopic cholecystectomies after educating 15 surgeons about the cost of disposable supplies.

To our knowledge, this study represents the largest and only controlled study of cost feedback that targets surgeons across several specialties and a broad range of procedures. Like other studied approaches,17 our scorecards feed into physicians’ competitive nature by comparing them with their peers. Our approach empowers individual health care professionals to make their own decisions regarding resource use rather than relying on mandates from administrators or payers. This method also aligns with the growing movement for self-regulation of physicians’ cost of care through national programs like Choosing Wisely.23 In our study-specific survey, 87.7% (50 of 57) of surgeons stated that they are partially responsible to help control OR costs. This percentage is significantly greater than the 36% of practicing physicians who reported in a 2013 survey that they have a “major responsibility” in reducing health care costs.24 This difference may reflect increasing cost consciousness and sense of personal responsibility during the past few years or a difference in survey wording and physician specialty.

Limitations

One study limitation is that we cannot definitively say that the OR SCORE study feedback directly led to cost reduction but rather that there is a significant association between scorecard use and cost reduction in the intervention group that is not seen in the control group. This difference persisted even after we adjusted for surgeon, department, patient demographics, and clinical indicators in our statistical model. We do not know if the scorecards would be effective at reducing costs without a financial incentive because all surgical departments were eligible for the financial incentive. Moreover, we only provided scorecards to attending surgeons, but surgical residents, nurses, and scrub technicians are important members of the OR team who contribute to surgical supply use and might benefit from cost information. It is challenging to give scorecards to these individuals because they are often involved in multiple cases with different attendings and services. However, more direct involvement of trainees and nursing staff might improve the effect of our intervention.

Our results demonstrate that certain departments were more successful than others at reducing costs, an important consideration for others trying to replicate this intervention. Two control departments (obstetrics/gynecology and ophthalmology) decreased their costs without receiving the OR SCORE study feedback, while a single intervention department (neurological surgery) did not reduce its costs. This finding highlights the fact that many factors affect year-to-year surgical costs, as well as that variables like departmental culture and leadership influence the success of cost feedback interventions in nonuniform ways. This consideration is the main reason we included surgical department as a random effect in our statistical models. In addition, small changes in low-cost straightforward procedures, such as the adoption of generic eyedrops in ophthalmological cases, or the discontinuation of an expensive morcellator in gynecological procedures because of US Food and Drug Administration25 safety warnings can have a significant influence on overall cost, as seen in 2 of our control groups. In contrast, cost containment may be increasingly challenging for departments like neurological surgery that perform more diverse and complex procedures in which a greater number of changes are needed to see a significant cost reduction.

Conclusions

The prospective controlled OR SCORE study showed that cost feedback to surgeons, combined with a small departmental financial incentive, was associated with significantly reduced surgical supply costs. Basic patient outcomes were equivalent or improved after the intervention, and surgeons who received scorecards reported higher levels of cost awareness compared with controls on our study-specific survey.

Article Information

Accepted for Publication: August 21, 2016.

Corresponding Author: Corinna C. Zygourakis, Department of Neurological Surgery, University of California, San Francisco, 505 Parnassus Ave, Room M779, San Francisco, CA 94143 (corinna.zygourakis@ucsf.edu).

Published Online: December 7, 2016. doi:10.1001/jamasurg.2016.4674

Author Contributions: Dr Zygourakis and Ms Valencia had full access to all the data and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Zygourakis, Moriates, Boscardin, Catschegn, Rajkomar, Bozic, Soo Hoo, Goldberg, Pitts, Lawton, Dudley, Gonzales.

Acquisition, analysis, or interpretation of data: Zygourakis, Valencia, Moriates, Catschegn, Rajkomar, Goldberg, Lawton.

Drafting of the manuscript: Zygourakis, Boscardin, Goldberg.

Critical revision of the manuscript for important intellectual content: Zygourakis, Valencia, Moriates, Catschegn, Rajkomar, Bozic, Soo Hoo, Goldberg, Pitts, Lawton, Dudley, Gonzales.

Statistical analysis: Valencia, Rajkomar.

Administrative, technical, or material support: Valencia, Moriates, Catschegn, Bozic, Soo Hoo, Goldberg, Gonzales.

Conflict of Interest Disclosures: Dr Zygourakis reported receiving a travel grant from Globus and Nuvasive to attend a resident education course. Dr Goldberg reported owning stock in Siesta Medical Inc and Apnicure Inc, reported receiving an honorarium from Stryker, and reported having a patent pending for sinus diagnostics and therapeutics. Dr Lawton reported receiving speaker honoraria from Stryker and Depuy and reported receiving consulting payments from Stryker. No other disclosures were reported.

Funding/Support: This project was supported by the Caring Wisely program, a joint effort of the UCSF Center for Healthcare Value and UCSF Health. Dr Zygourakis is supported by a fellowship from the UCSF Center for Healthcare Value.

Role of the Funder/Sponsor: Individuals supported by the Caring Wisely program and the UCSF Center for Healthcare Value were responsible for the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Additional Contributions: The following individuals at the University of California, San Francisco, critically reviewed the manuscript: Catherine L. Chen, MD, MPH (Department of Anesthesia and Perioperative Care), Veronica Yank, MD (Department of Medicine), Grace Lin, MD (Department of Medicine), Colette Dejong, BA (School of Medicine), and Naomi Bardach, MD (Department of Pediatrics). None received compensation for their contributions.

References
1.
National Center for Health Statistics. Health, United States, 2013: with special feature on prescription drugs. Hyattsville, MD: National Center for Health Statistics; 2014:116. http://www.cdc.gov/nchs/data/hus/hus13.pdf. Accessed November 4, 2015.
2.
Zygourakis CC, Oh T, Sun MZ, Barani I, Kahn JG, Parsa AT. Surgery is cost-effective treatment for young patients with vestibular schwannomas: decision tree modeling of surgery, radiation, and observation. Neurosurg Focus. 2014;37(5):E8. doi:10.3171/2014.8.FOCUS14435
3.
Zygourakis CC, Valencia V, Boscardin C, et al. Predictors of variation in neurosurgical supply costs and outcomes across 4,904 surgeries at a single institution [published online September 6, 2016]. World Neurosurg.
4.
Okike K, O’Toole RV, Pollak AN, et al. Survey finds few orthopedic surgeons know the costs of the devices they implant. Health Aff (Millwood). 2014;33(1):103-109.
5.
Crosson FJ. Change the microenvironment: delivery system reform essential to control costs. Mod Healthc. 2009;39(17):20-21.
6.
Sinaiko AD, Chien AT, Rosenthal MB. The role of states in improving price transparency in health care. JAMA Intern Med. 2015;175(6):886-887.
7.
Castlight Health. http://www.castlighthealth.com/. Accessed November 4, 2015.
8.
Healthcare Bluebook. http://www.healthcarebluebook.com/. Accessed November 4, 2015.
9.
CompareMaine. Health costs & quality. http://comparemaine.org/. Accessed November 4, 2015.
11.
Schroeder SA, Myers LP, McPhee SJ, et al. The failure of physician education as a cost containment strategy: report of a prospective controlled trial at a university hospital. JAMA. 1984;252(2):225-230.
12.
Schroeder SA, Kenders K, Cooper JK, Piemme TE. Use of laboratory tests and pharmaceuticals: variation among physicians and effect of cost audit on subsequent use. JAMA. 1973;225(8):969-973.
13.
Tu HT, Lauer JR. Impact of health care price transparency on price variation: the New Hampshire experience. Issue Brief Cent Stud Health Syst Change. 2009;(128):1-4.
14.
Tabib CH, Bahler CD, Hardacker TJ, Ball KM, Sundaram CP. Reducing operating room costs through real-time cost information feedback: a pilot study. J Endourol. 2015;29(8):963-968.
15.
Zygourakis CC, Winkler E, Pitts L, Hannegan L, Franc B, Lawton MT. Clinical utility and cost analysis of routine postoperative head CT in elective aneurysm clippings [published online April 29, 2016]. J Neurosurg.
16.
Gitelis M, Vigneswaran Y, Ujiki MB, et al. Educating surgeons on intraoperative disposable supply costs during laparoscopic cholecystectomy: a regional health system’s experience. Am J Surg. 2015;209(3):488-492.
17.
Gunaratne K, Cleghorn MC, Jackson TD. The Surgeon Cost Report Card: a novel cost-performance feedback tool. JAMA Surg. 2016;151(1):79-80.
18.
US Inflation Calculator. Consumer Price Index data from 1913 to 2016. http://usinflationcalculator.com. Accessed April 10, 2016.
19.
Pettengill J, Vertrees J. Reliability and validity in hospital case-mix measurement. Health Care Financ Rev. 1982;4(2):101-128.
20.
Dripps RD. New classification of physical status. Anesthesiology. 1963;24:111.
21.
Rauh MA, Krackow KA. In-hospital deaths following elective total joint arthroplasty. Orthopedics. 2004;27(4):407-411.
22.
Wolters U, Wolf T, Stützer H, Schröder T. ASA classification and perioperative variables as predictors of postoperative outcome. Br J Anaesth. 1996;77(2):217-222.
23.
Cassel CK, Guest JA. Choosing Wisely: helping physicians and patients make smart decisions about their care. JAMA. 2012;307(17):1801-1802.
24.
Tilburt JC, Wynia MK, Sheeler RD, et al. Views of US physicians about controlling health care costs [published corrections appear in JAMA. 2013;310(19):2102 and 2013;310(8):857]. JAMA. 2013;310(4):380-388.
25.
US Food and Drug Administration. Updated: laparoscopic uterine power morcellation in hysterectomy and myomectomy: FDA Safety Communication. Date issued: November 24, 2014. http://www.fda.gov/MedicalDevices/Safety/AlertsandNotices/ucm424443.htm. Accessed May 20, 2016.