Figure. Intervention Timeline With 3 Periods of Data Collection3,4

Three periods of data collection include: (1) preimplementation of the web-based handoff tool, (2) postimplementation, and (3) postimplementation, matched by time of year. Regionalization of general medicine service occurred after data collection period 2. IPASS indicates Illness severity, Patient summary, Action items, Situational awareness, Synthesis by receiver; and NF, nightfloat.

Table. Adjusted Effect of Web-Based Handoff Tool and Training of Health Care Professionals on Rates of Medical Errors
1 Comment for this article
Conclusion confounded by Improvement efforts
Eran Bellin | Professor of Clinical Epidemiology and Population Health and Medicine, Albert Einstein College of Medicine, Bronx, NY
Dr Mueller and colleagues have undertaken important and meaningful interventions to reduce errors in transitions of care. Unfortunately, their conclusions are not supported by their data.
They describe three groups:
1. Pre-web intervention.
2. Post-web intervention, unmatched for time of year (more experienced house staff in period 2).
3. Post-web intervention, matched for time of year but contaminated by an aggressive effort to institute geographic consolidation of a service's patients.
Their comparisons demonstrate that errors were reduced between periods 1 and 3 but not between periods 1 and 2. Further comparisons between periods 2 and 3 showed dramatic improvement, suggesting that either the geographic intervention alone or its synergistic effect with the web intervention was responsible for the dramatic reduction in errors.

All other comparisons are made between period 1 and periods 2 and 3 combined, making it impossible to isolate the web intervention.

It is not unusual in systems committed to ongoing improvement for more than one intervention to accumulate over the course of a study. There is no shame in this; quite the contrary. But the inference drawn, both in the paper and undoubtedly in its more public reporting, that the authors have demonstrated the utility of the web-based transition-of-care tool is unsupported.

I would, of course, personally advocate for the use of such tools, as they have obvious face validity.

Another point that requires clarification is the authors' statement that, while the medical service did show improvement in errors overall, the surgical program did not. I am not clear how to interpret the table identifying the surgical service error reduction with a P value of <.01.

Sincerely,
Eran Bellin, M.D.
CONFLICT OF INTEREST: None Reported
Research Letter
September 2016

Association of a Web-Based Handoff Tool With Rates of Medical Errors

Author Affiliations
  • 1Division of General Internal Medicine, Brigham and Women’s Hospital, Boston, Massachusetts
  • 2Harvard Medical School, Boston, Massachusetts
JAMA Intern Med. 2016;176(9):1400-1402. doi:10.1001/jamainternmed.2016.4258

Communication among health care personnel is vulnerable to error during patient handoffs (ie, the transfer of responsibility for patient care between health care professionals). Handoffs occur with high frequency in the hospital and have been increasing following restrictions of resident work hours.1 However, to our knowledge, there remains a lack of rigorously performed studies that help guide best practices in handoffs of hospitalized adult patients. In this study, we implemented a web-based handoff tool and training for health care professionals, and evaluated the association of the tool with rates of medical errors in adult medical and surgical patients.

Methods

We conducted a prospective cohort analysis from November 1, 2012, to February 1, 2014, of 5407 patients on 3 general medicine services and 2 general surgery services at Brigham and Women’s Hospital during 1 data collection period before implementation of a web-based handoff tool and 2 periods after implementation.2 Between periods 2 and 3, general medicine services (but not surgical services) underwent restructuring to regionalized care teams (Figure).3,4

To screen for potential errors, validated surveillance surveys3 were administered to “nightfloat” (working 12 am to 7 am) and “twilight” (working 4 pm to 12 am) residents on completion of their shifts, and to residents and attending physicians 2 days after starting on the general medical or surgical service, querying for potential errors, followed by targeted review of medical records. All incidents were rated on presence of errors and level of harm using the National Coordinating Council for Medication Error Reporting and Prevention scale5 and on attribution to failures in communication and handoff. Incidents with harm (adverse events) were additionally rated on preventability.3 All ratings were adjudicated by a physician who was unaware of the time period; discrepancies in ratings prompted review of medical records, with final determination by the adjudicator (S.K.M.). The study was approved by the Partners Healthcare Human Subjects Review Committee. The need for patient consent was waived by the institutional review board as this was a hospital-wide quality improvement initiative with additional focused teamwork and tool training on the intervention units.
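
For readers unfamiliar with the NCC MERP index used here, the following is a minimal sketch of how a reviewed incident could be encoded for this kind of adjudication. The index letters (A through I) are standard, but the field names and classification helpers are illustrative assumptions, not the study's actual rating instrument.

```python
# Illustrative schema only: how a reviewed incident might be encoded.
# Categories E-I on the NCC MERP index denote errors that caused harm.
from dataclasses import dataclass
from typing import Optional

HARM_CATEGORIES = {"E", "F", "G", "H", "I"}

@dataclass
class Incident:
    nccmerp_category: str        # "A" through "I" (hypothetical field name)
    communication_failure: bool  # attributed to a failure in communication?
    handoff_related: bool        # attributed to a mistake in handoff?
    preventable: Optional[bool]  # rated only for incidents with harm

    @property
    def is_error(self) -> bool:
        # Category A is a circumstance with capacity to cause error, not an error.
        return self.nccmerp_category != "A"

    @property
    def is_adverse_event(self) -> bool:
        # Incidents with harm (adverse events) were additionally rated on preventability.
        return self.nccmerp_category in HARM_CATEGORIES
```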

Patient characteristics were compared using χ2 or t tests. All outcomes were converted to errors per 100 patient-days (error rates), which were compared in period 1 vs 2 and 3 using multivariable Poisson regression (SAS, version 9.3; SAS Institute), clustering by role and adjusting for covariates.
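
The letter reports the analysis in SAS, version 9.3; as a rough sketch of the same modeling approach in Python, a Poisson regression with a log(patient-days) offset and cluster-robust standard errors by role could be specified as below. The data set, column names, and covariates are hypothetical placeholders, not the authors' analytic variables.

```python
# Sketch only (not the authors' SAS code): Poisson model of error counts with a
# log(patient-days) offset, cluster-robust standard errors by rater role, and
# adjustment for covariates. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("handoff_incidents.csv")  # hypothetical analytic data set

# post = 1 for periods 2 and 3, 0 for period 1; service and other covariates assumed
model = smf.glm(
    "errors ~ post + service + age + comorbidity_score",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit(cov_type="cluster", cov_kwds={"groups": df["role"]})

# Exponentiated coefficients are rate ratios; rescaling rates to
# "per 100 patient-days" changes only the intercept, not the ratios.
print(np.exp(model.params))
print(model.summary())
```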

Results

Among the 5407 patients, 77 medical errors were detected before the intervention vs 45 after the intervention. Primary and secondary outcomes (Table) are notable for significant reductions in total medical error rates per 100 patient-days (period 1 rate, 3.56; 95% CI, 1.70-7.44; period 2 and 3 rate, 1.76; 95% CI, 0.93-3.31; P < .001), errors owing to failures in communication (period 1 rate, 2.88; 95% CI, 1.22-6.82; period 2 and 3 rate, 1.15; 95% CI, 0.76-1.74; P < .001), errors owing to mistakes in handoffs (period 1 rate, 2.47; 95% CI, 1.00-6.07; period 2 and 3 rate, 0.95; 95% CI, 0.56-1.61; P < .001), errors from end-of-shift (but not end-of-rotation) handoffs (period 1 rate, 6.93; 95% CI, 5.36-8.76; period 2 and 3 rate, 3.59; 95% CI, 2.55-4.87; P = .001), and errors on both medical (period 1 rate, 3.18; 95% CI, 2.45-4.05; period 2 and 3 rate, 1.30; 95% CI, 0.85-1.87; P < .001) and surgical (period 1 rate, 13.11; 95% CI, 7.69-20.63; period 2 and 3 rate, 5.45; 95% CI, 3.40-8.20; P < .001) services. Total error rates were also significantly reduced on the medical services in period 1 vs period 3 (incident rate ratio, 0.47; 95% CI, 0.33-0.66) and in period 2 vs period 3 (incident rate ratio, 0.40; 95% CI, 0.17-0.96), but not on the surgical services.
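
As a back-of-the-envelope check of the headline result, the crude ratio of the reported overall rates implies roughly a 50% relative reduction; the adjusted estimates (incident rate ratios) differ somewhat because they account for covariates and clustering.

```python
# Crude, unadjusted check using the reported overall rates per 100 patient-days.
pre, post = 3.56, 1.76
crude_rate_ratio = post / pre              # ~0.49
relative_reduction = 1 - crude_rate_ratio  # ~0.51, i.e., about a 51% reduction
print(round(crude_rate_ratio, 2), round(relative_reduction, 2))
```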

Discussion

We found that implementation of a web-based handoff tool and training for health care professionals was associated with a significant reduction in rates of medical errors, driven largely by a reduction in errors attributable to communication failure and errors that occurred during end-of-shift handoffs. It is possible that the tool was more effective for end-of-shift than for end-of-rotation handoffs, although it is also plausible that our study was underpowered to examine end-of-rotation handoffs, a possibility supported by the trend toward reduced errors observed in that subgroup.

More important, the reduction in rates of medical errors remained significant in the time-matched analysis (period 1 vs period 3), accounting for potential effects of resident experience. In addition, we saw a stepwise reduction in rates of errors on general medicine services, suggesting that regionalization between periods 2 and 3 had an additive or synergistic effect, an interpretation supported by the fact that this reduction was not replicated on surgical services. As noted in the Figure, regionalization included dedicated time for handoffs. These results add to the existing literature, which has focused mainly on the connection between poor-quality handoffs and medical errors6 or on evaluating the effects of interventions in limited patient populations with variable use of information technology tools.3

Our findings are subject to several limitations. As this was a single-site study, our findings may not be generalizable to other institutions. However, the components of the handoff tool are easily adaptable to other sites,2 including those that use vendor electronic health records. In addition, we are not able to separate the effect of the handoff tool from that of training for health care professionals.

Conclusions

Our findings suggest that implementation of a web-based handoff tool and training for health care professionals is associated with fewer medical errors, particularly those owing to communication failures. In addition, our intervention appeared synergistic (or additive) with concurrent care team regionalization, suggesting effectiveness in a real-world context.

Article Information

Corresponding Author: Stephanie Mueller, MD, MPH, Division of General Internal Medicine, Brigham and Women’s Hospital, 1620 Tremont St, Roxbury, MA 02120 (smueller1@partners.org).

Published Online: August 1, 2016. doi:10.1001/jamainternmed.2016.4258.

Author Contributions: Dr Mueller had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Mueller and Schnipper.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Mueller.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Mueller, Yoon.

Administrative, technical, or material support: Mueller.

Study supervision: Schnipper.

Conflict of Interest Disclosures: Dr Schnipper reported receiving grant funding from Sanofi Aventis for an investigator-initiated study to design and evaluate an intensive discharge and follow-up intervention in patients with diabetes. No other disclosures were reported.

Funding/Support: This research was supported by funds within the Department of Medicine, Brigham and Women’s Hospital.

Role of the Funder/Sponsor: The funding source had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1.
Accreditation Council for Graduate Medical Education. Duty hours. http://www.acgme.org/What-We-Do/Accreditation/Duty-Hours. Accessed May 10, 2016.
2.
Schnipper  JL, Karson  A, Morash  S,  et al. Design and evaluation of a multi-disciplinary web-based handoff tool. Presented at: Society of General Internal Medicine 35th Annual Meeting; May 12, 2012; Orlando, Florida.
3.
Starmer AJ, Spector ND, Srivastava R, et al; I-PASS Study Group. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803-1812.
4.
Agency for Healthcare Research and Quality, US Dept of Health & Human Services. TeamSTEPPS: strategies and tools to enhance performance and patient safety. http://www.ahrq.gov/professionals/education/curriculum-tools/teamstepps/index.html. Updated May 2016. Accessed April 12, 2016.
5.
National Coordinating Council for Medication Error Reporting and Prevention. Types of medication errors. http://www.nccmerp.org/types-medication-errors. Accessed May 3, 2016.
6.
Riesenberg LA, Leitzsch J, Massucci JL, et al. Residents’ and attending physicians’ handoffs: a systematic review of the literature. Acad Med. 2009;84(12):1775-1787.