Importance
Handoff miscommunications are a leading cause of medical errors. Studies comprehensively assessing handoff improvement programs are lacking.
Objective
To determine whether introduction of a multifaceted handoff program was associated with reduced rates of medical errors and preventable adverse events, fewer omissions of key data in written handoffs, improved verbal handoffs, and changes in resident-physician workflow.
Design, Setting, and Participants
Prospective intervention study of 1255 patient admissions (642 before and 613 after the intervention) involving 84 resident physicians (42 before and 42 after the intervention) from July-September 2009 and November 2009-January 2010 on 2 inpatient units at Boston Children’s Hospital.
Intervention
Resident handoff bundle, consisting of standardized communication and handoff training, a verbal mnemonic, and a new team handoff structure. On one unit, a computerized handoff tool linked to the electronic medical record was introduced.
Main Outcomes and Measures
The primary outcomes were the rates of medical errors and preventable adverse events measured by daily systematic surveillance. The secondary outcomes were omissions in the printed handoff document and resident time-motion activity.
Results
Medical errors decreased from 33.8 per 100 admissions (95% CI, 27.3-40.3) to 18.3 per 100 admissions (95% CI, 14.7-21.9; P < .001), and preventable adverse events decreased from 3.3 per 100 admissions (95% CI, 1.7-4.8) to 1.5 per 100 admissions (95% CI, 0.51-2.4; P = .04) following the intervention. There were fewer omissions of key handoff elements on printed handoff documents, especially on the unit that received the computerized handoff tool (significant reductions of omissions in 11 of 14 categories with the computerized tool; significant reductions in 2 of 14 categories without it). Physicians spent a greater percentage of time in a 24-hour period at the patient bedside after the intervention (8.3% [95% CI, 7.1%-9.8%] before vs 10.6% [95% CI, 9.2%-12.2%] after; P = .03). The average duration of verbal handoffs per patient did not change. Verbal handoffs were more likely to occur in a quiet location (33.3%; 95% CI, 14.5%-52.2% vs 67.9%; 95% CI, 50.6%-85.2%; P = .03) and a private location (50.0%; 95% CI, 30%-70% vs 85.7%; 95% CI, 72.8%-98.7%; P = .007) after the intervention.
Conclusions and Relevance
Implementation of a handoff bundle was associated with a significant reduction in medical errors and preventable adverse events among hospitalized children. Improvements in verbal and written handoff processes occurred, and resident workflow did not change adversely.
Communication errors are a leading cause of sentinel events (unexpected occurrences involving death or serious physical injury, or the risk thereof).1 The Agency for Healthcare Research and Quality (AHRQ) and the Accreditation Council for Graduate Medical Education (ACGME) have identified improving handoffs as a priority in US nationwide efforts to improve patient safety.2,3 The ACGME now requires residency programs to provide formal instruction in handoffs.3 Despite these new requirements and the increasing frequency of handoffs as a result of reductions in resident-physician work hours, many institutions do not have robust procedures for training residents or ensuring high-quality handoffs.4-8
Studies that have evaluated handoffs raise substantial concerns about their process and content.4,7,9 Suggested strategies to improve handoffs and reduce medical errors include communication training,10,11 the use of mnemonics to standardize handoffs,12 the restructuring of verbal handoffs by minimizing interruptions and involving all team members,13 and the use of written or computerized tools.14
We sought to combine these handoff interventions into a resident handoff bundle, introduced on 2 general pediatrics services. We assessed the association of this intervention with changes in medical error rates, miscommunications, and resident workflow.
Following approval by the Boston Children’s Hospital institutional review board, we conducted a prospective intervention study on 2 general inpatient pediatric units. Preintervention data were collected from July through September 2009. All components of the resident handoff bundle were implemented during October 2009, and postintervention data were collected from November 2009 through January 2010. Resident participants included interns (postgraduate year 1 residents) and senior residents (postgraduate year 3 residents). All residents, regardless of study participation or year of training, received training in handoff practices and were asked to use the new handoff structures and resident handoff bundle components when working on the study units. Direct observational and survey data were collected from those who provided written informed consent; residents were provided small incentives (cookies, gift cards) for providing data.
The 2 study units were staffed by day and night teams working shifts of 12 to 14 hours. Three interns and 1 senior resident covered each unit during the day; at night, 1 intern covered each unit, supervised by a senior resident who covered both units. All day interns rotated for 1 week (Sunday through Thursday) during their month-long ward rotation as part of the night team. Work schedules were the same in the before and after intervention periods. Admissions on unit 1 included general pediatric and subspecialty patients; unit 2 included general pediatric and complex care service patients.
During the baseline period, interns and residents completed verbal handoffs separately (ie, intern to intern and resident to resident); residents were not routinely present for intern handoffs, nor were interns present for senior resident handoffs. There was no team-based approach, standardized structure, or dedicated physical environment for handoffs. A printed handoff document created using a word-processing program was exchanged during handoffs. The document template included patient name, medical record number, admission date, weight, allergies, synopsis of the admission history, and plan with “to do” tasks listed according to organ system. This document was not integrated with the electronic medical record (Cerner Powerchart).
The resident handoff bundle was an educational and systems-based intervention introduced to interns and senior residents. It consisted of the following components: (1) a 2-hour communication training session that introduced elements of a program developed by AHRQ and the US Department of Defense (TeamSTEPPS: Team Strategies and Tools to Enhance Performance and Patient Safety)11 and included interactive discussion regarding best practices for verbal and written handoffs15,16; (2) the introduction of the SIGNOUT? mnemonic16 to standardize verbal handoffs; and (3) the restructuring of verbal handoffs to include integration of interns’ and senior residents’ separate handoffs into a unified team handoff, relocation of handoffs to a private and quiet space, and introduction of periodic handoff oversight by a chief resident or attending physician (minimum of 1 observation per resident per month). In addition, for unit 1 only, a computerized handoff tool was created that was integrated into the electronic medical record. The computerized tool automatically imported patient name, sex, age, weight, medical record number, location, admission date, diagnosis, allergies, medications, intravenous access, code status, laboratory results, vital signs, and problem list, with the goal of reducing inaccurate and out-of-date information.17 It also contained structured fields entitled “Patient Summary,” “To Do List,” and “Contingency Planning” to prompt entry of key handoff information in a free-text format. During the postintervention period, unit 2 continued to use the baseline word-processing handoff tool.
Medical Errors and Adverse Events
We applied standard definitions of medical errors as preventable failures in processes of care and of adverse events as preventable and nonpreventable unintended consequences of medical care that lead to patient harm, using a well-established surveillance process.18-22 Two research nurses reviewed all medical records and orders on the study units 5 days a week, with Monday reviews covering the weekend; collected solicited daily error reports from clinicians, including a daily survey of overnight residents; and reviewed formal incident reports. Each suspected incident was reviewed by 2 physician investigators blinded to the unit and time period (before vs after intervention) in which the incident occurred. Investigators independently classified each incident as an adverse event, nonintercepted potential adverse event (nonintercepted near miss), intercepted potential adverse event (intercepted near miss), error with little potential for harm, or exclusion (a suspected incident reported by a research nurse that physician reviewers believed did not meet medical error or adverse event criteria).20,21 Definitions of error and adverse event subtypes and representative examples are provided in eAppendix A (in the Supplement). Severity was rated using the modified National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP) Index for Categorizing Errors23 (eAppendix B in the Supplement). The preventability of adverse events was rated using a 4-point Likert scale (definitely preventable, probably preventable, probably not preventable, or definitely not preventable), which was dichotomized into preventable vs nonpreventable before analysis. Any disagreements in ratings were resolved by discussion between the reviewers. Prediscussion interrater agreement and κ scores were calculated. The primary outcome was a comparison of the rate of medical errors per 100 admissions before vs after the intervention.
Rates of preventable adverse events and unit-specific changes in rates of medical errors were also determined.20,24
Assessment of Written Handoffs
We compared rates of omissions in a random sample of written handoffs. Each patient entry within the handoff document was reviewed by a single physician investigator (A.J.S.) for the presence or absence of 14 data elements identified through consensus of study coinvestigators and a review of the handoff literature.
Observation of Resident Workflow Patterns and Verbal Handoffs
We conducted time-motion observation sessions in 8- to 12-hour blocks to assess the verbal handoff environment and to quantify time spent by residents in handoffs, direct patient care, and all other activities. Research assistants were scheduled to collect a representative ratio of hours from all 24 hours of the day and from weekday vs weekend hours. During scheduled observations, research assistants followed a single intern or resident and recorded start and stop times for activities in a tablet-based Microsoft Access time-motion database. Activities were categorized according to a previously described physician task list25 that was modified for the pediatric inpatient setting. During evening verbal handoffs, additional situational data were collected, including handoff duration, number of interruptions, privacy, and ambient noise.
Patient, intern, and senior resident participant demographic characteristics were described using proportions for dichotomous variables and means for continuous variables. Within each study unit, demographic characteristics were compared before and after the intervention using the Pearson χ2 test for dichotomous variables and the Wilcoxon rank sum (2-sample) test for continuous variables. For both units combined, demographic characteristics for patients were compared using the Cochran-Mantel-Haenszel test for dichotomous variables and a stratified Wilcoxon test for continuous variables.
Error rates (per 100 admissions) were compared using Poisson regression, with a dichotomous covariate for before vs after the intervention period. No other covariates were included because patient demographic characteristics were comparable for both study periods. Within each unit, we fit a Poisson regression model with a dichotomous covariate for before vs after the intervention period. To compare the percentage of written handoff documents with inclusion of a key data element, a generalized estimating equation z test was used, accounting for clustering by the date the handoff document was created.26,27
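For a single dichotomous before/after covariate, a Poisson regression with a log(admissions) offset reduces to a closed-form rate ratio with a Wald test on its log. The sketch below illustrates the calculation; the counts are back-calculated from the reported rates for illustration and are an assumption, not the study's raw data:

```python
import math

# Illustrative counts reconstructed from the reported rates
# (~33.8 and ~18.3 errors per 100 admissions); assumed, not actual data.
errors_before, admissions_before = 217, 642
errors_after, admissions_after = 112, 613

# With one dichotomous covariate, the Poisson-regression MLE for the
# intervention effect is simply the ratio of the observed rates.
rate_ratio = (errors_after / admissions_after) / (errors_before / admissions_before)

# Wald test on the log rate ratio; var(log RR) = 1/x1 + 1/x2 for Poisson counts.
se = math.sqrt(1 / errors_before + 1 / errors_after)
z = math.log(rate_ratio) / se
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(round(rate_ratio, 2))  # ~0.54: the error rate roughly halved
```

With these reconstructed counts the two-sided P value falls well below .001, consistent with the reported result.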
To compare the percentage of time spent in a given activity, a generalized estimating equation z test was used, accounting for clustering by observation session. This estimating equation was based on a Dirichlet distribution, which models the proportion of a continuous quantity (in this case, time) allocated to each category.
Verbal handoff characteristics were compared in a manner analogous to that for the comparison of demographic variables.
Two-sided P values <.05 were considered statistically significant. All analyses were completed using SAS/STAT version 9.2 (SAS Institute Inc).
The study was powered to address the primary outcome of rates of medical errors. Prior studies using the same surveillance methods detected approximately 55 errors (including serious and minor errors) per 100 admissions.20 We therefore anticipated that a sample size of 648 admissions (324 before and 324 after the intervention) would be required to have 80% power to detect a 20% reduction in the rate of total medical errors, assuming a 2-sided α error of .05. We thus expected that 3 months of data collection on each unit would be sufficient to detect a change in medical error rates on each team.
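A figure close to the planned 324 admissions per period can be reproduced with the variance-stabilizing (square-root) approximation for comparing two Poisson rates; the authors do not state which formula they used, so this is only one plausible reconstruction:

```python
import math

# Two-sided alpha = .05 and 80% power
z_alpha = 1.96    # normal quantile for alpha/2 = .025
z_beta = 0.8416   # normal quantile for power = .80

rate_before = 0.55               # ~55 errors per 100 admissions (prior studies)
rate_after = 0.80 * rate_before  # a 20% reduction

# The square-root transform stabilizes the Poisson variance at 1/4,
# giving the admissions needed per study period:
n = (z_alpha + z_beta) ** 2 / (
    4 * (math.sqrt(rate_before) - math.sqrt(rate_after)) ** 2
)
print(math.ceil(n))  # ~321, in line with the planned 324 per period
```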
A total of 84 residents participated. Written informed consent was provided by 95.5% of eligible residents (21 first-year interns, 21 third-year residents) before the intervention vs 97.7% (20 first-year interns, 22 third-year residents) after the intervention (P = .88). There were no significant differences in the demographic variables of residents participating in the study before vs after the intervention.
A total of 1255 patient admissions (n = 642 preintervention; n = 613 postintervention) were reviewed for the presence of medical errors. On the 2 study units combined, as well as on each individual unit, patient demographics were similar before vs after the intervention, as were length of stay and severity of illness. There were statistically significant differences in patient age, illness severity, and length of stay between unit 1 and unit 2 (Table 1).
Medical Errors and Adverse Events
Comparing preintervention and postintervention periods, implementation of the resident handoff bundle was associated with a reduction in overall medical error rates from 33.8 (95% CI, 27.3-40.3) to 18.3 (95% CI, 14.7-21.9) per 100 admissions (P < .001) on both units combined. Preventable adverse events decreased from 3.3 (95% CI, 1.7-4.8) to 1.5 (95% CI, 0.51-2.4) per 100 admissions (P = .04), nonintercepted potential adverse events decreased from 7.3 (95% CI, 5.0-9.6) to 3.3 (95% CI, 1.85-4.7) per 100 admissions (P = .002), intercepted potential adverse events decreased from 15.0 (95% CI, 11.2-18.7) to 8.3 (95% CI, 6.0-10.7) per 100 admissions (P < .001), and errors with little or no potential for harm decreased from 8.3 (95% CI, 5.4-11.1) to 5.2 (95% CI, 3.3-7.2) per 100 admissions (P = .04) (Table 2). There was no significant change in rates of nonpreventable adverse events (11 nonpreventable adverse events in the preintervention group and 10 in the postintervention group, or 1.7 [95% CI, 0.7-2.7] vs 1.6 [95% CI, 0.5-2.7] per 100 admissions, P = .91). The 350 errors and adverse events observed represented incidents related to medications (77%), procedures (8.3%), tests (4.3%), other therapies (3.4%), falls (3.7%), and other incidents (3.2%).
In categorizing detected incidents, physician reviewers had moderate preconsensus agreement regarding categorization of incidents as adverse events, intercepted potential adverse events, nonintercepted potential adverse events, medical errors with little potential for harm, or exclusions (65.2% agreement, κ = 0.53). Categorizations of incidents as harmful or not harmful were highly reliable (97.9% agreement, κ = 0.91). Raters had moderate preconsensus agreement regarding adverse event preventability (71.9% agreement, κ = 0.40).
Improvements in medical error rates in the postintervention period occurred both in unit 1, which received the computerized tool in addition to the resident handoff bundle (27.5 [95% CI, 19.0-36.1] vs 16.5 [95% CI, 12.1-20.9] per 100 admissions, P = .002), and in unit 2 (41.2 [95% CI, 31.7-52.1] vs 21.5 [95% CI, 14.5-26.9] per 100 admissions, P < .001; Table 2).
Written Handoff Documentation
Forty written handoff documents representing 729 unique patient entries were reviewed for the presence of 14 data elements (Figure). For both units combined, implementation of the resident handoff bundle was associated with significant reductions in omissions of key data. More improvements were seen on unit 1, where the intervention included the computerized handoff tool (significant reductions in omissions in 11 categories), than on unit 2, where residents continued using the word-processing handoff tool from the preintervention period (significant reductions in 2 categories).
Observation of Resident Workflow and Verbal Handoff Communications
A total of 795 hours of time-motion data were collected (322 hours in the preintervention group; 473 hours in the postintervention group, Table 3). For both units combined, the percentage of time in a 24-hour period spent in contact with patients and families significantly increased in the postintervention group (8.3% [95% CI, 7.1%-9.8%] vs 10.6% [95% CI, 9.2%-12.2%], P = .03). There was no significant change in overall time spent at the computer (24.2% [95% CI, 21.3%-27.35%] vs 23.2% [95% CI, 20.3%-26.3%], P = .64) or in time spent creating or editing the computerized handoff document (2.7% [95% CI, 1.6%-4.4%] vs 2.3% [95% CI, 1.6%-3.2%], P = .64), but the amount of time writing on printed copies of the handoff document decreased significantly (1.4% [95% CI, 1.1%-1.8%] vs 0.7% [95% CI, 0.6%-1.0%], P = .002).
The mean number of minutes spent on verbal handoff sessions did not change significantly following implementation (32.3 [95% CI, 25.3-39.3] vs 33.2 [95% CI, 28.5-37.9], P = .42). There was no significant change in the mean number of interruptions per handoff session (3.2 [95% CI, 2.1-4.3] vs 2.1 [95% CI, 1.4-2.7], P = .22), although the mean number of interruptions per patient decreased during the postintervention period (0.25 [95% CI, 0.15-0.36] vs 0.13 [95% CI, 0.08-0.18] interruptions per patient, P = .02), and verbal handoffs were more likely to occur in a quiet location (33.3% [95% CI, 14.5%-52.2%] vs 67.9% [95% CI, 50.6%-85.2%], P = .02) and a private location (50.0% [95% CI, 30%-70%] vs 85.7% [95% CI, 72.8%-98.7%], P = .004; Table 4).
We found that implementation of a resident handoff bundle was associated with a significant reduction in medical errors and preventable adverse events. Written handoffs were more comprehensive after the intervention, and verbal handoffs were more likely to occur in a quiet, private location. Implementation of the intervention was not associated with adverse effects on resident workflow: time spent on verbal handoffs did not change, time spent at the computer did not increase, and residents spent more time in the postintervention period in direct contact with patients. As expected, rates of nonpreventable adverse events did not change.
Although medical errors occurred frequently, these rates include very minor as well as serious errors and are commensurate with the rates found in numerous other studies using the same intensive surveillance methods.17,20,21,28,29 Prior research on the relationship between handoffs and patient safety has been limited. Although studies have found poor handoffs to be associated with higher rates of self-reported medical errors and adverse events, these data have not been substantiated objectively. A survey by Horwitz et al30 found that 29% of emergency medicine physicians and internists reported an adverse event or near miss as a result of a poor handoff. A similar survey by Kitch et al31 found that 59% of medicine and surgery residents reported a patient being harmed as a result of a poor handoff. Petersen et al32 found that a disproportionate number of adverse events were reported through a voluntary incident report system when patients were cross-covered by a resident from outside of the primary team. Our study, which used an objective, comprehensive surveillance method, builds on this work, demonstrating an objective relationship between poor handoffs, errors, and preventable adverse events.
Studies reporting the effects of handoff interventions on patient safety are also quite limited. In a review, Cohen et al4 noted that although the published literature has shown that handoff interventions can improve care processes, it has yet to establish that attempts at handoff standardization are associated with improvements in measured patient outcomes. For example, Van Eaton et al33 found that improved clinician workflow patterns followed introduction of a computerized handoff tool, but the study was not designed to evaluate the effect on patient outcomes.
To maximize the chance that the handoff intervention would lead to measurable improvements in care, we chose to bundle together several evidence-based interventions. The bundling of multiple interventions has been an effective means of reducing surgical complications,34 catheter-related bloodstream infections,35 and ventilator-associated pneumonias.36 We found that error rates, care processes, and workflow improved following introduction of the resident handoff bundle. A limitation of this approach, however, is that it prevents us from directly associating most observed changes with particular elements of the bundle; it remains unclear which elements of the bundle are most important or whether all elements are needed together. Additionally, because there were multiple interrelated aspects of the bundled intervention, comparisons of written and oral process measures in addition to medical error rates were believed to be important. To avoid obscuring potential relationships between these process measures and the primary outcome (type II error), we considered P < .05 significant for both the main outcome and these multiple process measures. However, evaluations of the relationships between these processes and outcomes should be considered exploratory.
Our study has several additional limitations. First, we studied 2 inpatient units in a single pediatric hospital. Although the resident schedules and baseline handoff practices we evaluated are common, it is unclear how generalizable our findings may be to other settings. Second, because of the observational design, causality cannot be established. In particular, our study design has the potential for confounding because the preintervention data were collected during the summer and fall, and postintervention data were collected during the subsequent winter. Therefore, increased resident experience over time, differences in patient populations, or other ongoing patient safety interventions might have contributed to reductions in overall error rates.
However, most studies of what is sometimes termed the July effect have found either that it does not exist or that it is small in magnitude. In a systematic review, Young et al37 found that 55% of higher-quality studies showed no effect; among the 45% of higher-quality studies that showed a relationship between mortality and the July effect, effect sizes ranged from an increase of 4.3% to 12%. Likewise, 17 of 23 studies of morbidity or medical errors and a possible July effect found no effect. Additionally, a study by Landrigan et al28 using systematic error surveillance methods over a 6-year period in 10 hospitals demonstrated stable rates of patient harm over time. Therefore, we believe it likely that our intervention played a significant role in the safety improvements observed. Larger-scale, multicenter studies will be needed to investigate these findings more thoroughly and to quantify the degree of change attributable to handoff interventions.
Third, on the unit that introduced a computerized handoff tool, many of the decreases in omissions of key handoff data were the result of autoimportation of data (eg, dated laboratory results, medication list) rather than resident-physician behavioral change, although autoimporting did not contribute to changes in some other items (to-do list, contingency plans). Regardless of how the change was accomplished, however, we believe the inclusion of more complete handoff data is valuable.
Fourth, the nurses collecting data and research assistants could not be blinded to the intervention period, an issue commonly encountered in investigating systematic interventions to improve patient safety. We addressed this by training observers—none of whom were study investigators—in a standardized fashion, which included emphasizing the importance of consistent, objective detection of serious errors, regardless of study schedule. Additionally, every suspected event was subsequently reviewed by 2 independent investigators who were blinded to intervention period. Despite these measures, we cannot exclude the possibility that some bias may have resulted from the inability to blind the primary detection process. Although agreement on incident classification and preventability was in line with previous patient safety studies,24 categorizing and definitively assessing the preventability of errors is complex.
Implementation of a handoff bundle was associated with a significant reduction in medical errors and preventable adverse events among hospitalized children. Improvements in verbal and written handoff processes occurred, and resident workflow did not change adversely. Given the increasing frequency of handoffs in hospitals following resident work-hour reductions38 and the high frequency with which miscommunications lead to serious medical errors, disseminating high-quality handoff improvement programs has the potential for benefit. Further work to improve and standardize handoffs across specialties and settings may lead to improvement in the safety of patients in teaching hospitals nationwide.
Corresponding Author: Amy J. Starmer, MD, MPH, Division of General Pediatrics, Department of Medicine, Boston Children's Hospital, Harvard Medical School, 300 Longwood Ave, Boston, MA 02115 (firstname.lastname@example.org).
Author Contributions: Dr Starmer had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Starmer, Sectish, Simon, Keohane, Wassner, Harper, Landrigan.
Acquisition of data: Starmer, Simon, Wassner.
Analysis and interpretation of data: Starmer, Simon, McSweeney, Chung, Yoon, Lipsitz, Landrigan.
Drafting of the manuscript: Starmer, Yoon, Lipsitz.
Critical revision of the manuscript for important intellectual content: Starmer, Sectish, Simon, Keohane, McSweeney, Chung, Lipsitz, Wassner, Harper, Landrigan.
Statistical analysis: Starmer, Yoon, Lipsitz, Landrigan.
Obtained funding: Starmer, Landrigan.
Administrative, technical, or material support: Starmer, Sectish, Simon, Keohane, Wassner, Harper, Landrigan.
Study supervision: Starmer, Sectish, Lipsitz, Landrigan.
Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Dr Landrigan reported receiving monetary awards, honoraria, and travel reimbursement from multiple academic and professional organizations for delivering lectures on sleep deprivation, physician performance, handoffs, and safety. No other authors reported disclosures.
Funding/Support: This study was supported by the Controlled Risk Insurance Company Risk Management Foundation Grant Program as well as by a grant from the Boston Children’s Hospital Program for Patient Safety and Quality Research Grant Program. Dr Starmer was supported by grant T32 HP10018 (a National Research Service Award in Pediatrics) and grant 1K12HS019456-01 from the Oregon Comparative Effectiveness Research K12 Program through the Agency for Healthcare Research and Quality. Dr Landrigan is partially supported by the Child Health Corporation of America for his work as a member of the PRIS Network Executive Council. Support for statistical analyses was provided in part by the Oregon Clinical and Translational Research Institute (OCTRI) and grant UL1TR000128 from the National Center for Advancing Translational Sciences (NCATS) at the National Institutes of Health (NIH).
Role of the Sponsor: The sponsors of the study had no role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.
Additional Contributions: We thank Lauren Steffel, MD (Stanford University, Stanford School of Medicine), Caroline Hodgkins, MPH (University of Michigan), Shannon Cottreau, BSN, RN, CPN (Boston Children’s Hospital), and Lora Bemiss, BSN (Boston Children’s Hospital), for assistance with enrollment and data collection; Matt Wien, BS (Brigham and Women’s Hospital), for assistance with development and programing of study databases; Ellen McGrath, RN (Boston Children’s Hospital), and Mark Berry, MA, CCRC (Boston Children’s Hospital), for assistance with study coordination and intervention deployment; Zunqiu Chen, MS (Oregon Health and Science University), Lindsay McFarlane, BA (Boston Children’s Hospital), Gisele Charron, BS (Boston Children’s Hospital), for statistical support and data analysis; Julie Barenholtz, MSW (Boston Children’s Hospital, New England Research Institutes), for building study databases and data analysis; Robert Kenney, MD (Boston Children’s Hospital), and Gajen Sunthara, MS (Boston Children’s Hospital), for development and programing of the computerized handoff tool; Nancy Spector MD, (St Christopher’s Hospital for Children, Drexel University School of Medicine), Rajendu Srivastava, MD, MPH (University of Utah School of Medicine, Primary Children’s Medical Center), Lisa Tse, BA (Boston Children’s Hospital), and Elizabeth Noble, BA (Boston Children’s Hospital) for careful review of manuscript drafts; and all the resident physicians at Boston Children’s Hospital who agreed to participate in the study. Dr Steffel, Mss Hodgkins, Cottreau, Bemiss, and McFarlane, Mr Chen, and resident study participants received compensation through grant support for their contributions. Messrs Wien, Berry, Sunthara, Mss McGrath, Charron, Barenholtz, and Noble, and Drs Kenny, Spector, and Srivastava did not receive financial compensation for contributions.
et al. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies. Vol 7: Care Coordination. Rockville, MD: Agency for Healthcare Research and Quality; 2007.
Jr; ACGME Duty Hour Task Force. The new recommendations on duty hours from the ACGME Task Force. N Engl J Med. 2010;363(2):e3.
PB. The published literature on handoffs in hospitals: deficiencies identified in an extensive review. Qual Saf Health Care. 2010;19(6):493-497.
R. Educational interventions to improve handover in health care: a systematic review. Med Educ. 2011;45(11):1081-1089.
SJ. Transfers of patient care between house staff on internal medicine wards: a national survey. Arch Intern Med. 2006;166(11):1173-1177.
et al. Residents’ and attending physicians’ hand-offs: a systematic review of the literature. Acad Med. 2009;84(12):1775-1787.
I; Handoff Education and Assessment for Residents (HEAR) Computer Supported Cooperative Workgroup. The patient handoff: a comprehensive curricular blueprint for resident education to improve continuity of care. Acad Med. 2012;87(4):411-418.
DO. Communication failures in patient sign-out and suggestions for improvement: a critical incident analysis. Qual Saf Health Care. 2005;14(6):401-407.
D. The human factor: the critical importance of effective teamwork and communication in providing safe care. Qual Saf Health Care. 2004;13(suppl 1):i85-i90.
JO. Handoff strategies in settings with high consequences for failure: lessons for health care operations. Int J Qual Health Care. 2004;16(2):125-132.
HT. Review of computerized physician handoff tools for improving the quality of patient care. J Hosp Med. 2013;8(8):456-463.
J. A model for building a standardized hand-off protocol. Jt Comm J Qual Patient Saf. 2006;32(11):646-655.
ML. Development and implementation of an oral sign-out skills curriculum. J Gen Intern Med. 2007;22(10):1470-1474.
et al. Effect of bar-code technology on the safety of medication administration. N Engl J Med. 2010;362(18):1698-1707.
DL, Vander Vliet L. Relationship between medication errors and adverse drug events. J Gen Intern Med. 1995;10(4):199-205.
et al; ADE Prevention Study Group. Incidence of adverse drug events and potential adverse drug events: implications for prevention. JAMA. 1995;274(1):29-34.
et al. Medication errors and adverse drug events in pediatric inpatients. JAMA. 2001;285(16):2114-2120.
et al. Effect of reducing interns’ work hours on serious medical errors in intensive care units. N Engl J Med
. 2004;351(18):1838-1848.PubMedGoogle ScholarCrossref
et al. The Critical Care Safety Study: The incidence and nature of adverse events and serious medical errors in intensive care. Crit Care Med
. 2005;33(8):1694-1700.PubMedGoogle ScholarCrossref
DW. Adverse drug events and medication errors: detection and classification methods. Qual Saf Health Care
. 2004;13(4):306-314.PubMedGoogle ScholarCrossref
et al. Primary care physician time utilization before and after implementation of an electronic health record: a time-motion study. J Biomed Inform
. 2005;38(3):176-188.PubMedGoogle ScholarCrossref
J. GEE with Gaussian estimation of the correlations when data are incomplete. Biometrics
. 2000;56(2):528-536.PubMedGoogle ScholarCrossref
PJ. Temporal trends in rates of patient harm resulting from medical care. N Engl J Med
. 2010;363(22):2124-2134.PubMedGoogle ScholarCrossref
et al; PILL-CVD (Pharmacist Intervention for Low Literacy in Cardiovascular Disease) Study Group. Effect of a pharmacist intervention on clinically important medication errors after hospital discharge: a randomized trial. Ann Intern Med
. 2012;157(1):1-10.PubMedGoogle Scholar
GY. Dropping the baton: a qualitative analysis of failures during the transition from emergency department to inpatient care. Ann Emerg Med
. 2009;53(6):701-710.e704.PubMedGoogle ScholarCrossref
et al. Handoffs causing patient harm: a survey of medical and surgical house staff. Jt Comm J Qual Patient Saf
. 2008;34(10):563-570.PubMedGoogle Scholar
TH. Does housestaff discontinuity of care increase the risk for preventable adverse events? Ann Intern Med
. 1994;121(11):866-872.PubMedGoogle ScholarCrossref
CA. A randomized, controlled trial evaluating the impact of a computerized rounding and sign-out system on continuity of care and resident work hours. J Am Coll Surg
. 2005;200(4):538-545.PubMedGoogle ScholarCrossref
et al; Safe Surgery Saves Lives Study Group. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med
. 2009;360(5):491-499.PubMedGoogle ScholarCrossref
et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med
. 2006;355(26):2725-2732.PubMedGoogle ScholarCrossref
T. Using a bundle approach to improve ventilator care processes and reduce ventilator-associated pneumonia. Jt Comm J Qual Patient Saf
. 2005;31(5):243-248.PubMedGoogle Scholar
AD. “July effect”: impact of the academic year-end changeover on patient outcomes: a systematic review. Ann Intern Med
. 2011;155(5):309-315.PubMedGoogle ScholarCrossref
et al. Effect of the 2011 vs 2003 duty hour regulation-compliant models on sleep duration, trainee education, and continuity of patient care among internal medicine house staff: a randomized trial. JAMA Intern Med
. 2013;173(8):649-655.PubMedGoogle ScholarCrossref