Figure. Percentages and standard errors of the 10 possible briefing and debriefing components completed during each phase, and the performance improvement initiatives implemented over the course of the study.
Anderson CI, Gupta RN, Larson JR, et al. Impact of Objectively Assessing Surgeons’ Teaching on Effective Perioperative Instructional Behaviors. JAMA Surg. 2013;148(10):915–922. doi:10.1001/jamasurg.2013.2144
Importance
Advancing surgical technology and decreasing resident learning hours have limited exposure to perioperative training, necessitating more effective and efficient perioperative teaching by faculty surgeons. Participation in collaborative efforts and process improvement can change behaviors and enhance teaching.
Objective
To promote deliberate teaching of residents, change resident perception of their teachers, and produce sustainable improvements by objectively measuring surgeons’ perioperative teaching performance.
Design, Setting, and Participants
This 3-phase observational study of surgeons’ perioperative teaching behaviors included university-based surgeons, general surgery residents, and preclinical student observers and involved elective cases at a 600+ bed tertiary hospital. Initially, we measured teaching behaviors by surgeons unaware of study objectives, provided aggregate and confidential individual feedback, and developed standardized preoperative briefings and postoperative debriefings. Phase 2 applied a deliberate teaching model and reinforced behaviors with continuous process improvement efforts (Plan, Do, Check, Act) and repeat observations. Phase 3 used resident prompts to enhance teaching behaviors and demonstrate sustainability. Resident surveys conducted 3 times assessed perceptions of deliberate guidance by faculty when compared with national benchmarks.
Intervention
Introduction of deliberate faculty preprocedural focusing and postprocedural reinforcement to facilitate resident learning.
Main Outcomes and Measures
More frequent and complete perioperative teaching by faculty and the perception of enhanced teaching by residents.
Results
Faculty more commonly and more completely performed the 10-step preoperative briefings and postoperative debriefings (P < .001) during phase 2 (250% improvement over baseline). Intraoperative teaching styles significantly improved, and residents’ survey-reported assessments of faculty teaching improved over national data for describing procedural steps (P = .02) and requests for resident self-evaluation (P = .006).
Conclusions and Relevance
Objective recording of teaching behavior frequency motivated adoption of deliberate guided teaching behaviors by surgeons, resulting in both subjective reports by residents of more frequent teaching and objective recording of parallel improvements. A deliberate focus on objectively assessing surgeon educators’ periprocedural teaching may motivate improved teaching.
Surgical proficiency comes with time, practice, graduated responsibility, and guided instruction.1-3 Case numbers have been the gold standard since Halsted introduced the apprenticeship model.4-6 New challenges in surgical education suggest this model is becoming outdated.7 Compliance with external regulations and programmatic requirements requires extensive nonoperative time for faculty surgeons and residents. Advancing surgical technologies and increased specialization add complexity to the curriculum, further limiting resident operative exposure to both these new case types (eg, robotics) and more traditional cases (eg, open cholecystectomy).6 Heightened emphasis on operating room efficiencies and perioperative safety processes, although important, may further diminish intraoperative teaching time.8-13 Reduced resident work hours and time for simulation training may also reduce intraoperative exposure, leaving many residents feeling unprepared to become independent surgeons.6,14,15 Given these challenges, the traditional apprenticeship model alone may no longer adequately allow expert surgeons the time to teach surgical proficiency to an apprentice in the same manner in which surgeons were traditionally trained.7 Consequently, it has become increasingly desirable that each resident operative experience include a deliberate, structured, and effective educational interface with a teaching surgeon.16
Surgical educators have long recognized the problem of unstructured perioperative teaching and have developed instruments (eg, checklists and global rating scales) to guide faculty, prescribing what should be taught and how to assess proficiency.17-20 Faculty development courses have been designed to assist surgeons in becoming better teachers, but training is minimal and often lacks the consistent reinforcement needed to produce lasting change.1,3,5,21,22 Furthermore, if surgeons are not actively involved in the development of new approaches to teaching or if the faculty lack motivation to change, faculty buy-in is impaired and the best intentions fall short of sustainable improvement.22,23 Despite these efforts, reports continue that modern residents do not believe that they are being taught as well as they would like and nationally rate their attendings’ teaching as less effective than attendings’ self-ratings.24-28
Therefore, today’s educators recommend effective deliberate perioperative instruction, exchanging pure discovery learning for guided learning.21,29,30 The briefing, intraoperative teaching, and debriefing model is one tool for guided discovery learning, in which knowledgeable experts help trainees establish preoperative learning objectives based on the student’s needs and then offer postoperative debriefings with opportunities for self-reflection and feedback.29,31-33 Studies that quantify surgeons’ communication styles and teaching behaviors into categories (eg, informing and questioning) have offered further insight into how an expert imparts knowledge and skills to a learner.1,3,5,16,34 Unfortunately, to our knowledge, the actual impact these methods may have on teaching performance in the operating room has not been systematically or consistently assessed.
Therefore, we began by objectively assessing current operating room teaching behaviors, observing faculty surgeons who were blinded to the study at that time. We recorded their use of preoperative assessment and postoperative feedback, as well as their intraoperative teaching styles. We hypothesized that objectively measuring our teachers’ performance and providing them with iterative feedback through proven process improvement methods8,35-38 would promote deliberate teaching and enhance the use of modeled teaching behaviors, as well as enhance residents’ perceptions of the quality of our teaching. We further hypothesized that these efforts could be sustained over time.
The study was conducted in 3 phases. During the initial phase, surgeons consented to allowing students to observe their cases but were blinded to study objectives. Baseline data were collected, aggregate results reviewed with faculty and residents, and individual performance data confidentially shared with each surgeon. Faculty then helped design a preoperative briefing, including identification of resident learning objectives, and a postoperative debriefing. During the second phase, the teaching model was implemented and serial observations of our faculty and residents were conducted over the next 6 months with process improvement methods (Plan, Do, Check, Act)38 applied to facilitate change. Aggregate results over these 6 months were compared with baseline results from phase 1. The third phase of study was initiated with a new academic year and included periodic observations (C.I.A. and 1 student) to replace the original students who were no longer available to participate.
Eight students (5 premedical and 3 preclinical) attended training sessions to familiarize them with the operating room environment and enhance their observational skills. None had previous exposure to the operating room. Students independently completed the observation tool while observing 2 video recordings of a surgeon and a resident interacting in a mock operating room. One video exhibited good teaching behaviors and one portrayed poor teaching behaviors. The students then met to compare scoring variations and rehearsed various operating room scenarios until they were consistently able to complete the tool with minimal variability. Students were instructed to keep confidential both their instructions and the data they were collecting. To avoid any potential bias, medical students entering their third year clerkships were no longer used as observers. The Michigan State University institutional review board approved the study after discussing ethical issues surrounding deception (blinded phase) and confidentiality (individual performance).
Two faculty members and 2 senior residents reviewed the surgical29,31 and education literature,39 data reported from a previously validated survey instrument,24 and additional studies on learning and skill acquisition2,40,41 that compared pure discovery learning with guided learning in nonmedical fields. We could not identify a universal or standardized tool for measuring surgical teaching behavior in the operating room. An iterative process was used by the group until agreement was reached on the components of a final observation tool that included the occurrence of 15 potential communication elements (6 related to general communication style and 9 to procedural teaching) occurring between the resident and the surgeon during each operation (Table 1). Ten components of a conceptual model developed by Roberts et al29 to enhance learning in the operating room were chosen as a framework for the surgeon to conduct a preoperative briefing and a postoperative debriefing with the resident for each case. Although Roberts’ model had not previously been validated to determine whether learning had actually occurred, we elected to adopt the model and focus on introducing a process (not unlike preoperative time-outs)42,43 that would encourage at least some form of communication between the attending and the resident.
We conducted preliminary observational testing to clarify the terms used. Observers were then asked to record when they heard the attending surgeon and the resident discuss any of the 10 components of the preoperative briefing and postoperative debriefing, as well as the intraoperative observable behaviors (Table 1). The type of procedure, case duration, and resident training level were also recorded. We validated our observation tool by demonstrating low interobserver variability: 21 initial cases were observed simultaneously by 2 students, and agreement assessed with the Cohen κ yielded a mean value of 0.78, indicating very good agreement.44 The occurrence of teaching behaviors before and after introduction of an intervention was analyzed by the χ2 test (P ≤ .05).
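The Cohen κ used above corrects raw percent agreement between two observers for the agreement expected by chance alone. A minimal sketch of the calculation for two observers’ binary ratings follows; the observer data are hypothetical, not the study’s:

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance-expected agreement from each rater's marginal frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] / n * freq_b[c] / n
              for c in freq_a.keys() | freq_b.keys())
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: did each observer record a given teaching
# behavior (1) or not (0) across 10 cases?
obs_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
obs_b = [1, 1, 0, 1, 1, 1, 1, 0, 0, 1]
print(round(cohen_kappa(obs_a, obs_b), 2))  # 0.52
```

Values of κ above about 0.6 are conventionally read as good agreement, so the study’s mean of 0.78 supports the claim of low interobserver variability.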
General surgery residents were asked to complete a 26-question, previously validated, Likert-based survey24 regarding the frequency of deliberate perioperative teaching guidance provided by faculty surgeons. The survey was administered 3 times: first during our initial phase prior to any feedback; second after study results had been unblinded to residents and surgeons but before any planned intervention; and finally 6 months later, at the end of the iterative process-improvement cycle. Survey results were compared via the Kruskal-Wallis test, with significance set at P < .05. Our faculty had previously participated in the same published survey, but we were unable to separate our surgeons’ responses from those of the larger group because of anonymity. We chose not to conduct another baseline survey of our own surgeons to avoid calling further attention to their teaching practices and biasing our ability to objectively observe baseline teaching behaviors during phase 1.
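The Kruskal-Wallis test compares Likert responses across the 3 survey administrations by ranking all responses together and asking whether the per-group rank sums differ more than chance would allow. A pure-Python sketch of the H statistic is below; it omits the tie correction that a full implementation such as scipy.stats.kruskal applies, and the sample data are hypothetical:

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic (no tie correction)."""
    # Pool every observation with its group index, then sort by value
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    # Assign ranks, averaging over runs of tied values
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + j + 1) / 2  # mean of 1-based ranks i+1..j
        i = j
    # Sum the ranks within each group
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return 12 / (n * (n + 1)) * sum(
        rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)

# Hypothetical Likert responses from the three survey administrations
h = kruskal_h([[2, 3, 3, 2], [3, 4, 3, 4], [4, 5, 4, 5]])
```

H is referred to a χ2 distribution with (number of groups − 1) degrees of freedom to obtain the P value.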
We observed 263 electively scheduled operative cases on weekdays between 7:30 AM and 8:05 PM, representing 374 observation hours. Phase 1 included 38 participants (8 students, 11 surgeons, and 19 residents), and phase 2 included 8 students, 13 surgeons, and 20 residents. Sixty-three percent were common to both phases. Phase 3 included 2 observers, 11 surgeons, and 14 residents. Thirty-five percent of all participants were common to all 3 phases.
The briefing and debriefing elements (4 briefing and 6 debriefing) were aggregated into a mean score of 10 possible teaching elements (Figure). During phase 1, a 9-month baseline mean (SD) of 2.73 (0.01) of the 10 elements occurred for each case observed. During the following month, faculty received personalized confidential feedback of their teaching performance and both surgeons and residents reviewed the aggregate data. Input was solicited from both groups to assist efforts to improve the frequency of modeled teaching behaviors (bottom of the Figure). Surgeons and residents were informed that students would be returning to the operating room to observe additional cases and would prompt surgeons for the desired behaviors if not witnessed.
Overall, the mean number of the 10 teaching elements observed following the intervention increased serially and substantially in parallel with phase 2 process-improvement efforts. During the first month of reobservations, we observed a 104% increase over baseline in the completeness of briefings and debriefings performed. From that point through the conclusion of phase 2 in June, we identified gradual improvements in the completeness of our 10-step process, and the Plan-Do-Check-Act cycle was repeatedly applied to monitor adherence to the recommended plan. Faculty surgeons improved at a steady and similar rate in the frequency and completeness of the 10-step process during phase 2, demonstrating gradual improvement until June 2012, when a decline was observed. Faculty convened to reevaluate efforts and suggested use of a checklist for incoming residents to record the number of steps discussed with their surgeon, as well as continuation of periodic observations and progress reports. Phase 3 monitoring began in July. During the next 4 months, surgeons completed a mean of 92.5% of the specified briefing and debriefing elements, an overall improvement of 239% compared with their preintervention baseline scores. The need for observer prompting declined steadily between January and June 2012, until prompting became unnecessary (Figure).
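The 239% figure follows directly from the means reported here: a baseline of 2.73 of 10 elements is 27.3% completion, against 92.5% in phase 3. A quick check of the arithmetic, using only numbers stated in the text:

```python
# How the ~239% improvement figure follows from the reported means
baseline_elements = 2.73   # mean elements completed per case, phase 1 (of 10)
phase3_completion = 92.5   # mean % of elements completed, phase 3

baseline_pct = baseline_elements / 10 * 100            # 27.3% at baseline
improvement = (phase3_completion - baseline_pct) / baseline_pct * 100
print(round(improvement))  # 239
```

The abstract’s rounder “250%” refers to the same comparison, described in the Discussion as “nearly 250%.”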
We also analyzed each briefing and debriefing element individually, comparing aggregate data collected during each phase. Not only did the overall percentage of targeted elements discussed increase (Figure), but the mean (SD) percentage of preoperative briefing elements discussed increased from 33.9% (2.5) to 95.5% (1.5), while the mean (SD) percentage of debriefing elements discussed increased from 10.6% (2.7) to 90.2% (2.5) (P < .001 for each, not shown). Indeed, we observed a statistically significant increase in the use of every individual element (P < .001; Table 1).
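The element-by-element before/after comparisons used the χ2 test on occurrence counts. For a 2×2 table (element discussed or not, phase 1 vs phase 2), the Pearson statistic reduces to a closed form; the counts below are hypothetical, chosen only to mirror the scale of change reported, not the study’s raw data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 contingency table:
             discussed   not discussed
    phase 1      a             b
    phase 2      c             d
    """
    n = a + b + c + d
    # Closed form: n*(ad - bc)^2 / (product of row and column marginals)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical: an element discussed in 30/100 cases before the
# intervention and 90/100 cases after
stat = chi2_2x2(30, 70, 90, 10)
print(stat)  # 75.0, far above the 1-df critical value of 10.83 for P = .001
```

With 1 degree of freedom, any statistic above 10.83 yields P < .001, consistent with the significance level reported for every individual element.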
Our intervention also changed intraoperative teaching behavior (Table 2). The general communication style of the surgeon changed significantly in 4 of 6 components. Notably, we observed a statistically significant decline in the incidence of idle or unstructured conversation between the resident and the surgeon during each case (P < .001). Surgeons improved significantly in all elements of procedure-specific teaching behaviors except talking a resident through the steps of a procedure (which was already performed in 89.5% of cases at baseline).
Although many of our residents had participated in our previously published national survey of resident perceptions of faculty perioperative teaching,24 we had graduated a class of residents and acquired new interns since that study. Therefore, we resurveyed our current residents 3 times: once during the baseline period, once prior to implementation of the intervention, and again at the conclusion of the study, comparing these results with the national study (Table 3). Only nonverbal feedback (P = .03) was identified by our residents at baseline as occurring more frequently than had been reported in national results. The January repeat survey did not demonstrate any difference in resident perceptions of faculty teaching or attitudes from the previous baseline (data not shown). However, when the November preintervention survey results were compared with responses received at the conclusion of the study, our residents reported a significantly higher frequency of faculty describing procedural steps and asking residents to evaluate their own performance. In addition, significant improvement over national survey results was identified in 6 other survey questions following the intervention.
By offering and motivating adoption of a preoperative and postoperative educational briefing/debriefing structure, we not only were able to consistently remind faculty surgeons that they needed to teach, but we also reminded residents what they needed to learn. Reminders have been studied in other attempts to change physician clinical behavior with mixed results,1,45,46 but here we not only added a reminder but changed the culture so that participants reminded themselves. By combining a continuous process improvement approach with confidential objective individualized feedback, we achieved substantial, progressive, and sustainable improvement in both the frequency of desirable preoperative and postoperative teaching behaviors by faculty and the frequency of desirable behaviors during the procedure itself. Furthermore, the progressive increase in these theoretically desirable teaching behaviors was accompanied by a substantial and statistically significant improvement in resident perceptions of teaching by their faculty.
Fundamental to our success was the collection of objective baseline data followed by a nonpunitive approach of providing individual feedback results to each surgeon regarding their performance. We challenged our faculty with the results of our local resident survey, which suggested that they were no better than the national average, and following our multifaceted approach to change, we observed an improvement in teaching behaviors of nearly 250% over baseline. We capitalized on the fact that our surgeons knew they were being watched and used this as a tool for change, unlike many studies in which the Hawthorne effect is considered an unavoidable observation bias.47,48 We also demonstrated our commitment to promoting lasting change by paying close attention to monthly reports and seeking surgeon input to continue the momentum over time. Whether these changes can be sustained over even longer periods remains an open question awaiting further study.
Establishing preoperative briefings to identify learning objectives also changed intraoperative behaviors. We observed fewer idle or unstructured conversations and more deliberate teaching. In addition, faculty were observed to more frequently direct their intraoperative teaching and procedural teaching styles to meet the residents’ learning needs. For example, when a resident identified a personal learning objective to improve his dissection technique during thyroidectomy, intraoperative instruction sought to teach the learner new approaches (eg, consider a right angle). The debriefings that we used concluded the learning cycle by connecting the learner’s specific needs with the expert’s evaluation. In the thyroid case described here, the resident indicated he had been “too rough” with the tissues, and the surgeon followed with a demonstration of deliberate, precise hand motions. We believe that the combined effect of structured briefings and debriefings and the introduction of learner-directed intraoperative learning objectives focused our surgeons on resident learning needs and made them better teachers.
The gradual adoption of the briefing/debriefing model also reminded residents what they needed to learn. The new generation of residents is more heterogeneous than its predecessors, with different learning needs.49 Residents were asked to participate in their own learning: to identify personal weaknesses, focus their learning where they perceived it to be most needed, and reflect on their operative performance. This new process was initially uncomfortable and awkward for both residents and surgeons, but as deliberate teaching became routine, the discomfort soon dissipated. Studies have shown that residents may not easily recognize educational experiences when they see them.24 However, the consistency in our approach to teaching resulted in residents equating the instruction provided with enhanced educational effort by faculty. In fact, when our interns asked surgeons to complete the briefing and debriefing checklist, they viewed our process as the norm, not as a research experiment.
Despite our documentation of more frequent good teaching behavior, it must be acknowledged that we cannot present objective data that our trainees have experienced better learning. The assessment of skill acquisition in a complex surgical environment remains a problem for all surgical training. As a discipline, we remain uncertain about how resident learning actually occurs or how to prove it. However, the teaching behaviors that we have increased have been shown to promote learning in other environments.21,22,33,34 Furthermore, in parallel with the increases in preoperative, intraoperative, and postoperative teaching behaviors by the faculty that we achieved, our residents not only responded that they were being taught more often than had occurred before the intervention, but also now rated their educators as significantly better than those described in the national survey results. Although improvements in resident perceptions are admittedly subjective, such improvements are themselves valuable.
Of significant importance to our success was our ability to recognize the value of investing in individuals who were willing to alter behaviors, as well as understanding change and process improvement principles. Furthermore, by preferring intrinsic rather than extrinsic motivators, we appealed to attendings’ values as educators and promoted a collective culture of change. Consistent with the factors Centra50 described as necessary to improve teaching, we found that by objectively and confidentially assessing our faculty, we were able to provide attendings with new and credible knowledge about their personal teaching behaviors not previously available through standard faculty evaluations. We introduced proven teaching strategies, used faculty input to tailor our planned interventions, and reinforced their efforts with continuous feedback38 on the plan they had established. By understanding change processes, we were able to refocus and regain momentum, particularly when our improvement efforts declined in June and we were reminded of just how quickly process improvement efforts can fail if left unmonitored. For example, data reported from our observers in June indicated that prompting the surgeons to conduct briefings and debriefings had become unnecessary. On further review, we realized that the decline reflected not a lack of prompting, but less complete briefings and debriefings. Once this drop-off was recognized, faculty and residents elected to complement student prompting with resident prompting using a checklist that served as a visual prompt to faculty to have a conversation with their resident before and following each case. Each surgeon who was given a form agreed to complete it, and over the next several months, residents reported that the frequency and thoroughness of each briefing and debriefing improved regardless of whether the visual paper prompt was brought to the surgeon’s attention. We believe this combination of prompting approaches contributed to the sustainability of our efforts.
Certainly, this study has limitations. The loss of one-third of our study participants during phase 2 could have biased our results. However, when we reanalyzed the data including only those participating in both phases, we still found statistically significant improvements in both the number and completeness of briefings and debriefings following the intervention (data not shown). It is also possible that deliberate perioperative teaching may have occurred outside the view of student observers, leading us to underestimate faculty teaching. However, our observers clearly documented even more teaching after our intervention cycle. Moreover, the fact that our residents perceived more teaching by surgeons following the intervention suggests that this change in observed teaching behavior was meaningful to our trainees. We did not assess whether resident year of training or case complexity influenced our results because this would have required a much larger case volume for statistical reliability, as well as an objective and standardized index of case complexity. However, the extent to which these affect optimal modalities for perioperative education remains an important subject for future research. Teaching during elective daytime cases certainly may have differed from teaching during emergencies at night. Finally, the sustainability of our momentum is no trivial matter, and the continuous use of students for observation purposes is not feasible. However, by keeping the topic of perioperative teaching alive at multiple venues (eg, faculty meetings and grand rounds) and continuing with episodic random monitoring, we believe we can maintain our momentum and continue to improve our culture.
Viable alternatives to the time-honored apprenticeship model of surgical training have yet to be developed, although the challenges faced by today’s educators require us to transition from pure discovery learning to guided learning. We were able to both define a method to motivate such a transition and create sustained improvement in its adoption, at least across 2 different academic years. Objective recording of teaching behaviors’ frequency and style, serial feedback, and process improvement efforts motivated the gradual adoption of deliberate guided teaching behaviors by surgeons, resulting in both more frequent and complete teaching and improved subjective perceptions of faculty teaching by residents. Although every residency is different, the combination of resident perceptions and validating objective data can motivate any surgeon to improve his or her teaching if given the opportunity. In the end, it may not matter how we improve our teaching behaviors, only that we do.
Corresponding Author: Cheryl I. Anderson, RN, BSN, MSA, Department of Surgery, College of Human Medicine, Michigan State University, 1200 E Michigan Ave, Ste 655, Lansing, MI 48912 (email@example.com).
Accepted for Publication: January 25, 2013.
Published Online: August 14, 2013. doi:10.1001/jamasurg.2013.2144.
Author Contributions: Study concept and design: Anderson, Kwiecien, Lake, Tanious, Basson.
Acquisition of data: Anderson, Gupta, Larson, Abubars, Kwiecien, Lake, Hozain, Tanious, O’Brien.
Analysis and interpretation of data: Anderson, Kwiecien, Tanious, O’Brien, Basson.
Drafting of the manuscript: Anderson, Larson, Abubars, Kwiecien, Lake, Hozain, Basson.
Critical revision of the manuscript for important intellectual content: Anderson, Gupta, Tanious, O’Brien, Basson.
Statistical analysis: Anderson, Kwiecien, O’Brien, Basson.
Administrative, technical, and material support: Anderson, Larson, Kwiecien, Lake, Tanious, O’Brien, Basson.
Study supervision: Anderson, Gupta, Kwiecien, Hozain, Basson.
Conflict of Interest Disclosures: None reported.
Additional Contributions: We thank Alan T. Davis, PhD, for his statistical expertise, Rebecca C. Henry, PhD, for her editorial knowledge, and students Aaron M. Brown and Chelsea J. Henderson for their assistance.