Figure 2 legend: Stratified by group type (resident vs faculty surgeon), survey responses to assess acceptability and use of the electronic medical record tool are displayed for all quantitative questions (A and C, residents; B and D, faculty members), as well as direct quotes from open-ended text responses that demonstrate main themes (E, residents; F, faculty members). OR indicates operating room.
Stickrath E, Rockhill K, Zuhn H, Alston MJ. Feasibility of Integration of Resident Surgical Evaluations Into the Electronic Medical Record. JAMA Surg. 2020;155(7):665–667. doi:10.1001/jamasurg.2020.0845
Evaluation of a resident’s surgical performance is key to the developing surgeon’s education; however, collecting surgical feedback can be problematic for residents and faculty members alike. This study aims to describe how a surgical evaluation tool can be presented to surgeons in a new way through integration into the electronic medical record.
This descriptive study was conducted in an academic safety-net hospital from February 2019 through June 2019. A novel tool was developed within the Epic electronic medical record (EMR) system that generated an in-basket message (Figure 1) to the faculty surgeon of record on case completion. The tool was created by a physician builder in the EMR with the assistance of an EMR analyst; building and testing it required approximately 15 hours of personnel time. The message contained a link to complete a surgical evaluation via Qualtrics (SAP), an outside survey platform. The evaluation consisted of the previously validated Zwisch scale,1 with 2 additional questions allowing free-text feedback on resident performance. When the evaluation was completed, an email was instantly generated to the operating resident, providing nearly real-time feedback. At the conclusion of the study, the proportion of completed evaluations among all evaluations generated was calculated. After the study was completed, an anonymous survey was sent to the faculty surgeons and residents to assess the acceptability of the tool and its effect on the amount of feedback given and received. This project was reviewed by the Colorado Multiple Institutional Review Board and determined to be exempt. Each participant provided informed consent with the completion of the survey. Data analysis was completed with SAS version 9.4 (SAS Institute).
A total of 17 operating surgeons and 37 residents participated in this study. During the study period, 724 operations were performed in the Department of Obstetrics and Gynecology at Denver Health Medical Center. Of the evaluation requests generated via the in-basket, 552 (76.2%) were completed. At the individual level, 14 of 19 faculty members (74%) completed at least 80% of the evaluations they received, whereas the completion rate was at or near 0 for 3 faculty members.
Poststudy surveys were completed by 26 of 27 residents (96%) and 14 of 17 faculty members (82%). Nearly all residents (26 [96%]) reported that they received more feedback (based on responses of “much more” or “somewhat more”) because of this tool (Figure 2). Residents also felt favorably about receiving feedback through this mechanism (23 [85%], combining the responses “like a great deal” and “like somewhat”). Among faculty, 13 of 14 (93%) found the tool acceptable to use, but only 3 (21%) felt that they gave more feedback as a result of this evaluation tool.
To our knowledge, this is the only study to date describing the integration of an evaluation tool into the EMR. Our study demonstrated not only that it is possible to build an evaluation tool into the EMR, but also that faculty members were likely to complete the evaluations (76.2% of evaluations) and to find the tool acceptable to use (93% of respondents). Unlike other previously developed tools,2 this evaluation tool falls within the workflows of both surgeons and residents and does not incur additional costs to training programs.
The strengths of this study include its novelty, as well as the relatively high volume of cases and the number of evaluations generated. Additionally, all poststudy resident survey responses were kept confidential to limit any social desirability bias arising from the learner-teacher dynamic. Because this was a feasibility study, the reliability and validity of resident performance assessments made through this tool were not evaluated.
Based on the high response rate seen in this study, there appears to be an advantage to presenting evaluation requests within the EMR. This method of collecting feedback for resident learners could be applied broadly, because it was found to be both feasible to implement and acceptable to users.
Corresponding Author: Elaine Stickrath, MD, Department of Obstetrics and Gynecology, Denver Health Medical Center, University of Colorado, Denver, 790 Delaware St, Denver, CO 80204 (email@example.com).
Published Online: May 13, 2020. doi:10.1001/jamasurg.2020.0845
Author Contributions: Dr Stickrath had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: Stickrath, Zuhn, Alston.
Acquisition, analysis, or interpretation of data: Stickrath, Rockhill, Alston.
Drafting of the manuscript: Stickrath, Rockhill, Alston.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Rockhill.
Administrative, technical, or material support: Stickrath, Zuhn, Alston.
Supervision: Stickrath, Alston.
Conflict of Interest Disclosures: None reported.