Integrating Morbidity and Mortality Core Competencies and Quality Improvement in Otolaryngology | Medical Education and Training | JAMA Otolaryngology–Head & Neck Surgery | JAMA Network
Figure.  Pretest and Posttest Questionnaire Results

Each bar corresponds to the frequency of a given response (x-axis). For example, 4 of the responses from faculty on the pretest questionnaire had a value of 1.

Table 1.  Regression Results for Questionnaire Scale Responses
Table 2.  Pretest and Posttest Questionnaire Ranked Response Summary Statistics
Table 3.  Regression Results for Questionnaire Ranked Responses
Table 4.  Regression Results for Questionnaire Scale Responses
Original Investigation
February 2017

Integrating Morbidity and Mortality Core Competencies and Quality Improvement in Otolaryngology

Author Affiliations
  • 1Department of Otolaryngology–Head and Neck Surgery, San Antonio Uniformed Services Health Education Consortium (SAUSHEC), Fort Sam Houston, Texas
  • 2Department of Otolaryngology–Head and Neck Surgery, Massachusetts Eye and Ear Infirmary, Boston
  • 3782nd Military Intelligence Battalion, Detachment Texas, Fort Sam Houston, Texas
JAMA Otolaryngol Head Neck Surg. 2017;143(2):135-140. doi:10.1001/jamaoto.2016.2910
Key Points

Question  What is an effective and efficient method of integrating the Accreditation Council for Graduate Medical Education (ACGME) core competencies and quality improvement and patient safety (QI/PS) into an otolaryngology morbidity and mortality (M&M) conference?

Findings  In this cohort study with intervention, using the situation, background, assessment, and review/recommendations (SBAR) tool allowed for a significant improvement in resident M&M presentations over time. In addition, the integration of patient safety reports, otolaryngology-specific quality metrics, plan, do, study, act cycles, and evaluation of M&M cases with regard to the core competencies provided a comprehensive M&M curriculum that is well received by faculty and residents and emphasizes the importance of QI/PS.

Meaning  The described M&M curriculum and SBAR tool may be a useful and efficient method for incorporating the ACGME core competencies and QI/PS into an otolaryngology residency program.

Abstract

Importance  To date, an otolaryngology-specific morbidity and mortality (M&M) conference has never been reported or evaluated.

Objective  To propose a novel otolaryngology-specific M&M format and to assess its success using a validated assessment tool.

Design, Setting, and Participants  Preintervention and postintervention cohort study spanning 14 months (September 2014 to November 2015), with 32 faculty, residents, and medical students attending the department of otolaryngology M&M conference, conducted at the San Antonio Uniformed Services Health Education Consortium.

Interventions  A novel quality assurance conference was implemented in the department of otolaryngology at the San Antonio Uniformed Services Health Education Consortium. This conference incorporates patient safety reports, otolaryngology-specific quality metrics, and individual case presentations. The revised format integrates the Accreditation Council for Graduate Medical Education (ACGME) core competencies and quality improvement and patient safety (QI/PS) principles. This format was evaluated by faculty, residents, and medical students every other month for 14 months to assess changes in attitudes regarding the M&M conference as well as changes in presentation quality.

Results  Overall, 13 faculty, 12 residents, and 7 medical students completed 232 evaluations. Summary statistics of both resident and faculty attitudes about the success of the M&M format seemed to improve over the 14 months between the prequestionnaires and postquestionnaires. General attitudes for both residents and faculty significantly improved from the pretest to posttest (odds ratio, 0.32 per month; 95% CI, 0.29-0.35). In the pretest period, “established presentation format” was considered the most necessary improvement, whereas in the posttest period this changed to “incorporate more QI.” For resident presentations evaluated using the situation, background, assessment, and review/recommendations (SBAR) tool, all evaluations, from all participants, improved over time.

Conclusions and Relevance  The M&M conference is an essential component of all otolaryngology residency programs and provides a unique opportunity to successfully incorporate the ACGME core competencies and regularly implement QI/PS.

Introduction

The morbidity and mortality (M&M) conference, also referred to as the “golden hour” in surgical education, is a surgical subspecialty’s most important forum for the discussion of adverse events and errors.1 In 1983, the Accreditation Council for Graduate Medical Education (ACGME) mandated that all medical training programs incorporate regular M&M conferences into their educational paradigm.2 However, until the past decade, these conferences were relatively unstructured and focused predominantly on individual “mistakes” rather than system-based errors. In 1999, with the implementation of the ACGME core competencies, a shift began toward incorporating these competencies into the M&M conference.3,4 Kauffmann et al4 found that this provided an opportunity to integrate the higher order competencies, such as practice-based learning and systems-based practice, without further stretching the already limited resident education time.

The incorporation of quality improvement and patient safety (QI/PS) into otolaryngology residency programs has also been increasingly emphasized over the past several years. However, most programs continue to struggle with how to efficiently and effectively integrate these principles into already established residency curricula. The M&M conference allows for the identification of future QI opportunities for residents and faculty and fosters a culture of safety and quality in a department.5 In fact, the importance of M&M is so widely accepted that a recent survey of program directors found that 100% considered it to be an integral aspect of education in QI.6

Despite its importance, no studies could be identified that propose an otolaryngology-specific M&M curriculum. Therefore, in this article we suggest a unique M&M format for otolaryngology that incorporates both the ACGME core competencies and opportunities for QI/PS. We sought to verify the efficacy of this new format using a validated assessment tool specifically developed to identify and improve the overall quality and educational value of the surgical M&M conference.

Methods
Quality Assurance Conference Development

The original M&M format at the San Antonio Uniformed Services Health Education Consortium (SAUSHEC) Department of Otolaryngology consisted of individual oral discussions of surgical and inpatient complications from the prior 2 months. The conference consisted of a resident physician’s verbal synopsis, a brief literature review, and then an open forum faculty discussion. In October 2014, this M&M conference format was replaced with a quality assurance conference, which includes a review of hospital-wide patient safety reports (PSRs) in which any department member (resident, faculty, nursing, etc) has been involved. This is followed by a review of a standard quality metric tracked in our department: post-tonsillectomy hemorrhage rates. This metric is tracked and objectively graded based on a recent publication by Walner et al.7 Finally, 4 cases in which complications occurred are chosen by the residents and faculty for more comprehensive presentation. All mortalities (if any) are presented.

These presentations follow a structured PowerPoint format using the situation, background, assessment, and review/recommendations (SBAR) mnemonic. The SBAR tool is used by health care and military systems to standardize communication and was first used in 2009 at Oregon Health & Science University (OHSU) as a means of effectively and efficiently communicating about M&M cases.8 This process begins with a brief overview of the “situation” and the complication that is the basis for the M&M. Then the “background” of the case is explored, including the history, physical examination, and clinical course.

After this, a root cause analysis or “assessment” is performed by the resident. This requires evaluation of the complication with respect to a deficiency in at least 1 of the 6 ACGME core competencies. Finally, the presentation ends with a literature “review” of similar complications and evidence-based suggestions for future prevention. In addition, a more personal assessment and “recommendation” is often made by the resident or attending clinician of record as to how the complication could have been avoided or rectified more effectively. Each complication is also evaluated for a potential QI/PS action item. If one is identified, the resident assigned to the case initiates and completes a plan, do, study, act (PDSA) cycle as part of the QI curriculum. The resident then ideally presents the results at the quality assurance conference 4 months after the original presentation.

After the conference, each resident also submits a brief 1- to 2-page summary of the complication (Appendix A in the Supplement) identifying the core competencies that were compromised, classifying the error type based on the modified harm index presented by Shah et al,9 and recommending any future actions and/or interventions identified at the conference.

A nonresearch exemption was granted by the Department of Clinical Investigations at Brooke Army Medical Center prior to the initiation of this study, which ran from September 2014 to November 2015.

Quality Assurance Conference Assessment

Prior to the implementation of the new quality assurance conference, residents and faculty were administered a questionnaire to assess general attitudes about M&M case presentations (Appendix B in the Supplement). This same questionnaire was also administered at the end of the study period to evaluate any changes after the year-long implementation of the new SBAR system.

The SBAR evaluation tool, as developed and validated by Mitchell et al,8 was used for the individual evaluation of resident presentations. This robust assessment tool was designed at OHSU and provides a validated framework for objectively evaluating successes and deficiencies in M&M presentations. The survey was administered to every medical student, resident, and otolaryngology faculty member present at each M&M quality assurance conference during the study period (Appendix C in the Supplement). Each audience member was asked to complete the survey evaluating 1 of the 4 PowerPoint presentations. This assessment tool was administered prior to the initiation of the new curriculum (documented as time point 0 months in results) and then every 2 months after the initiation of the SBAR format (months 2-12) for 1 year.

The SBAR responses were analyzed via an ordinal logistic regression.10 This family of statistical models is custom tailored to categorical data, such as questionnaire scales. Because these models are regressions, we can include interaction effects to illuminate important differences between groups of respondents (eg, faculty vs residents) and time periods (eg, preintervention and postintervention). A similar approach was taken for the ranked responses, in the second part of the questionnaire, except a ranked version of the ordinal logit model was used.11-13
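The mechanics of this model family can be sketched in a few lines of Python. This is an illustrative sketch of the cumulative-logit probabilities an ordinal model assigns to response categories, not the analysis code used in the study; the cutpoints and linear predictor below are hypothetical.

```python
import math

def cumulative_logit_probs(eta, cutpoints):
    """Category probabilities under a cumulative-logit (ordinal) model:
    P(Y <= j) = logistic(c_j - eta); per-category probabilities are the
    successive differences of these cumulative probabilities."""
    logistic = lambda x: 1.0 / (1.0 + math.exp(-x))
    cdf = [logistic(c - eta) for c in cutpoints] + [1.0]
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

# Hypothetical example: 3 ordered cutpoints define 4 response categories.
# eta is the linear predictor (eg, role and time-period effects combined).
p = cumulative_logit_probs(eta=0.5, cutpoints=[-1.0, 0.0, 1.5])
```

Interaction effects such as role × period enter through the linear predictor, shifting the whole response distribution up or down the scale for that group.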

The ordinal logistic regressions were estimated in R using the mlogit package (version 0.2-2).14,15 Significance was assessed using the standard normal distribution, with the test statistic calculated from the parameter estimates and standard errors obtained from the logistic regression.
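That significance calculation is a standard Wald test: the z statistic is the estimate divided by its standard error, with a two-sided p value from the standard normal distribution. The sketch below is generic, not the study's code; the standard error is recovered from a reported 95% CI (SE ≈ CI width / (2 × 1.96)) purely for illustration.

```python
import math

def wald_test(estimate, std_err):
    """Two-sided Wald test: z = estimate / SE; p value from the standard
    normal CDF, computed here via the error function."""
    z = estimate / std_err
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# Example: approximate SE recovered from a 95% CI of (0.16, 1.40)
se = (1.40 - 0.16) / (2 * 1.96)
z, p = wald_test(estimate=0.78, std_err=se)  # z exceeds 1.96, so p < .05
```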

Results

Summary statistics of both resident and faculty attitudes about the success of the M&M format seemed to improve over the 14 months between the prequestionnaires and postquestionnaires. The Figure illustrates the frequency of each response for all survey questions on the 10-point Likert scale. The results are organized by role (faculty vs resident) as well as pretest vs posttest status and indicate an improvement in scaled responses.

The results from ordinal logistic regression are presented in Table 1. As anticipated from the contrast in the Figure, the posttest questionnaire responses had a much greater probability of a higher rating (log odds ratio [OR], 4.79; 95% CI, 3.89 to 5.69). Residents also responded more positively on the pretest than attending clinicians did (log OR, 0.78; 95% CI, 0.16 to 1.40). This difference disappeared in the posttest set as indicated by the “resident and posttest term” (log OR, −0.96; 95% CI, −1.6 to −0.07).
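Because these estimates are log odds ratios, mapping them (and their CI bounds) back to the odds-ratio scale is a single exponentiation; a minimal Python illustration using the posttest estimate above:

```python
import math

# A log odds ratio and its 95% CI convert to the odds-ratio scale via exp().
log_or, ci_low, ci_high = 4.79, 3.89, 5.69
odds_ratio = math.exp(log_or)                 # roughly 120
ci = (math.exp(ci_low), math.exp(ci_high))    # point estimate lies inside
```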

The second portion of the questionnaire asked faculty and residents to rank the most important elements needed in the M&M conference. Table 2 illustrates the breakdown of ranked responses by role and by pretest or posttest period. “Time-limited presentations” seemed to be of lowest importance during the pretest period. For both faculty and residents, “incorporating more quality improvement” became the most important category in the posttest.

The estimation results are given in Table 3. Positive estimates in the table correspond with categories that tended to be ranked as more important. In the pretest period, “established presentation format” was more important (log OR, 1.11; 95% CI, 0.27 to 1.98) while “time-limited presentations” were less important (log OR, −0.99; 95% CI, −1.85 to −0.16). In the posttest period, “incorporate more quality improvement” increased substantially in importance (log OR, 1.03; 95% CI, 0.19 to 1.88) while “categorize specific cause of complication” decreased in importance substantially (log OR, −0.91; 95% CI, −1.75 to −0.10).

Every 2 months, the SBAR evaluation tool was issued to all audience members at the otolaryngology quality assurance conference. Appendix D in the Supplement illustrates the distribution of scale responses broken down by role, questionnaire section, and month.

Once again, an ordinal logit modeling approach was employed to evaluate the significance of the data. The dependent variable is again the survey response, while the survey section, role, and month correspond to the variables of interest. Table 4 illustrates that residents tended to respond most positively, followed by faculty, then students. We can interpret these estimates as before: all else equal, residents were 2.18 times more likely than faculty and 2.71 times more likely than students to respond 1 value higher on the scale. In addition, ordering the most favorable to least favorable sections using the estimates from the ordinal logistic regression (controlling for month and role) is as follows: “Situation, Background, Review, Assessment, and Recommendations.” A crucial result is that all of the responses, regardless of section, increased in value over time. The estimates for each section’s monthly increase can be obtained using the following formula:

Base case (monthly) + interaction term (eg, monthly × background) = estimate for improvement.

Note that for all cases where this addition is carried out the sum is positive. The corresponding interpretation is that survey responses improved over time regardless of role or section—even though some sections increased at greater rates than others.
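That addition can be sketched directly. The coefficient values below are hypothetical placeholders, not the estimates in Table 4; the point is only the mechanics of combining the base monthly coefficient with a section-specific interaction term.

```python
# Per-section monthly trend = base "monthly" coefficient plus the
# section-specific interaction term (zero for the reference section).
coefs = {"monthly": 0.25, "monthly:background": 0.5}  # hypothetical values

def section_trend(section):
    return coefs["monthly"] + coefs.get(f"monthly:{section}", 0.0)

trend_bg = section_trend("background")   # 0.25 + 0.5 = 0.75
trend_ref = section_trend("situation")   # base case only: 0.25
```

A positive sum for every section is what supports the conclusion that responses improved over time regardless of section.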

Discussion

The M&M conference is a prime opportunity to integrate the ACGME core competencies into residency education.4 While deficiencies in medical knowledge and patient care have long been cited as the principles underlying most medical errors, considering all 6 competencies has broadened our understanding of the root causes of many surgical complications. Several general surgery programs have already documented their incorporation of these competencies into their M&M conferences.4,16 We sought to incorporate the competencies in the “assessment” phase of our case presentations. For example, a patient in the operating room (OR) after blepharoplasty developed a periorbital hematoma after coughing during extubation. On the attempt to reopen the incision to evacuate the hematoma, no surgical instruments were available because they had already been broken down by the scrub technician, who had been cleared to do so by the anesthesiologist rather than the surgeon. In this situation, the resident identified systems-based practice as the core competency that was violated, as well as a QI/PS action item. In response, we initiated a review and PDSA cycle evaluating which OR team member should be in charge of directing sterile instrument breakdown and, in turn, changed our previous practice.

The incorporation of quality improvement is also an essential component of any successful otolaryngology residency program. Not only is it mandated by the ACGME, but it is also considered by most surgical residents to be important to their training (88.1%) and future career (91.3%).17 To this point, our study found that after the implementation of the new quality assurance conference, both residents and faculty ranked the incorporation of quality improvement and patient safety as the most important component of the M&M conference. This was a stark change from the pre-SBAR ranking results and suggests that exposure to practical applications of QI potentially increases appreciation for its importance and relevance to M&M.

With M&M conferences already in place in all otolaryngology training institutions, the addition of QI/PS concepts, projects, and results can provide programs with achievable means of incorporating these QI topics into their educational framework for faculty and trainees.5 Our incorporation of a PSR review into every M&M compels residents and faculty to appreciate the importance of proactivity and transparency with regard to patient safety and provides insight into how PSRs are evaluated at both a departmental and hospital-system level. In addition, our use of otolaryngology-specific quality metrics reinforces QI/PS through yet another means.

A variety of changes have previously been suggested to enhance the surgical M&M conference. Presentation standardization, evidence-based discussion, and scheduled protected conference time have been shown to optimize its educational value.18 Moreover, our pretest questionnaire results also showed that both residents and faculty valued incorporation of a structured presentation, supporting the introduction of our standardized M&M format. In addition, the results of using the SBAR evaluation tool indicate that with the implementation of the structured format, all raters (medical students, residents, and faculty) recognized an improvement in every aspect of the M&M presentation. As one would expect, this improvement was smaller in the lower order areas of “situation,” “background,” and “review,” which purely involve data presentation and literature review, tasks with which the residents were relatively competent prior to the SBAR initiation. Alternatively, the higher order areas of “assessment” and “recommendation” showed greater improvement, likely secondary to their lower baseline and greater emphasis after SBAR implementation. Ultimately, while this format appealed to our audience, we have yet to see whether its implementation will actually result in a decrease in morbidities or PSRs and, in turn, an improvement in patient care. Another study evaluating this potential outcome is currently under way.

Limitations

This study does have some limitations. First, our investigation is a single departmental study and, therefore, relatively small. However, the overall large number of evaluations (133 pretest; 126 posttest) and participants (32) improves the power of these findings. In addition, to our knowledge, this is the first description of an otolaryngology-specific M&M format, which strengthens its importance by virtue of its novelty. Second, all faculty and resident assessors were familiar with the presenting resident, which has the potential to introduce bias in their evaluations. However, the surveys were anonymous and each assessor could assess any 1 of the 4 presentations, which lessened the chance of the assessor purely rating the resident rather than the overall implementation of the process. Furthermore, we attempted to account for this through the inclusion of the medical students, who acted as a more impartial group of assessors with less bias toward certain residents and no knowledge of their previous presentation history. Third, because there was no control group, we cannot explicitly rule out the possibility that changes occurred because of other trends in training.

Conclusions

The M&M conference is an essential component of any surgical training program. However, until now, this ubiquitous educational endeavor has never been assessed in otolaryngology. We present a unique framework that provides a guideline for the creation of a multifaceted otolaryngology-specific quality assurance conference. In addition, after implementation of our new format, we found a significant improvement in the quality and educational value of the M&M presentations. Finally, this novel framework provides opportunities for the incorporation of QI/PS and increases awareness of its essential role in a successful M&M conference.

Back to top
Article Information

Corresponding Author: Adrienne M. Laury, MD, Department of Otolaryngology–Head and Neck Surgery, San Antonio Uniformed Services Health Education Consortium (SAUSHEC), 3551 Roger Brooke Dr, ATTN: MCHE-SDT (Otolaryngology), JBSA- Fort Sam Houston, TX 78234 (Adrienne.laury@gmail.com).

Accepted for Publication: August 18, 2016.

Published Online: October 20, 2016. doi:10.1001/jamaoto.2016.2910

Author Contributions: Dr Laury had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: All authors.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Bowe, Lospinoso.

Obtained funding:

Administrative, technical, or material support: Laury, Bowe.

Study supervision: Laury, Bowe.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

Disclaimer: The views expressed herein are those of the authors and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the Surgeon General, the Department of the Army, the Department of Defense, or the US Government.

References
1.
Gordon  LA. Gordon’s Guide to the Surgical Morbidity and Mortality Conference. Philadelphia, PA: Hanley & Belfus; 1994.
2.
Essentials and Information Items, Accreditation Council for Graduate Medical Education.  Graduate Medical Education Directory 1995-96. Chicago, IL: Accreditation Council for Graduate Medical Education; 1995.
3.
Rosenfeld  JC.  Using the Morbidity and Mortality conference to teach and assess the ACGME General Competencies.  Curr Surg. 2005;62(6):664-669.PubMedGoogle ScholarCrossref
4.
Kauffmann  RM, Landman  MP, Shelton  J,  et al.  The use of a multidisciplinary morbidity and mortality conference to incorporate ACGME general competencies.  J Surg Educ. 2011;68(4):303-308.PubMedGoogle ScholarCrossref
5.
McCormick  ME, Stadler  ME, Shah  RK.  Embedding quality and safety in otolaryngology-head and neck surgery education.  Otolaryngol Head Neck Surg. 2015;152(5):778-782.PubMedGoogle ScholarCrossref
6.
Bowe  SN.  Quality improvement in otolaryngology residency: survey of program directors.  Otolaryngol Head Neck Surg. 2016;154(2):349-354.PubMedGoogle ScholarCrossref
7.
Walner  DL, Karas  A.  Standardization of reporting post-tonsillectomy bleeding.  Ann Otol Rhinol Laryngol. 2013;122(4):277-282.PubMedGoogle ScholarCrossref
8.
Mitchell  EL, Lee  DY, Arora  S,  et al.  SBAR M&M: a feasible, reliable, and valid tool to assess the quality of, surgical morbidity and mortality conference presentations.  Am J Surg. 2012;203(1):26-31.PubMedGoogle ScholarCrossref
9.
Shah  RK, Kentala  E, Healy  GB, Roberson  DW.  Classification and consequences of errors in otolaryngology.  Laryngoscope. 2004;114(8):1322-1335.PubMedGoogle ScholarCrossref
10.
Greene  WH.  Econometric Analysis. 7th ed. Boston: Pearson Education; 2012:824-827.
11.
Beggs  S, Cardell  S, Hausman  J.  Assessing the potential demand for electric cars.  J Econom. 1981;17(1):1-19.Google ScholarCrossref
12.
Chapman  RG, Staelin  R.  Exploiting rank ordered choice set data within the stochastic utility model.  J Mark Res. 1982;19(3):288-301.Google ScholarCrossref
13.
Fok  D, Paap  R, Van Dijk  B.  A rank-ordered logit model with unobserved heterogeneity in ranking capabilities.  J Appl Econ. 2012;27(5):831-846.Google ScholarCrossref
14.
Croissant  Y. Estimation of multinomial logit models in R: the mlogit packages. R package version 0.2-2. https://cran.r-project.org/web/packages/mlogit/vignettes/mlogit.pdf. Accessed March 21, 2016.
15.
R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2016. https://www.R-project.org/. Accessed March 22, 2016.
16.
Williams  RG, Dunnington  GL.  Accreditation Council for Graduate Medical Education core competencies initiative: the road to implementation in the surgical specialties.  Surg Clin North Am. 2004;84(6):1621-1646, xi.PubMedGoogle ScholarCrossref
17.
Flynn-O’Brien  KT, Mandell  SP, Eaton  EV, Schleyer  AM, McIntyre  LK.  Surgery and medicine residents’ perspectives of morbidity and mortality conference: an interdisciplinary approach to improve ACGME core competency compliance.  J Surg Educ. 2015;72(6):e258-e266.PubMedGoogle ScholarCrossref
18.
Bhalla  VK, Boone  L, Lewis  F, Gucwa  AL, Kruse  EJ.  The utility of the matrix format for surgical morbidity and mortality conference.  Am Surg. 2015;81(5):503-506.PubMedGoogle Scholar