Original Investigation
May 2016

Measuring Nontechnical Aspects of Surgical Clinician Development in an Otolaryngology Residency Training Program

Author Affiliations
  • 1Department of Otolaryngology–Head and Neck Surgery, Harvard Medical School, Boston, Massachusetts
JAMA Otolaryngol Head Neck Surg. 2016;142(5):423-428. doi:10.1001/jamaoto.2015.3642
Abstract

Importance  Surgical competency requires sound clinical judgment, a systematic diagnostic approach, and integration of a wide variety of nontechnical skills. This more complex aspect of clinician development has traditionally been difficult to measure through standard assessment methods.

Objective  This study was conducted to use the Clinical Practice Instrument (CPI) to measure nontechnical diagnostic and management skills during otolaryngology residency training; to determine whether there is demonstrable change in these skills between residents who are in postgraduate years (PGYs) 2, 4, and 5; and to evaluate whether results vary according to subspecialty topic or method of administration.

Design, Setting, and Participants  Prospective study using the CPI, an instrument with previously established internal consistency, reproducibility, interrater reliability, discriminant validity, and responsiveness to change, in an otolaryngology residency training program. The CPI was used to evaluate progression in residents’ ability to evaluate, diagnose, and manage case-based clinical scenarios. A total of 248 evaluations were performed in 45 otolaryngology resident trainees at regular intervals. Analysis of variance with nesting and postestimation pairwise comparisons were used to evaluate total and domain scores according to training level, subspecialty topic, and method of administration.

Interventions  Longitudinal residency educational initiative.

Main Outcomes and Measures  Assessment with the CPI during PGYs 2, 4, and 5 of residency.

Results  Among the 45 otolaryngology residents (248 CPI administrations), each underwent a mean (SD) of 5 (3) administrations (range, 1-4) during training. Total scores differed significantly among PGY levels, with lower scores at the PGY-2 level (44 [16]) compared with the PGY-4 (64 [13]) or PGY-5 level (69 [13]) (P < .001). Domain scores related to information gathering and organizational skills were acquired earlier in training, whereas knowledge base and clinical judgment improved later in residency. Trainees scored higher in general otolaryngology (mean [SD], 72 [14]) than in subspecialties (range, 55 [12], P = .003, to 56 [19], P < .001). Neither administering the examination with an electronic scoring system, rather than a paper-based scoring system, nor the calendar year of administration affected these results.

Conclusions and Relevance  Standardized interval evaluation with the CPI demonstrates improvement in nontechnical diagnostic and management capabilities as PGY levels advance.
