Figure.  Resident Files That Include Documentation That Flags Were Addressed With the Resident Before and After Implementation of CBAS

Proportion by cohort of files for residents with at least 1 flagged assessment where documentation could be found indicating that the flags had been addressed with the resident. CBAS indicates Competency-Based Achievement System.

Table 1.  Demographic Characteristics of the Residents Included in the File Reviews
Table 2.  Percentages of Residents Who Received a Rating of Less Than Satisfactory on a Competency Domain on Any Summative Assessment Before vs After Implementation of CBASa
Table 3.  Proportions of Residents Within Each Cohort Who Met Criteria for Designation of Resident in Difficulty Before vs After Implementation of CBASa
Original Investigation
Medical Education
November 9, 2018

Association of a Competency-Based Assessment System With Identification of and Support for Medical Residents in Difficulty

Author Affiliations
  • 1Department of Family Medicine, University of Alberta, Edmonton, Alberta, Canada
JAMA Netw Open. 2018;1(7):e184581. doi:10.1001/jamanetworkopen.2018.4581
Key Points

Question  Is competency-based assessment associated with changes in rates of identification of and support for residents in difficulty compared with traditional assessment?

Findings  In this cohort study of 458 Canadian medical residents, there were significant reductions in the proportions of residents receiving flagged assessments on multiple rotations, reductions in proportions of residents defined as being in difficulty, and increases in documented evidence identifying that gaps were discussed with the resident following introduction of a competency-based assessment program.

Meaning  Competency-based assessment may contribute to better identification of and support for residents in difficulty.

Abstract

Importance  Competency-based medical education is now established in health professions training. However, critics stress that there is a lack of published outcomes for competency-based medical education or competency-based assessment tools.

Objective  To determine whether competency-based assessment is associated with better identification of and support for residents in difficulty.

Design, Setting, and Participants  This cohort study of secondary data from archived files on 458 family medicine residents (2006-2008 and 2010-2016) was conducted between July 5, 2016, and March 2, 2018, using a large, urban family medicine residency program in Canada.

Exposures  Introduction of the Competency-Based Achievement System (CBAS).

Main Outcomes and Measures  Proportion of residents (1) with at least 1 performance or professionalism flag, (2) receiving flags on multiple distinct rotations, (3) classified as in difficulty, and (4) with flags addressed by the residency program.

Results  Files from 458 residents were reviewed (pre-CBAS: n = 163; 81 [49.7%] women; 90 [55.2%] aged >30 years; 105 [64.4%] Canadian medical graduates; post-CBAS: n = 295; 144 [48.8%] women; 128 [43.4%] aged >30 years; 243 [82.4%] Canadian medical graduates). A significant reduction in the proportion of residents receiving at least 1 flag during training after CBAS implementation was observed (0.38; 95% CI, 0.377-0.383), as well as a significant decrease in the numbers of distinct rotations during which residents received flags on summative assessments (0.24; 95% CI, 0.237-0.243). There was a decrease in the number of residents in difficulty after CBAS (from 0.13 [95% CI, 0.128-0.132] to 0.17 [95% CI, 0.168-0.172]) depending on the strictness of criteria defining a resident in difficulty. Furthermore, there was a significant increase in narrative documentation that a flag was discussed with the resident between the pre-CBAS and post-CBAS conditions (0.18; 95% CI, 0.178-0.183).

Conclusions and Relevance  The CBAS approach to assessment appeared to be associated with better identification of residents in difficulty, facilitating the program’s ability to address learners’ deficiencies in competence. After implementation of CBAS, residents experiencing challenges were better supported and their deficiencies did not recur on later rotations. A key argument for shifting to competency-based medical education is to change assessment approaches; these findings suggest that competency-based assessment may be useful.

Introduction

Competency-based medical education (CBME) has emerged as a predominant approach to health professions education for the foreseeable future.1 Competency-based medical education has been adopted in several countries, including by the Accreditation Council for Graduate Medical Education in the United States,2 both the College of Family Physicians of Canada3 and the Royal College of Physicians and Surgeons of Canada,4 and by accrediting and/or licensing bodies in Scotland,5 the Netherlands,6 and Australia.7

Despite the widespread endorsement of CBME by many accrediting or licensing bodies in health professions training, the shift to CBME is not without controversy.8-14 Although CBME is founded in educational and assessment theory,15,16 a prevalent criticism is that there is no evidence that CBME produces safer or more competent physicians than non-CBME approaches. Authors such as Klamen et al,9 Boyd et al,10 and Whitehead and Kuper11 call attention to the gap in the literature of outcomes data for CBME, competency-based assessment tools, or programs of assessment. This pushback against CBME will grow without evidence that CBME frameworks are more effective than traditional medical education in producing competent and safe physicians.15

The initial impetus for the CBME movement was a desire to address specific concerns about the varying abilities of graduates of health professions training programs and the potential association of that variation with the quality of care that patients receive.1,15-18 Proponents of CBME argue that competency-based assessment practices can help reduce barriers to reporting residents in difficulty through “formative assessments based on identifiable criteria and repeated observations.”19 Approaches to assessment in CBME are theorized to improve documentation of feedback shared with residents.17,20,21 Increased documentation of more-detailed formative feedback should allow for easier identification of performance patterns, red flags, and trajectory of progress toward competence.22 Gruppen et al23 recently emphasized the need for studies exploring the association between CBME and the frequency of identification of residents early in training who are not yet ready to be fully trusted in independent practice.

Although most CBME approaches are in the early stages of implementation within training programs, the Competency-Based Achievement System (CBAS),24 developed in the Department of Family Medicine at the University of Alberta, Edmonton, Alberta, Canada, has been in place since 2009. Before implementation of CBAS, assessment in the family medicine residency program followed traditional assessment approaches and focused on summative end-of-rotation forms to capture expert judgments of resident competence. Teaching and learning were somewhat disconnected from assessment. In addition, most assessments used forms with rating scales and checklists as the standard tools for capturing observer judgments. Some low-stakes assessment tools (field notes25) were used to capture formative feedback shared between residents and observers in the workplace, but their use was not consistent across all preceptors.

The CBAS is designed as programmatic assessment26-28 predicated on 2 fundamentals: assessment for learning20,29,30 and regular formative feedback shared with residents (documented with low-stakes assessment tools).17,21,25 The CBAS focuses on direct observation of residents in workplace-based training. In keeping with best practices of workplace-based assessment, CBAS helps to both facilitate and capture experts’ judgment and coaching after observation of learners. The assessment tools in CBAS are designed to allow preceptors to describe what they see the residents do in the workplace and tag or sort their observations according to high-level descriptions of areas of competence in family medicine (professionalism, communication skills, clinical reasoning, medical knowledge, patient-centered care, practice management, procedural skills, and appropriately prioritizing presenting issues).24 Although the competencies being assessed were similar for our pre-CBAS vs post-CBAS cohorts, the descriptors of those competencies were changed to enhance clarity and understanding.

The CBAS offers an opportunity to address some of the criticisms of CBME, particularly the need for evidence of proof of concept for CBME (ie, Does CBME result in a different outcome than traditional medical education and assessment?). The CBAS has been the mandated assessment system in a large, 2-year residency program for 8 years, allowing for accumulation of data across cohorts over time. Most of the clinical educators have been consistent for the past 15 years (ie, were teaching and assessing pre- and post-CBAS), which allowed examination of the change in the assessment system rather than a change in those performing the assessment.

In this study, we addressed one of the core assumptions of CBME by examining the extent to which use of competency-based assessment is associated with a change in rates of identification of residents in difficulty compared with traditional assessment. We performed a secondary data analysis of archived resident files from an urban family medicine residency program to compare rates of detection of and documentation of support for residents in difficulty before and after implementation of CBAS.

Methods

This retrospective, observational cohort study used secondary data analysis of data originally collected as part of the assessment process in the 2-year family medicine residency program at the University of Alberta. Data were extracted from residents’ permanent assessment files and entered into a spreadsheet with random codes replacing names with the express purpose of protecting confidentiality and anonymity of individual residents. This study was approved by the University of Alberta Human Research Ethics Board, and that board also stated that consent was not required for this secondary data analysis study. We adhered to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline for reporting observational cohort studies.31

Cohorts of residents who started training between 2006 and 2016 were included in the study with the exception of the cohort that began in 2009. Entry cohorts before 2006 were excluded from the study owing to lack of availability of data; the 2009 entry cohort was excluded because this cohort was involved in CBAS pilot implementation. The present study was conducted between July 5, 2016, and March 2, 2018. The University of Alberta has family medicine residents in both urban (approximately 75%) and rural (approximately 25%) streams. Residents who were in the rural training program were excluded because of the heterogeneity of assessment across rural sites during the period of interest.

Demographic data included residents’ sex, residency start year (cohort membership), and age at the time of graduation from the residency program. The medical school from which a resident graduated was used to determine whether the resident was an international medical graduate (resident who completed medical school outside Canada) or a Canadian medical graduate.

Three program directors identified variables (referred to as flags) that indicated that a resident was having difficulty with 1 or more aspects of residency training. These variables are defined in the Box. These flags were identified in the resident’s file and included in the database. For all files with any flag identified, the file was further reviewed for evidence that a program representative had addressed the flag with the resident.

Box.

Definitions of the Variables

Flag
  1. Indication of below-average competence on a summative assessment at the end of a rotation or on an overall SPR, or

  2. Any summative assessment that indicated a demonstration of unprofessional behavior.

Flag Addressed
  1. Comments on the SPR suggesting that the flag was discussed with the resident, and/or

  2. One of the following items was marked on the SPR:

    • Requires focused attention, or

    • Program attention required (on SPRs completed after 2010), or

    • Required remediation (on SPRs completed before 2010), and/or

  3. Comments on One45 (an online learning management system for summative assessment forms) from a program representative saying that the flag was addressed with the resident; and/or

  4. The resident’s file contained a copy of an email or a note about a telephone call indicating that the flag was discussed with the resident; and/or

  5. There was a formal assessment plan review to address the flags; and/or

  6. There was evidence of remediation related to the flags (eg, a remediation contract).

Resident in Difficulty
  1. A resident with more than 3 flags regardless of number of rotations flagged;

  2. A resident with flags indicated on final assessments from 2 or more rotations;

  3. A resident with flags indicated on final assessments from 3 or more rotations.

Abbreviation: SPR, Summative Progress Report.
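
To make these definitions concrete, the decision rules in the Box can be written as simple predicates over a resident's flagged assessments. The following sketch (in Python) is purely illustrative: the record structure, identifiers, and example values are assumptions for exposition, not the layout of the actual study database.

def classify_resident(flag_records):
    """Apply the 3 resident-in-difficulty definitions from the Box.

    flag_records: list of (rotation_id, competency_domain) pairs, one entry per
    flag received on a summative assessment (assumed structure, for illustration).
    """
    n_flags = len(flag_records)                                  # total flags, any rotation
    n_flagged_rotations = len({rot for rot, _ in flag_records})  # distinct flagged rotations
    return {
        "definition_1": n_flags > 3,               # more than 3 flags, regardless of rotations
        "definition_2": n_flagged_rotations >= 2,  # flags on final assessments from >=2 rotations
        "definition_3": n_flagged_rotations >= 3,  # flags on final assessments from >=3 rotations
    }

# Hypothetical example: 2 flags on rotation A and 1 flag on rotation C.
flags = [("rot_A", "professionalism"),
         ("rot_A", "clinical reasoning"),
         ("rot_C", "communication skills")]
print(classify_resident(flags))
# -> {'definition_1': False, 'definition_2': True, 'definition_3': False}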

Statistical Analysis

Descriptive statistics were calculated. Residents were grouped by cohort start year into either pre-CBAS condition or post-CBAS condition. Pre-CBAS included the cohorts who started residency in 2006, 2007, and 2008; post-CBAS included the cohorts who started in 2010, 2011, 2012, 2013, and 2014.

To compare the frequency of occurrence of flags between the pre-CBAS and post-CBAS conditions, residents were categorized into either yes (≥1 flag) or no (0 flags). Total numbers of flags on assessments were tabulated for each resident. Because residents could receive multiple flags on assessments from a single rotation, the numbers of distinct rotations during which a resident received a flag on at least 1 competency domain on summative rotation assessments were also tabulated.
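
As a sketch of this tabulation step (illustrative only; the column names and records below are assumptions, not the study data), per-resident flag totals and distinct flagged rotations can be derived from flag-level records as follows:

import pandas as pd

# Hypothetical flag-level records: one row per flag on a summative assessment.
flags = pd.DataFrame({
    "resident_id": ["R01", "R01", "R02", "R03", "R03", "R03"],
    "rotation_id": ["A",   "A",   "B",   "A",   "C",   "D"],
})

per_resident = flags.groupby("resident_id").agg(
    total_flags=("rotation_id", "size"),            # all flags, any rotation
    distinct_rotations=("rotation_id", "nunique"),  # rotations with >=1 flag
)

# Residents with 0 flags do not appear in flag-level records; they would be
# merged back in from the full resident roster before the yes/no categorization.
print(per_resident)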

Because of the categorical nature of the variables being examined, the χ2 test was used to examine differences in flags between the pre-CBAS and post-CBAS cohorts. For all tests, a significance level of a 1-tailed P < .05 was set and 95% CIs were calculated. The measures tested were (1) total flags received by residents, (2) specific numbers of flags received by individual residents, and (3) numbers of rotations in which residents received flags. The latter analysis was performed because it is possible for a resident to receive multiple flags on 1 rotation assessment, and this analysis was intended to examine whether there were residents who were identified as having deficiencies across multiple rotations (ie, receiving flags across assessments from multiple separate rotations).
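
As a worked example of the yes/no comparison (with hypothetical counts, not the study data), categorizing each condition's residents into ≥1 flag vs 0 flags yields a 2 × 2 contingency table to which the χ2 test can be applied:

import numpy as np
from scipy.stats import chi2_contingency

# Rows: pre-CBAS, post-CBAS; columns: >=1 flag, 0 flags (hypothetical counts).
table = np.array([
    [45, 55],
    [25, 75],
])

# chi2_contingency applies the Yates continuity correction by default for 2 x 2 tables.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 (df = {dof}) = {chi2:.2f}, P = {p:.4f}")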

The χ2 test was also used to compare differences in the frequency of residents in difficulty before and after CBAS. Three definitions of resident in difficulty were used, with each definition applying more strict criteria to reflect variation in the literature around definitions of resident in difficulty (Box).32-34 The 3 different definitions were used to ensure that the variations seen were reflective of changes in the numbers of residents in difficulty regardless of how stringent the defining criteria were.

The 95% CIs for differences between proportions were calculated using the Pearson χ2 formula.35 SPSS, version 25.0 (IBM)36 was used to perform all analyses.
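
Reference 35 (Newcombe) compares methods for constructing a CI around the difference between 2 independent proportions. A minimal sketch of the widely used hybrid Wilson-score method from that paper is shown below; it is a generic illustration, not necessarily the exact procedure applied in SPSS for this study, and the example inputs are hypothetical.

from math import sqrt
from scipy.stats import norm

def wilson_ci(successes, n, alpha=0.05):
    """Wilson score interval for a single proportion."""
    z = norm.ppf(1 - alpha / 2)
    p = successes / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
    return centre - half, centre + half

def newcombe_diff_ci(x1, n1, x2, n2, alpha=0.05):
    """Hybrid Wilson-score CI for p1 - p2 (method 10 in Newcombe, 1998)."""
    p1, p2 = x1 / n1, x2 / n2
    l1, u1 = wilson_ci(x1, n1, alpha)
    l2, u2 = wilson_ci(x2, n2, alpha)
    diff = p1 - p2
    lower = diff - sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = diff + sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return diff, lower, upper

# Hypothetical example: 45 of 100 residents flagged pre-CBAS vs 25 of 100 post-CBAS.
print(newcombe_diff_ci(45, 100, 25, 100))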

Results

Files of 458 family medicine residents were analyzed (100% of all urban program residents in the cohort years included in this study; pre-CBAS: n = 163; 81 [49.7%] women; 90 [55.2%] aged >30 years; 105 [64.4%] Canadian medical graduates; post-CBAS: n = 295; 144 [48.8%] women; 128 [43.4%] aged >30 years; 243 [82.4%] Canadian medical graduates). The basic demographic characteristics of the residents (Table 1) were similar between the pre-CBAS and post-CBAS cohorts with the exception of the proportion of international medical graduates. The pre-CBAS cohort included a higher proportion of international medical graduates (57 [35.0%]) than did the post-CBAS cohorts (52 [17.6%]). Because of this difference in proportions of international medical graduates, all analyses were performed twice, once with the full data set and once with a data set in which the international medical graduates had been removed. This separate analysis was done to determine whether international medical graduates may be disproportionately represented among residents with flags, which would skew the data. In both sets of analyses, all findings were significant (ie, international medical graduates were not skewing the data). A further check was done using logistic regression to determine whether international medical graduate status was associated with receiving a flagged assessment, but the association was not significant. Given this result, we present all findings herein with the full data set.
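
A minimal sketch of the logistic regression check described above, assuming a hypothetical resident-level data set and column names (img, flagged); this is illustrative only and not the study's data or code.

import pandas as pd
import statsmodels.api as sm

# Hypothetical resident-level data: img = 1 for international medical graduates,
# flagged = 1 if the resident received at least 1 flagged assessment.
df = pd.DataFrame({
    "img":     [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1],
    "flagged": [1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0],
})

# Logistic regression of flagged status on IMG status (with an intercept).
model = sm.Logit(df["flagged"], sm.add_constant(df["img"])).fit(disp=False)
print(model.summary())
# A nonsignificant coefficient on img would indicate that IMG status is not
# associated with receiving a flagged assessment, consistent with the result reported.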

Differences in the percentage of residents receiving flags on summative assessments are presented in Table 2. Before CBAS implementation, 44.9% (n = 22) to 50.8% (n = 30) of residents in each cohort received at least 1 flag during training; after CBAS implementation, between 16.1% (n = 10) and 27.0% (n = 27) of residents received at least 1 flagged assessment. In addition, there was a significant decrease in the aggregate proportion of residents who received multiple flags on summative assessments after CBAS (0.38; 95% CI, 0.377-0.383), including an observed difference in proportion of residents who were flagged in 5 or more areas. Prior to CBAS implementation, between 16.3% (n = 8) and 27.1% (n = 16) of residents received 5 or more flags, compared with between 0% and 10.7% (n = 6) in the post-CBAS cohorts (n = 458; χ²₂₅ = 64.5; P < .001). A significant decrease after CBAS was also found in the aggregate numbers of distinct rotations during which residents received flags on summative assessments (0.24; 95% CI, 0.237-0.243), most notably for the proportion of residents who received flags on assessments from 4 or more distinct rotations. Before CBAS, this occurred for between 4.1% (n = 2) and 7.3% (n = 4) of residents in each cohort, but in the post-CBAS cohorts, only 1 resident across all post-CBAS cohorts received flagged assessments from 4 or more distinct rotations (n = 458; χ²₇ = 47.04; P < .001).

Table 3 reports changes in the proportions of residents within each cohort who met the criteria for designation as a resident in difficulty, according to the definitions given. Regardless of definition, a significant decrease in the aggregate proportion of residents designated to be in difficulty was found after CBAS, with the magnitude of the change increasing with the strictness of the definition (definition 1: 0.17; 95% CI, 0.168-0.172; definition 2: 0.17; 95% CI, 0.168-0.172; definition 3: 0.13; 95% CI, 0.128-0.132) (Table 3).

We also analyzed changes between pre-CBAS training and post-CBAS training in the frequency of evidence of documentation that a flag on an assessment had been addressed with the resident (Figure). Results indicated that, for residents who had 1 or more flags on assessments, there was a significant increase in documentation that the flag was discussed with the resident between the pre-CBAS and post-CBAS aggregate conditions (0.18; 95% CI, 0.178-0.183). Before CBAS implementation, between 56.7% (n = 17) and 63.6% (n = 14) of files of residents who had received at least 1 flag included documentation that flag(s) had been addressed with the resident. After CBAS implementation, 62.5% (n = 10) to 100% (n = 17) of files included this documentation (n = 155; χ²₂ = 16.83; P < .001).

Discussion

These findings begin to answer some of the questions raised in the literature about justification for the shift to CBME,8-14 specifically, the need for evidence that CBME is an improvement over traditional medical education. Compared with the traditional assessment approach used in our program before the switch to CBME, competency-based assessment was associated with better identification of residents who encountered difficulties in training and improvement in how concerns about resident competence were addressed.

Since implementation of CBAS, there has been a significant decrease in the proportion of residents receiving at least 1 flag on a summative assessment. Before CBAS, approximately half of the residents in each cohort were flagged at least once during training; after CBAS, fewer than one-third of residents in each cohort were flagged at least once. These findings suggest that ongoing problems continue to be identified on summative assessments, a requirement for any effective assessment system.

There were large decreases in the proportion of residents who were receiving multiple flags. Before CBAS, between 16.3% (n = 8) and 23.6% (n = 13) of residents in each cohort received 5 or more flags on summative assessments. Proportions decreased after CBAS: there were 2 cohorts early in implementation in which approximately 11% of the residents received 5 or more flags (2010-2012 [11.3%] and 2011-2013 [10.7%]), and less than 2% of residents received 5 or more flags in the final 2 cohorts examined (Table 2).

A potential association was found between the decrease in flags on multiple discrete rotations and the reduction in the proportions of residents who met criteria for resident in difficulty. Although the proportion of residents who received a flagged assessment from 1 rotation remained stable across the study period, the proportion of residents who received a flag on assessments from more than 2 rotations decreased to approximately 0% with the exception of 1 resident in the 2012-2014 cohort. In keeping with these findings of a reduction in multiple flagged rotations, there was also a decrease in the number of residents in difficulty regardless of the definition used. Before CBAS, between 14% and 30% of residents in any given cohort could be classified as residents in difficulty (Table 3). After CBAS, a steady decrease in residents in difficulty was seen except in 1 outlier cohort (2012-2014). Even for the outlier cohort, the proportion of residents in difficulty was 6% to 12% lower than in the pre-CBAS cohorts. These findings suggest that the CBAS approach to assessment is associated with better identification of residents who were struggling in 1 or more areas and that those residents were supported so that their deficiencies in competence were not observable on later rotations.

The likelihood that the CBAS approach to assessment is associated with better support of residents who are flagged for deficiencies in competence is further supported by the finding of an increase in documentation showing that flags on summative assessments were discussed with the resident. Identified difficulties should be discussed with learners, but such coaching needs to be facilitated. Before CBAS, 35% to 40% of the residents who received 1 or more flags had no evidence in their files that the flag had been addressed or discussed with them (Figure). After CBAS, 3 of the 5 cohorts examined were found to have documentation that flags on assessments had been discussed with the resident for 88% to 100% of flagged residents. For 2 cohorts (2012-2014 and 2013-2015), the percentage of files with such documentation was lower than in the other post-CBAS cohorts but still higher than before CBAS.

Overall, this study suggests that a competency-based assessment framework such as CBAS is associated with better identification of residents who have competence gaps. Furthermore, CBAS appears to be associated with better support for residents to address and ameliorate identified gaps. Although the previous assessment approach in this residency program had processes in place that were intended to identify when residents were struggling, the system was ineffective, perhaps because summative assessments were disconnected from daily observations. This failure to identify struggling residents is not unique to this one residency program; rather, this problem has been identified across multiple assessment approaches in medical education and is one of the key justifications for moving to CBME.1,15-17

It would be possible to dismiss these findings as merely the result of improved assessment processes. However, assessment in a CBME culture must be different,17,21,37-40 and the CBAS approach is fundamentally different from the previous approach to assessment in the residency program examined. In contrast to assessment that focused on capturing end-of-rotation judgments, the CBAS tools, forms, and processes capture evidence of progress toward competence across clinical experiences, including a representative sampling of the formative feedback shared by the clinical coaches who work with the resident. These low-stakes assessments may reflect and foster learning.

Summative assessments of progress toward competence occur regularly. High-stakes in-training evaluation reports are completed at the end of every rotation. High-stakes periodic progress reviews occur every 4 months (previously every 6 months). The difference after CBAS is that the periodic progress review is now a shared process in which resident self-reflections on progress toward competence are documented and then discussed between the faculty advisor (competence coach) and the resident, with the low-stakes assessments collected in CBAS used as the evidence base for guided self-assessment.41

The transparent nature of assessment in the CBAS framework, as well as the regular provision of formative feedback, has created a culture in which residents in difficulty can be identified early. Two factors contribute to this culture: the proliferation of documented evidence of progress toward competence (which can identify both strengths and gaps) and the regular discussion of the resident’s learning. Addressing a gap, such as a flag, is less stigmatizing in a culture in which supporting residents to be the best physicians that they can be is the focus of assessment. The process of flagging a resident on a summative assessment has not changed: before and after CBAS, a flag means that there are 1 or more topics on which a resident has not demonstrated competence. The difference is that concerns about competence are often discussed with the resident throughout a clinical experience, which means that in many cases, deficiencies are remedied before the final summative assessment at the end of the rotation.

The findings from this study build on the emerging evidence that supports the transition to CBME. United States internal medicine residency programs are beginning to publish data about implementation of milestones, which suggests that more information is being collected about the competence of residents in those programs,42,43 including increased identification of residents with areas of deficiency.44 A competency-based orthopedic residency program at the University of Toronto, Toronto, Ontario, Canada, has been successful in tailoring residency training so that gaps in competencies can be addressed through individualized educational plans and so that accelerated demonstration of competence can reduce the time needed to complete training.45,46 Evidence is emerging from other pilot programs across North America,22,47-49 but assessing outcomes takes time.

Limitations

The study has limitations. Although several hundred resident files across several years were reviewed, the project included residents from only one program and results are reported in aggregate. In addition, although we looked specifically at cohorts before and after implementation of CBAS, other factors may have contributed to the outcomes observed. During the period studied, changes were made in the selection process for the family medicine program, paired with a steady increase in interest in the specialty of family medicine. It is possible that increased competition for family medicine residency positions resulted in admission of higher-achieving candidates. Whether our results are a product of CBAS or of higher-caliber residents would be difficult to assess objectively. One area for future research would be to examine rates of residents encountering difficulty in family medicine programs across Canada and compare those rates with the rates in our program.

There are other future research areas that are relevant to examining the outcome of competency-based assessment, but they are beyond the scope of this study. These include looking at whether the specific competencies that are flagged differ before vs after implementation of CBAS and whether there are clear individual differences among faculty members in which competencies they flag before vs after CBAS.

Conclusions

The findings from this multiyear comparison of implementation of competency-based assessment and traditional assessment support a proof of concept for CBME. Shifting the focus of assessment toward direct observation, increased documentation, and assessment for learning may be associated with improved identification of learners who are deficient in 1 or more competencies and with improvements in how those deficiencies are addressed.

Article Information

Accepted for Publication: September 20, 2018.

Published: November 9, 2018. doi:10.1001/jamanetworkopen.2018.4581

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2018 Ross S et al. JAMA Network Open.

Corresponding Author: Shelley Ross, PhD, Department of Family Medicine, University of Alberta, 6-10 University Terrace, Edmonton T6G 2T4, Alberta, Canada (sross@ualberta.ca).

Author Contributions: Dr Ross had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Ross, Binczyk, Schipper, Humphries, Nichols, Donoff.

Acquisition, analysis, or interpretation of data: Ross, Binczyk, Hamza, Schipper, Humphries.

Drafting of the manuscript: Ross, Binczyk, Hamza.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Ross, Binczyk, Hamza.

Obtained funding: Ross, Binczyk.

Administrative, technical, or material support: Schipper, Humphries.

Supervision: Ross, Schipper, Humphries.

Conflict of Interest Disclosures: Drs Ross, Humphries, and Donoff hold a copyright of the intellectual property for the Competency-Based Achievement System but receive no financial gain. No other disclosures were reported.

Funding/Support: Initial data extraction and coding for this project was funded through the Health Professions Education Research Summer Studentship, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Alberta, Canada (Ms Binczyk).

Role of the Funder/Sponsor: The funding organization had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Additional Contributions: Todd Milford, PhD (University of Victoria), assisted in reviewing our statistical analyses, and Mirella Chiodo (University of Alberta) assisted with retrieving data. There was no financial compensation.

References
1.
Frank  JR, Snell  L, Englander  R, Holmboe  ES; ICBME Collaborators.  Implementing competency-based medical education: moving forward.  Med Teach. 2017;39(6):568-573. doi:10.1080/0142159X.2017.1315069PubMedGoogle ScholarCrossref
2.
Batalden  P, Leach  D, Swing  S, Dreyfus  H, Dreyfus  S.  General competencies and accreditation in graduate medical education.  Health Aff (Millwood). 2002;21(5):103-111. doi:10.1377/hlthaff.21.5.103PubMedGoogle ScholarCrossref
3.
Tannenbaum  D, Kerr  J, Konkin  J,  et al.  Triple C Competency-Based Curriculum: Report of the Working Group on Postgraduate Curriculum Review—Part 1. College of Family Physicians of Canada. http://www.cfpc.ca/uploadedfiles/education/_pdfs/wgcr_triplec_report_english_final_18mar11.pdf. Published March 2011. Accessed January 3, 2018.
4.
Frank  JR, Danoff  D.  The CanMEDS initiative: implementing an outcomes-based framework of physician competencies.  Med Teach. 2007;29(7):642-647. doi:10.1080/01421590701746983PubMedGoogle ScholarCrossref
5.
Simpson  JG, Furnace  J, Crosby  J,  et al.  The Scottish doctor—learning outcomes for the medical undergraduate in Scotland: a foundation for competent and reflective practitioners.  Med Teach. 2002;24(2):136-143. doi:10.1080/01421590220120713PubMedGoogle ScholarCrossref
6.
Scheele  F, Teunissen  P, Van Luijk  S,  et al.  Introducing competency-based postgraduate medical education in the Netherlands.  Med Teach. 2008;30(3):248-253. doi:10.1080/01421590801993022PubMedGoogle ScholarCrossref
7.
Australian Curriculum Revision Working Group. Australian curriculum framework for junior doctors. http://curriculum.cpmec.org.au/. Accessed December 22, 2017.
8.
Whitcomb  ME.  Transforming medical education: is competency-based medical education the right approach?  Acad Med. 2016;91(5):618-620. doi:10.1097/ACM.0000000000001049PubMedGoogle ScholarCrossref
9.
Klamen  DL, Williams  RG, Roberts  N, Cianciolo  AT.  Competencies, milestones, and EPAs—are those who ignore the past condemned to repeat it?  Med Teach. 2016;38(9):904-910. doi:10.3109/0142159X.2015.1132831PubMedGoogle ScholarCrossref
10.
Boyd  VA, Whitehead  CR, Thille  P, Ginsburg  S, Brydges  R, Kuper  A.  Competency-based medical education: the discourse of infallibility.  Med Educ. 2018;52(1):45-57. doi:10.1111/medu.13467PubMedGoogle ScholarCrossref
11.
Whitehead  CR, Kuper  A.  Faith-based medical education.  Adv Health Sci Educ Theory Pract. 2017;22(1):1-3. doi:10.1007/s10459-016-9748-8PubMedGoogle ScholarCrossref
12.
Talbot  M.  Monkey see, monkey do: a critique of the competency model in graduate medical education.  Med Educ. 2004;38(6):587-592. doi:10.1046/j.1365-2923.2004.01794.xPubMedGoogle ScholarCrossref
13.
Brooks  MA.  Medical education and the tyranny of competency.  Perspect Biol Med. 2009;52(1):90-102. doi:10.1353/pbm.0.0068PubMedGoogle ScholarCrossref
14.
Glass  JM.  Competency based training is a framework for incompetence.  BMJ. 2014;348:g2909. PubMedGoogle ScholarCrossref
15.
Holmboe  ES, Sherbino  J, Englander  R, Snell  L, Frank  JR; ICBME Collaborators.  A call to action: the controversy of and rationale for competency-based medical education.  Med Teach. 2017;39(6):574-581. doi:10.1080/0142159X.2017.1315067PubMedGoogle ScholarCrossref
16.
Carraccio  CL, Englander  R.  From Flexner to competencies: reflections on a decade and the journey ahead.  Acad Med. 2013;88(8):1067-1073. doi:10.1097/ACM.0b013e318299396fPubMedGoogle ScholarCrossref
17.
Carraccio  C, Wolfsthal  SD, Englander  R, Ferentz  K, Martin  C.  Shifting paradigms: from Flexner to competencies.  Acad Med. 2002;77(5):361-367. doi:10.1097/00001888-200205000-00003PubMedGoogle ScholarCrossref
18.
Frank  JR, Snell  LS, Cate  OT,  et al.  Competency-based medical education: theory to practice.  Med Teach. 2010;32(8):638-645. doi:10.3109/0142159X.2010.501190PubMedGoogle ScholarCrossref
19.
Levine  MF, Shorten  G.  Competency-based medical education: its time has arrived.  Can J Anaesth. 2016;63(7):802-806. doi:10.1007/s12630-016-0638-6PubMedGoogle ScholarCrossref
20.
Schuwirth  LW, Van der Vleuten  CP.  Programmatic assessment: from assessment of learning to assessment for learning.  Med Teach. 2011;33(6):478-485. doi:10.3109/0142159X.2011.565828PubMedGoogle ScholarCrossref
21.
Schuwirth  L, Ash  J.  Assessing tomorrow’s learners: in competency-based education only a radically different holistic method of assessment will work: six things we could forget.  Med Teach. 2013;35(7):555-559. doi:10.3109/0142159X.2013.787140PubMedGoogle ScholarCrossref
22.
Schultz  K, Griffiths  J.  Implementing competency-based medical education in a postgraduate family medicine residency training program: a stepwise approach, facilitating factors, and processes or steps that would have been helpful.  Acad Med. 2016;91(5):685-689. doi:10.1097/ACM.0000000000001066PubMedGoogle ScholarCrossref
23.
Gruppen  L, Frank  JR, Lockyer  J,  et al; ICBME Collaborators.  Toward a research agenda for competency-based medical education.  Med Teach. 2017;39(6):623-630. doi:10.1080/0142159X.2017.1315065PubMedGoogle ScholarCrossref
24.
Ross  S, Poth  CN, Donoff  M,  et al.  Competency-based achievement system: using formative feedback to teach and assess family medicine residents’ skills.  Can Fam Physician. 2011;57(9):e323-e330.PubMedGoogle Scholar
25.
Donoff  MG.  Field notes: assisting achievement and documenting competence.  Can Fam Physician. 2009;55(12):1260-1262, e100-e102.PubMedGoogle Scholar
26.
Bok  HG, Teunissen  PW, Favier  RP,  et al.  Programmatic assessment of competency-based workplace learning: when theory meets practice.  BMC Med Educ. 2013;13(1):123. doi:10.1186/1472-6920-13-123PubMedGoogle ScholarCrossref
27.
Dijkstra  J, Galbraith  R, Hodges  BD,  et al.  Expert validation of fit-for-purpose guidelines for designing programmes of assessment.  BMC Med Educ. 2012;12(1):20. doi:10.1186/1472-6920-12-20PubMedGoogle ScholarCrossref
28.
van der Vleuten  CP, Schuwirth  LW, Driessen  EW,  et al.  A model for programmatic assessment fit for purpose.  Med Teach. 2012;34(3):205-214. doi:10.3109/0142159X.2012.652239PubMedGoogle ScholarCrossref
29.
Black  P, Wiliam  D.  Assessment and classroom learning.  Assess Educ. 1998;5(1):7-74. doi:10.1080/0969595980050102Google ScholarCrossref
30.
Black  P, Wiliam  D.  Classroom assessment and pedagogy [published online March 22, 2018].  Assess Educ. doi:10.1080/0969594X.2018.1441807Google Scholar
31.
von Elm  E, Altman  DG, Egger  M, Pocock  SJ, Gøtzsche  PC, Vandenbroucke  JP; STROBE Initiative.  Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies.  BMJ. 2007;335(7624):806-808. doi:10.1136/bmj.39335.541782.ADPubMedGoogle ScholarCrossref
32.
Smith  CS, Stevens  NG, Servis  M.  A general framework for approaching residents in difficulty.  Fam Med. 2007;39(5):331-336.PubMedGoogle Scholar
33.
Yao  DC, Wright  SM.  The challenge of problem residents.  J Gen Intern Med. 2001;16(7):486-492. doi:10.1046/j.1525-1497.2001.016007486.xPubMedGoogle ScholarCrossref
34.
Steinert  Y.  The “problem” learner: whose problem is it? AMEE guide No. 76.  Med Teach. 2013;35(4):e1035-e1045. doi:10.3109/0142159X.2013.774082PubMedGoogle ScholarCrossref
35.
Newcombe  RG.  Interval estimation for the difference between independent proportions: comparison of eleven methods.  Stat Med. 1998;17(8):873-890. doi:10.1002/(SICI)1097-0258(19980430)17:8<873::AID-SIM779>3.0.CO;2-IPubMedGoogle ScholarCrossref
36.
IBM SPSS Statistics for Windows, Version 25.0. Armonk, NY: IBM Corp; 2017.
37.
Gruppen  LD, Ten Cate  O, Lingard  LA, Teunissen  PW, Kogan  JR.  Enhanced requirements for assessment in a competency-based, time-variable medical education system.  Acad Med. 2018;93(3S Competency-Based, Time-Variable Education in the Health Professions):S17-S21. doi:10.1097/ACM.0000000000002066PubMedGoogle ScholarCrossref
38.
Harris  P, Bhanji  F, Topps  M,  et al; ICBME Collaborators.  Evolving concepts of assessment in a competency-based world.  Med Teach. 2017;39(6):603-608. doi:10.1080/0142159X.2017.1315071PubMedGoogle ScholarCrossref
39.
Eva  KW, Bordage  G, Campbell  C,  et al.  Towards a program of assessment for health professionals: from training into practice.  Adv Health Sci Educ Theory Pract. 2016;21(4):897-913. doi:10.1007/s10459-015-9653-6PubMedGoogle ScholarCrossref
40.
Lockyer  J, Carraccio  C, Chan  MK,  et al; ICBME Collaborators.  Core principles of assessment in competency-based medical education.  Med Teach. 2017;39(6):609-616. doi:10.1080/0142159X.2017.1315082PubMedGoogle ScholarCrossref
41.
Sargeant  J, Armson  H, Chesluk  B,  et al.  The processes and dimensions of informed self-assessment: a conceptual model.  Acad Med. 2010;85(7):1212-1220. doi:10.1097/ACM.0b013e3181d85a4ePubMedGoogle ScholarCrossref
42.
Hauer  KE, Clauser  J, Lipner  RS,  et al.  The internal medicine reporting milestones: cross-sectional description of initial implementation in US residency programs.  Ann Intern Med. 2016;165(5):356-362. doi:10.7326/M15-2411PubMedGoogle ScholarCrossref
43.
Hauer  KE, Vandergrift  J, Hess  B,  et al.  Correlations between ratings on the resident annual evaluation summary and the internal medicine milestones and association with ABIM certification examination scores among US internal medicine residents, 2013-2014.  JAMA. 2016;316(21):2253-2262. doi:10.1001/jama.2016.17357PubMedGoogle ScholarCrossref
44.
Kinnear  B, Bensman  R, Held  J, O’Toole  J, Schauer  D, Warm  E.  Critical deficiency ratings in milestone assessment: a review and case study.  Acad Med. 2017;92(6):820-826. doi:10.1097/ACM.0000000000001383PubMedGoogle ScholarCrossref
45.
Ferguson  PC, Kraemer  W, Nousiainen  M,  et al.  Three-year experience with an innovative, modular competency-based curriculum for orthopaedic training.  J Bone Joint Surg Am. 2013;95(21):e166. doi:10.2106/JBJS.M.00314PubMedGoogle ScholarCrossref
46.
Nousiainen  MT, McQueen  SA, Hall  J,  et al.  Resident education in orthopaedic trauma: the future role of competency-based medical education.  Bone Joint J. 2016;98-B(10):1320-1325. doi:10.1302/0301-620X.98B10.37031PubMedGoogle ScholarCrossref
47.
Zeller  MP, Sherbino  J, Whitman  L, Skeate  R, Arnold  DM.  Design and implementation of a competency-based transfusion medicine training program in Canada.  Transfus Med Rev. 2016;30(1):30-36. doi:10.1016/j.tmrv.2015.11.001PubMedGoogle ScholarCrossref
48.
Powell  DE, Carraccio  C.  Toward competency-based medical education.  N Engl J Med. 2018;378(1):3-5. doi:10.1056/NEJMp1712900PubMedGoogle ScholarCrossref
49.
Chan  T, Sherbino  J; McMAP Collaborators.  The McMaster Modular Assessment Program (McMAP): a theoretically grounded work-based assessment system for an emergency medicine residency program.  Acad Med. 2015;90(7):900-905. doi:10.1097/ACM.0000000000000707PubMedGoogle ScholarCrossref