The study by Schumacher et al1 reports on the developmental growth curves of nearly 2000 residents from 23 pediatric residency programs who received 25 503 supervision level reports for the American Board of Pediatrics’ 17 entrustable professional activities (EPAs) for general pediatrics. This study, conducted by the Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network General Pediatrics Entrustable Professional Activities Study Group,1 found that after 36 months of training, the percentage of residency program graduates rated ready for unsupervised clinical practice varied according to EPA from 53% to 98%, and that a performance standard of 90% of trainees achieving unsupervised practice level would be met for only 8 of 17 EPAs (47%). These findings suggest that gaps exist between observed readiness for independent clinical practice and the standards that the American Board of Pediatrics determined were needed to produce physicians able to meet health needs of the patient populations they serve. Importantly, the study by Schumacher et al1 provides empirical evidence for 2 things: (1) curricular shortfalls in residency training requirements that can now be addressed and (2) potential inadequacies in the EPAs chosen for competency assessment in general pediatrics.
As context to the study by Schumacher et al,1 in 2005, ten Cate2 first published the concept of EPAs as an approach that supervising physicians could use to decide when a trainee could be trusted to independently undertake the responsibilities needed to perform physicians’ professional activities. During the ensuing 15 years, the approach has been discussed around the world, with early efforts focused on reaching agreement on relevant concepts and on implementation.3,4 This period was followed by a focus on assessments conducted with supervisory scales developed to advance the operational aspects of EPAs for competency-based assessments in undergraduate5 and graduate6 medical education. Additional work focused on aligning faculty assessments of medical residents’ rotation-specific observed activities with semiannual Accreditation Council for Graduate Medical Education reports based on the internal medicine milestones.7
Schumacher et al1 present a rigorously conducted multisite study performed in general pediatric clinics, and their findings suggest that more work may be needed to ensure that graduates of pediatric residency training can be trusted for independent clinical practice. For decades, medical educators have assumed US physician training programs were not just adequate but optimal. Schools and programs did not share data or were hesitant to pool them, and the primary evaluative focus for schools and programs was on attaining Liaison Committee on Medical Education or Accreditation Council for Graduate Medical Education accreditation, neither of which was solidly based on evidence. The study by Schumacher et al1 illustrates the application of a new framework for competency-based education, one that includes many supervisory-level assessments based on direct observations and allows more sophisticated analyses that yield vitally important information.
This is a new era of assessment, one far more collaborative than in years past. In this study,1 23 pediatric residency programs were courageous enough to collect data on resident supervision, analyze those data within the context of how the study was conducted, and share the results with the world, even though the results were not as expected and some limitations remained, such as the lack of reported EPA assessment accuracy, which may be affected by observer variability. This is important because EPAs, competencies, and milestones are only as strong as the assessment information that informs the judgment.
Twenty years ago, a mentor told me that there is no perfect study. Yet investigators must start somewhere, and identifying issues with emerging research can itself help advance future science. Robust approaches to educational research have long been needed across the training continuum, and the study by Schumacher et al1 is a good example of one that helps inform the journey of competency-based assessments.
The general pediatrics community has made an important first step, and the next steps will be critical. Wrangling the conundrum of changing curricular requirements or changing the EPAs will certainly build character. However, this process can be undertaken only with a full understanding of the strength of the direct observations on which such findings are based, and it should include patient quality indicators associated with observed performance. Future work must develop an evidence base that accurately reflects readiness for independent clinical practice, which will likely require multiple assessment tools linked to curricular experiences, so that improvements in educational programs and application of EPAs can be fully realized. It would be amazing if all disciplines built longitudinal educational research networks, as the group behind the study by Schumacher et al1 has done, and then took the next step of adding resident performance data linked to the quality of assessor judgments and to patient health metrics, all of which could be pooled for rigorous analyses. Many disciplines could inform each other in meaningful ways to advance competency-based education, and such an approach would likely allow for future use of EPAs in higher-stakes assessments, such as board certification. There is much work to be done, and the study by Schumacher et al1 represents an important step in the journey of creating a new era of assessment.
Published: January 15, 2020. doi:10.1001/jamanetworkopen.2019.19583
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2020 Carney PA. JAMA Network Open.
Corresponding Author: Patricia A. Carney, PhD, MS, Department of Family Medicine, School of Medicine, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Mail Code: FM, Portland, OR 97239 (carneyp@ohsu.edu).
Conflict of Interest Disclosures: None reported.
References
1. Schumacher DJ, West DC, Schwartz A, et al; Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network General Pediatrics Entrustable Professional Activities Study Group. Longitudinal assessment of resident performance using entrustable professional activities. JAMA Netw Open. 2020;3(1):e1919316. doi:10.1001/jamanetworkopen.2019.19316
6. Mink RB, Schwartz A, Herman BE, et al; Steering Committee of the Subspecialty Pediatrics Investigator Network. Validity of level of supervision scales for assessing pediatric fellows on the common pediatric subspecialty entrustable professional activities. Acad Med. 2018;93(2):283-291. doi:10.1097/ACM.0000000000001820
7. Choe JH, Knight CL, Stiling R, Corning K, Lock K, Steinberg KP. Shortening the miles to the milestones: connecting EPA-based evaluations to ACGME milestone reports for internal medicine residency programs. Acad Med. 2016;91(7):943-950. doi:10.1097/ACM.0000000000001161