Author Affiliations: Knowledge Translation Program of the Li Ka Shing Knowledge Institute at St Michael's Hospital (Dr Davis and Mr Thorpe), Departments of Health Policy, Management, and Evaluation (Dr Davis), Family and Community Medicine (Dr Davis), and Public Health Sciences (Mr Thorpe), and the Office of Continuing Education and Professional Development (Ms Perrier), University of Toronto, Toronto, Ontario; Departments of Family Medicine and Epidemiology and Community Health, School of Medicine, Virginia Commonwealth University, Richmond (Dr Mazmanian); Center for Collaborative and Interactive Technologies, Baylor College of Medicine, Houston, Tex (Dr Fordis); and Department of Medical Education, University of Michigan, Ann Arbor (Dr Harrison).
Context Core physician activities of lifelong learning, continuing medical education credit, relicensure, specialty recertification, and clinical competence depend on physicians' ability to assess their own learning needs and to choose educational activities that meet those needs.
Objective To determine how accurately physicians self-assess compared with external observations of their competence.
Data Sources The electronic databases MEDLINE (1966-July 2006), EMBASE (1980-July 2006), CINAHL (1982-July 2006), PsycINFO (1967-July 2006), the Research and Development Resource Base in CME (1978-July 2006), and proprietary search engines were searched using terms related to self-directed learning, self-assessment, and self-reflection.
Study Selection Studies were included if they compared physicians' self-rated assessments with external observations, used quantifiable and replicable measures, included a study population of at least 50% practicing physicians, residents, or similar health professionals, and were conducted in the United Kingdom, Canada, United States, Australia, or New Zealand. Studies were excluded if they were comparisons of self-reports, studies of medical students, assessed physician beliefs about patient status, described the development of self-assessment measures, or were self-assessment programs of specialty societies. Studies conducted in the context of an educational or quality improvement intervention were included only if comparative data were obtained before the intervention.
Data Extraction Study population, content area and self-assessment domain of the study, methods used to measure the self-assessment of study participants and those used to measure their competence or performance, existence and use of statistical tests, study outcomes, and explanatory comparative data were extracted.
Data Synthesis The search yielded 725 articles, of which 17 met all inclusion criteria. The studies included a wide range of domains, comparisons, measures, and methodological rigor. Of the 20 comparisons between self- and external assessment, 13 demonstrated little, no, or an inverse relationship and 7 demonstrated positive associations. A number of studies found the worst accuracy in self-assessment among physicians who were the least skilled and those who were the most confident. These results are consistent with those found in other professions.
Conclusions While suboptimal in quality, the preponderance of evidence suggests that physicians have a limited ability to accurately self-assess. The processes currently used to undertake professional development and evaluate competence may need to focus more on external assessment.
Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of Physician Self-assessment Compared With Observed Measures of Competence: A Systematic Review. JAMA. 2006;296(9):1094–1102. doi:10.1001/jama.296.9.1094