Core Competencies in Evidence-Based Practice for Health Professionals: Consensus Statement Based on a Systematic Review and Delphi Survey | JAMA Network Open
Figure. Flow Diagram of the Process of Developing the Set of Evidence-Based Practice (EBP) Core Competencies

Participants in the 2-round Delphi survey rated the relative importance of each competency as “omitted: is not a priority to be included in an EBP teaching program,” “mentioned: should be just mentioned in an EBP teaching program (ie, provide common knowledge of the competency),” “explained: should be briefly explained in an EBP teaching program (ie, provide understanding of the competency but without practical exercises),” or “practiced with exercises: should be practiced with exercises in an EBP teaching program (ie, provide a detailed understanding of the competency, enhanced with practical exercises).”

Table 1. Characteristics of Participants in Each Stage of Modified Delphi Survey
Table 2. Final Set of EBP Core Competencies Grouped Into the Main EBP Domains
References

1. Straus S, Glasziou P, Richardson WS, Haynes B. Evidence-Based Medicine: How to Practice and Teach It. London, UK: Churchill Livingstone; 2010.
2. Haynes RB, Devereaux PJ, Guyatt GH. Clinical expertise in the era of evidence-based medicine and patient choice. ACP J Club. 2002;136(2):A11-A14.
3. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376(9756):1923-1958.
4. Institute of Medicine (US) Committee on the Health Professions Education Summit. The core competencies needed for health care professionals. In: Greiner AC, Knebel E, eds. Health Professions Education: A Bridge to Quality. Washington, DC: National Academies Press; 2003:chap 3.
5. Accreditation Council for Graduate Medical Education website. Program and institutional guidelines. http://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements. Accessed April 20, 2018.
6. Institute of Medicine. Evidence-based medicine and the changing nature of healthcare: 2007 IOM annual meeting summary. Washington, DC: Institute of Medicine; 2008.
7. Sadeghi-Bazargani H, Tabrizi JS, Azami-Aghdash S. Barriers to evidence-based medicine: a systematic review. J Eval Clin Pract. 2014;20(6):793-802.
8. Zwolsman S, te Pas E, Hooft L, Wieringa-de Waard M, van Dijk N. Barriers to GPs' use of evidence-based medicine: a systematic review. Br J Gen Pract. 2012;62(600):e511-e521.
9. Glasziou P, Burls A, Gilbert R. Evidence based medicine and the medical curriculum. BMJ. 2008;337:a1253.
10. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002;288(9):1110-1112.
11. Moynihan S, Paakkari L, Välimaa R, Jourdan D, Mannix-McNamara P. Teacher competencies in health education: results of a Delphi study. PLoS One. 2015;10(12):e0143703.
12. Smith SR. AMEE guide No. 14: outcome-based education: part 2—planning, implementing and evaluating a competency-based curriculum. Med Teach. 1999;21(1):15-22.
13. Harden RM, Crosby JR, Davis MH, Friedman M. AMEE guide No. 14: outcome-based education: part 5—from competency to meta-competency: a model for the specification of learning outcomes. Med Teach. 1999;21(6):546-552.
14. Carraccio C, Englander R, Van Melle E, et al; International Competency-Based Medical Education Collaborators. Advancing competency-based medical education: a charter for clinician-educators. Acad Med. 2016;91(5):645-649.
15. Martin M, Vashisht B, Frezza E, et al. Competency-based instruction in critical invasive skills improves both resident performance and patient safety. Surgery. 1998;124(2):313-317.
16. Antonoff MB, Swanson JA, Green CA, Mann BD, Maddaus MA, D'Cunha J. The significant impact of a competency-based preparatory course for senior medical students entering surgical residency. Acad Med. 2012;87(3):308-319.
17. Long DM. Competency-based residency training: the next advance in graduate medical education. Acad Med. 2000;75(12):1178-1183.
18. Calhoun JG, Ramiah K, Weist EM, Shortell SM. Development of a core competency model for the master of public health degree. Am J Public Health. 2008;98(9):1598-1607.
19. Penciner R, Langhan T, Lee R, McEwen J, Woods RA, Bandiera G. Using a Delphi process to establish consensus on emergency medicine clerkship competencies. Med Teach. 2011;33(6):e333-e339.
20. Kilroy DA, Mooney JS. Determination of required pharmacological knowledge for clinical practice in emergency medicine using a modified Delphi technique. Emerg Med J. 2007;24(9):645-647.
21. Moser JM. Core academic competencies for master of public health students: one health department practitioner's perspective. Am J Public Health. 2008;98(9):1559-1561.
22. Lalloo D, Demou E, Kiran S, Cloeren M, Mendes R, Macdonald EB. International perspective on common core competencies for occupational physicians: a modified Delphi study. Occup Environ Med. 2016;73(7):452-458.
23. Albarqouni L, Glasziou P, Hoffmann T. Completeness of the reporting of evidence-based practice educational interventions: a review. Med Educ. 2018;52(2):161-170.
24. Dawes M, Summerskill W, Glasziou P, et al; Second International Conference of Evidence-Based Health Care Teachers and Developers. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1.
25. Austvoll-Dahlgren A, Oxman AD, Chalmers I, et al. Key concepts that people need to understand to assess claims about treatment effects. J Evid Based Med. 2015;8(3):112-125.
26. Chalmers I, Oxman AD, Austvoll-Dahlgren A, et al. Key concepts for informed health choices: a framework for helping people learn how to assess treatment claims and make informed choices. BMJ Evid Based Med. 2018;23(1):29-33.
27. Dunn WR, Hamilton DD, Harden RM. Techniques of identifying competencies needed of doctors. Med Teach. 1985;7(1):15-25.
28. Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74(9):979-983.
29. Humphrey-Murto S, Varpio L, Wood TJ, et al. The use of the Delphi and other consensus group methods in medical education research: a review. Acad Med. 2017;92(10):1491-1498.
30. Diamond IR, Grant RC, Feldman BM, et al. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol. 2014;67(4):401-409.
31. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9)(suppl):S63-S67.
32. Melnyk BM, Gallagher-Ford L, Long LE, Fineout-Overholt E. The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews Evid Based Nurs. 2014;11(1):5-15.
33. Guyatt GH, Meade MO, Jaeschke RZ, Cook DJ, Haynes RB. Practitioners of evidence based care: not all clinicians need to appraise evidence from scratch but all need some skills. BMJ. 2000;320(7240):954-955.
34. Straus SE, Green ML, Bell DS, et al; Society of General Internal Medicine Evidence-Based Medicine Task Force. Evaluating the teaching of evidence based medicine: conceptual framework. BMJ. 2004;329(7473):1029-1032.
35. Castle JC, Chalmers I, Atkinson P, et al. Establishing a library of resources to help people understand key concepts in assessing treatment claims—the "Critical thinking and Appraisal Resource Library" (CARL). PLoS One. 2017;12(7):e0178666.
    Consensus Statement
    Medical Education
    June 22, 2018


    Author Affiliations
    • 1Centre for Research in Evidence-Based Practice, Bond University, Robina, Queensland, Australia
    • 2Li Ka Shing Knowledge Institute, St. Michael’s Hospital, Toronto, Ontario, Canada
    • 3Department of Medicine, University of Toronto, Toronto, Ontario, Canada
    • 4Department of Health and Functioning, Faculty of Health and Social Sciences, Western Norway University of Applied Sciences, Bergen, Norway
    • 5Centre for Evidence-based Health Care, Division of Epidemiology and Biostatistics, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
    • 6Cochrane South Africa, South African Medical Research Council, Cape Town, South Africa
    • 7Medical Education Research and Quality Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, Victoria, Australia
    • 8Department of Veterans Affairs, University of Alabama at Birmingham
    • 9Department of General Internal Medicine, University of Alabama at Birmingham
    • 10Department of Health Research Methods, Evidence and Impact, McMaster University Faculty of Health Sciences, Hamilton, Ontario, Canada
    JAMA Netw Open. 2018;1(2):e180281. doi:10.1001/jamanetworkopen.2018.0281
    Key Points

    Question  What are the core competencies in evidence-based practice (EBP) that health professionals should meet?

    Findings  In this systematic, multistage, modified Delphi survey study, a contemporary set of 68 core competencies in EBP grouped into the main EBP domains was developed.

    Meaning  This consensus statement of the core competencies in EBP should inform the development of EBP curricula for health professionals.

    Abstract

    Importance  Evidence-based practice (EBP) is necessary for improving the quality of health care as well as patient outcomes. Evidence-based practice is commonly integrated into the curricula of undergraduate, postgraduate, and continuing professional development health programs. There is, however, inconsistency in the curriculum content of EBP teaching and learning programs. A standardized set of minimum core competencies in EBP that health professionals should meet has the potential to standardize and improve education in EBP.

    Objective  To develop a consensus set of core competencies for health professionals in EBP.

    Evidence Review  For this modified Delphi survey study, a set of EBP core competencies that should be covered in EBP teaching and learning programs was developed in 4 stages: (1) generation of an initial set of relevant EBP competencies derived from a systematic review of EBP education studies for health professionals; (2) a 2-round, web-based Delphi survey of health professionals, selected using purposive sampling, to prioritize and gain consensus on the most essential EBP core competencies; (3) consensus meetings, both face-to-face and via video conference, to finalize the consensus on the most essential core competencies; and (4) feedback and endorsement from EBP experts.

    Findings  From an earlier systematic review of 83 EBP educational intervention studies, 86 unique EBP competencies were identified. In a Delphi survey of 234 participants representing a range of health professionals (physicians, nurses, and allied health professionals) who registered interest (88 [61.1%] women; mean [SD] age, 45.2 [10.2] years), 184 (78.6%) participated in round 1 and 144 (61.5%) in round 2. Consensus was reached on 68 EBP core competencies. The final set of EBP core competencies was grouped into the main EBP domains. For each key competency, a description of the level of detail or delivery was identified.

    Conclusions and Relevance  A consensus-based, contemporary set of EBP core competencies has been identified that may inform curriculum development of entry-level EBP teaching and learning programs for health professionals and benchmark standards for EBP teaching.

    Introduction

    The term evidence-based medicine originated in the field of medicine in the early 1990s, but as its use expanded to include other health disciplines, it became known as evidence-based practice (EBP). Evidence-based practice provides a framework for the integration of research evidence and patients’ values and preferences into the delivery of health care.1,2 Implementation of EBP principles has resulted in major advances in improving the quality of delivered health care as well as patient outcomes. The last 20 years have seen EBP increasingly integrated as a core component into the curriculum of undergraduate, postgraduate, and continuing education health programs worldwide.3,4 Many national registration bodies and accreditation councils (eg, the Accreditation Council for Graduate Medical Education in the United States) expect that all clinicians (ie, health professionals and learners of any discipline) should be competent in EBP.5 The National Academy of Medicine (formerly the Institute of Medicine), an independent, nongovernmental, nonprofit organization that provides advice, counsel, and independent research on major topics in health care, has recognized EBP as one of the core competencies necessary for continuous improvement of the quality and safety of health care.6

    Although many teaching strategies have been used and evaluated, a lack of EBP knowledge and skills is still one of the most commonly reported barriers to practicing EBP.7,8 One of the potential explanations is the inconsistency in the quality and content of the EBP teaching programs9 (also L.A., P.G., T.H., unpublished data, 2018). A standardized set of core competencies in EBP for clinicians and students may therefore improve EBP teaching and learning programs as well as EBP knowledge and skills.10

    Core competencies have been defined as the essential minimal set of a combination of attributes, such as applied knowledge, skills, and attitudes, that enable an individual to perform a set of tasks to an appropriate standard efficiently and effectively.11 Core competencies offer a common shared language for all health professions for defining what all are expected to be able to do to work optimally.

    Recognizing it as a promising way of reforming and managing medical education and ultimately improving quality of care,12,13 the Institute of Medicine report Health Professions Education: A Bridge to Quality endorsed competency-based education across the health professions.4 Implementation of competency-based education involves the identification of core competencies, designing curricula and teaching programs that clearly articulate the attributes underpinning each core competency, and developing assessment tools that provide a valid and reliable evaluation of these core competencies.14

    A clear outline of core competencies is critical in any health care education setting, as it informs the blueprinting of a curriculum, including learning outcomes, assessment strategies, and graduate attributes.15-17 Therefore, defining core competencies is a priority in health care education.11,18-22 Because we were unaware of any systematically derived set of core competencies in EBP, we set out to remedy this deficiency. The objective of this study was to develop a consensus-based set of core EBP competencies that EBP teaching and learning programs should cover.

    Methods

    We conducted a multistage, modified Delphi study, in which we (1) generated, from a systematic review, an initial set of potential competencies to be considered for inclusion in the EBP core competencies set; (2) conducted a 2-round modified Delphi survey to prioritize and gain consensus on the most essential EBP core competencies; (3) held a meeting to finalize the consensus on the set of EBP core competencies; and (4) sought feedback and endorsement from EBP experts and planned for dissemination.

    Generation of an Initial Set of Relevant EBP Competencies

    We previously completed a systematic review of EBP educational studies, following Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) reporting guidelines.23 Studies were eligible if they were controlled (that is, had a separate group for comparison) and had investigated the effect of EBP education among clinicians (irrespective of the level of training, profession, or intervention format). Of 1682 articles identified, we screened 714 titles and abstracts for eligibility. Of these, 286 full-text articles were obtained for review, and 83 articles proved eligible. The results of the review itself, as distinct from the competencies, are reported elsewhere.23 We reviewed included studies to identify EBP competencies addressed in these studies. In addition, EBP curricula and key statements (eg, Sicily statement on EBP,24 Institute of Medicine reports,4 and the Informed Health Choice key concepts25,26) were identified by contacting experts in this field and reviewing suggested documents. These were reviewed for relevant EBP competencies, which were defined as “attributes such as applied knowledge, skills and attitudes that enable an individual to perform a set of tasks to an appropriate standard efficiently and effectively.”11 Three of us (L.A., T.H., and P.G.) independently extracted EBP competencies from a random sample of 20 articles and continued discussion until consensus was attained. Afterward, one of us (L.A.) extracted EBP competencies from the rest of the included articles. These authors reviewed this set of initial EBP competencies for duplication, overlap, and clarity, leaving uniquely specified competencies. The same 3 authors grouped these competencies into the relevant EBP steps (introductory, ask, acquire, appraise and interpret, apply, and evaluate). eMethods 1 in the Supplement presents detailed methods of this stage.

    Two-Round Delphi Survey

    We used a modified 2-round Delphi survey to obtain the input of a broad range of experts and stakeholders on the most essential EBP core competencies.27-30 We used a purposive and snowball sampling approach to invite clinicians who had significant experience in teaching and/or practicing EBP to register their interest in participating in our Delphi survey (February 2017). We sent email invitations to the evidence-based health care listserv and other networks of national and international evidence-based health societies and posted announcements on social media (eg, Twitter and Facebook).

    The Figure illustrates the process of the modified Delphi survey. The round 1 survey (March-April 2017) consisted of 86 competencies grouped into EBP steps (introductory, ask, acquire, appraise and interpret, apply, and evaluate). We invited participants who responded and registered their interest to participate in round 1. Participants rated the relative importance of each competency as “omitted: is not a priority to be included in an EBP teaching program,” “mentioned: should be just mentioned in an EBP teaching program (ie, provide common knowledge of the competency),” “explained: should be briefly explained in an EBP teaching program (ie, provide understanding of the competency but without practical exercises),” or “practiced with exercises: should be practiced with exercises in an EBP teaching program (ie, provide a detailed understanding of the competency, enhanced with practical exercises).” We chose this rating scale to reflect the desired learning outcome and clinical competence (ie, Miller’s Pyramid of Clinical Competence31) and the required level of detail and time commitment to be delivered. For round 2, we retained EBP competencies that attained a predefined consensus level of at least 70% of participants per competency or a combined rating of greater than or equal to 85% across 2 rating categories (eg, combined rating of mentioned and explained ≥85%).

    Participants who responded and completed the round 1 survey were invited to participate in round 2 (May-June 2017). For this round, we revised the retained competencies based on feedback from participants and arranged them into 5 groups (Figure). Group A included competencies that a predefined consensus (≥70%) agreed should be practiced with exercises or explained or mentioned; participants were advised that these would be included in the final set of core competencies unless strong objection was received in that round. Groups B, C, and D were competencies that did not achieve the predefined consensus level in round 1, but most (≥85%) agreed should be practiced with exercises or explained; explained or mentioned; or mentioned or omitted, respectively. Participants in round 2 were asked to rate whether these competencies should be practiced with exercises or explained, explained or mentioned, or mentioned or omitted. Group E included new competencies that were suggested by round 1 participants, who then rated them omitted, mentioned, explained, or practiced with exercises.
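    The round-1 thresholds and the grouping into groups A through D described above can be sketched as a small classification rule. This is an illustrative reading of the text, not the authors' code: the function name, the input layout, and the treatment of a 70% "omitted" consensus (as dropped rather than grouped) are our own assumptions.

```python
# A minimal sketch of the round-1 consensus rules, assuming the thresholds
# (70%, 85%) and category pairings named in the text. Function and variable
# names, and the "dropped" outcome for a 70% "omitted" consensus, are
# hypothetical choices of ours.

def assign_group(counts):
    """Classify one competency from its round-1 rating counts.

    counts: votes per rating category, eg
    {"omitted": 3, "mentioned": 10, "explained": 72, "practiced": 15}.
    Returns "A" for a >=70% single-category consensus on mentioned,
    explained, or practiced; "B"/"C"/"D" for a >=85% combined rating
    across the adjacent category pairs named in the text; "dropped" for
    a >=70% "omitted" consensus (an interpretive assumption); or None
    when no threshold is met.
    """
    total = sum(counts.values())
    share = {k: v / total for k, v in counts.items()}

    if share.get("omitted", 0) >= 0.70:
        return "dropped"                      # assumption: clear "omit" consensus
    if any(share.get(r, 0) >= 0.70 for r in ("practiced", "explained", "mentioned")):
        return "A"                            # predefined consensus level reached
    if share.get("practiced", 0) + share.get("explained", 0) >= 0.85:
        return "B"                            # re-rated in round 2: practiced vs explained
    if share.get("explained", 0) + share.get("mentioned", 0) >= 0.85:
        return "C"                            # re-rated in round 2: explained vs mentioned
    if share.get("mentioned", 0) + share.get("omitted", 0) >= 0.85:
        return "D"                            # re-rated in round 2: mentioned vs omitted
    return None                               # met no threshold in round 1
```

    For example, a competency rated "explained" by 72 of 100 participants would fall into group A, while one rated "practiced" by 50 and "explained" by 40 would fall into group B and be re-rated in round 2.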

    Survey Monkey, a web-based survey service, provided the platform for the surveys. In both rounds, participants were given a chance to suggest additional competencies, argue for or against proposed competencies, and comment on competency wording and comprehension. We obtained ethics approval for this study from the Human Research Ethics Committee at Bond University. Participants were informed that consent was assumed if they responded to the survey. Detailed methods of the Delphi survey are presented in eMethods 2 through 4 in the Supplement.

    Consensus Meeting and Postmeeting Activities

    A 2-day consensus meeting (July 10-11, 2017) was organized by the Centre for Research in Evidence-Based Practice (L.A., T.H., and P.G.) and involved 10 participants purposively chosen to represent a range of health professions, experience in teaching EBP, geographical locations, and representation of EBP societies and organizations. We presented the results of the systematic review and the 2-round Delphi survey. Following presentation of the results, the group participated in focused discussions addressing the proposed set of core competencies and made final decisions on the inclusion of each competency and its wording and description. To ensure that the consensus set of competencies reflected the decisions made, participants reviewed a document presenting the consensus set of competencies after the meeting. To ensure the validity, applicability, utility, and clarity of the competencies, we sent the final set of EBP core competencies for external feedback to 15 EBP experts (purposively identified to represent different EBP organizations and societies, including the International Society for Evidence-Based Health Care board members). Based on feedback from EBP experts, we further revised the wording and explanation of the competencies. All coauthors were emailed the draft document and provided minor wording suggestions.

    Results
    Generation of an Initial Set of Relevant EBP Competencies

    We identified 234 EBP competencies, which decreased to 86 unique competencies after removal of duplicates. eTables 1 and 2 and the eFigure in the Supplement present details.

    Delphi Survey and Consensus Meeting

    Of the 234 individuals who registered their interest (88 [61.1%] women; mean [SD] age, 45.2 [10.2] years), 184 (78.6%) participated in round 1 of the Delphi survey, and 144 participated in round 2 (61.5%, or 78.3% of round 1 participants). Of the 144 round 2 participants, 88 (61.1%) were women, 63 (43.8%) were 30 to 44 years old, 60 (41.7%) were 45 to 59 years old, and 115 (79.9%) currently taught EBP, with a mean (SD) of 10.9 (7.4) years of EBP teaching experience. Participants were from 28 different countries. In total, 59 participants (41.0%) were medical professionals (not including nurses, who were categorized separately) and 56 (38.9%) were allied health professionals. More than one-third of participants (n = 54 [37.5%]) had both clinical and academic (teaching or research) roles. The majority (n = 118 [81.9%]) were working in a university setting, and 53 participants (36.8%) worked in hospitals (Table 1).

    After round 1, 11 competencies attained the predefined consensus level (≥70%) (group A); 25 competencies were rated by the majority (≥85%) practiced with exercises or explained (group B); 28 were rated by the majority (≥85%) explained or mentioned (group C); 4 were rated by the majority (≥85%) mentioned or omitted (group D); and 9 new competencies were suggested by participants (group E). After round 2, 48 competencies had achieved the consensus level (≥70%): 20 competencies were rated as practiced with exercises; 20 as explained; and 8 as mentioned. In total, 29 competencies did not achieve the a priori consensus level and were retained for further discussion at the consensus meeting; 20 were subsequently included. The Figure illustrates the results of the modified Delphi survey. eTables 3 and 4 in the Supplement present detailed results of rounds 1 and 2.
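    The competency counts above can be cross-checked for internal consistency. All figures in this sketch are quoted directly from the text; only the variable names are ours.

```python
# Arithmetic cross-check of the competency flow reported in the Results;
# every number is taken from the text, the variable names are our own.

initial = 86                                   # unique competencies entering round 1
round1_groups = {"A": 11, "B": 25, "C": 28, "D": 4}
new_in_round2 = 9                              # group E: newly suggested competencies

retained_round1 = sum(round1_groups.values())  # carried into round 2
round2_pool = retained_round1 + new_in_round2  # rated in round 2
consensus_round2 = 20 + 20 + 8                 # practiced + explained + mentioned
to_meeting = round2_pool - consensus_round2    # discussed at the consensus meeting
final_set = consensus_round2 + 20              # 20 of those were subsequently included

print(retained_round1, round2_pool, to_meeting, final_set)  # 68 77 29 68
```

    The 68 competencies retained after round 1 plus the 9 new suggestions give the 77 rated in round 2; the 48 reaching consensus there plus the 20 added at the meeting yield the final 68.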

    Core Competencies in EBP

    After the 2 rounds of Delphi survey and the consensus meeting, a total of 68 competencies achieved consensus for inclusion in the final set of EBP core competencies. Table 2 presents the final set of EBP core competencies (eTable 5 in the Supplement includes the set and an elaboration of each competency). The final set of EBP core competencies is grouped into the main EBP domains: introductory (n = 5); ask (n = 3); acquire (n = 4); appraise and interpret (n = 9); apply (n = 4); and evaluate (n = 2). We also provide a description of each key competency and the level of detail or delivery for each one (a proxy of the time that should be dedicated to teaching each competency—M, mentioned; E, explained; and P, practiced with exercises). We found that most of the core competencies could be classified within the 5-step model of EBP, which is also used by the Sicily statement,24 except for the introductory competencies, which we retained as a separate group.

    Discussion

    This study used a rigorous process, integrating evidence from a systematic review, conducting a modified Delphi survey, holding a consensus meeting, and obtaining external feedback from EBP experts, to reach consensus on the most essential core competencies that should be taught in EBP educational programs for clinicians and students. The final consensus set includes 68 core competencies.

    A previous study developed a set of EBP competencies, but it was limited to a single discipline (nursing) and country (United States) and did not use a systematic review to inform the Delphi survey.32 Some competencies appear in this previously identified set (eg, critical appraisal of a research article, formulate a clinical question using PICO [patient, intervention, comparison, outcome]). However, our competencies are more specific and extend to include the application of evidence, including through shared decision making, and evidence implementation at the individual clinical level. The set of EBP core competencies highlights the required level of detail needed (ie, mentioned, explained, and practiced with exercises) for each EBP competency as a proxy for the amount of time that should be dedicated to each. Additionally, we view these EBP core competencies as a contemporary and dynamic set. As the field matures, new competencies will undoubtedly need to be added and others removed. For instance, shared decision making and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach are 2 recently added competencies that were not previously taught in EBP curricula. We plan to review this set periodically and welcome any feedback.

    With the increased availability of trustworthy preappraised evidence resources, clinicians can practice EBP without being fully competent in detailed critical appraisal of individual studies. What they must know, however, is how to critically interpret and apply the results presented in these preappraised sources.33,34 This full understanding is necessary to trade off desirable and undesirable consequences, particularly when they are closely balanced. Furthermore, shared decision making requires clearly communicating about the trade-offs with patients. However, clinicians may still sometimes need to critically appraise individual studies (for example, when there are no trusted preappraised resources that answer a clinical question, or when a new study challenges their current practice). In addition, skills in critical appraisal are helpful in determining the trustworthiness of preappraised evidence.

    The core competencies should be suitable to inform the curricula for an introductory course in EBP for clinicians of any level of education and any discipline. The competencies provide building blocks for EBP educators to use to develop their own curriculum, tailored to local learning needs, time availability, discipline, and the previous EBP experience of the learners. Competencies are unlikely to be exhaustive or tailored to the specific needs of any one discipline. However, some of the competencies might be more relevant to one discipline than another (eg, diagnosis is more relevant to the discipline of medicine than to others). The order of the EBP core competencies in the set does not reflect the order of their importance or sequence in teaching. Educators can modify their approach to teaching these competencies based on case-based scenarios or articles, and it is likely that optimal communication of competencies will require teaching in more than one setting using a number of different scenarios and/or articles. For example, a teaching session can be initiated using an equivocal risk-benefit balance case scenario and teaching the shared decision-making skills needed, providing patient decision aids where possible. Then, teachers can explain the evidence incorporated into the decision aids and the derivation and interpretation of quantities, such as absolute risk difference and number needed to treat or harm.

    Educators and curriculum developers in EBP are encouraged to evaluate the content of their current curriculum and integrate these competencies into it. Mapping the core competencies to an existing curriculum can help identify gaps in the coverage of essential content. Programs can address additional advanced competencies (eg, implementation science, economic analysis) depending on the needs and desires of their learners.

    This set of core competencies in EBP represents just one of several needed steps for the implementation of competency-based EBP education. Dissemination and integration of this set of core competencies in academic and clinical practice may assist in delivering a more uniform and harmonized education to EBP learners. Open access online databases of learning resources (eg, the Critical Thinking and Appraisal Resource Library [CARL])35 represent an important resource to enhance the sharing and accessibility of learning resources relevant for the EBP core competencies.

    The development of appropriate assessment tools to evaluate the identified EBP competencies is challenging but useful for monitoring learners’ progress in each of the competencies or evaluating the effectiveness of different teaching methods. A systematic review of 85 studies evaluating EBP educational interventions found that more than half of the included studies did not use a psychometrically robust, high-quality instrument to measure their outcomes (L.A., P.G., T.H., unpublished data, 2018). Therefore, EBP education researchers should identify, and if necessary develop, specific assessment tools (both formative and summative) that provide accurate, reliable, and timely evaluation of the EBP competencies of learners. Future work should also focus on defining core competencies needed for each training level and comparing different modalities (including the sequence) when teaching these competencies.

    Limitations

    A key strength of the study is the systematic review and Delphi survey approach to achieving international consensus about a contemporary set of core competencies in EBP curricula. Although we selected Delphi participants to represent a diverse range of health professions and expertise, they may not adequately represent the full spectrum of views held by individuals within a single profession.

    Conclusions

    Based on a systematic consensus process, a set of core competencies in EBP to inform the development of EBP curricula for health professional learners has been developed and described.

    Article Information

    Accepted for Publication: March 26, 2018.

    Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2018 Albarqouni L et al. JAMA Network Open.

    Corresponding Author: Loai Albarqouni, MD, MSc, Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, 14 University Dr, Robina, Queensland, 4229, Australia (loai.albarqouni@student.bond.edu.au).

    Author Contributions: Drs Albarqouni and Hoffmann had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: Albarqouni, Hoffmann, Straus, Olsen, Young, Ilic, Shaneyfelt, Glasziou.

    Acquisition, analysis, or interpretation of data: Albarqouni, Hoffmann, Straus, Young, Ilic, Shaneyfelt, Haynes, Guyatt, Glasziou.

    Drafting of the manuscript: Albarqouni, Hoffmann, Straus, Ilic, Haynes, Glasziou.

    Critical revision of the manuscript for important intellectual content: All authors.

    Statistical analysis: Albarqouni.

    Administrative, technical, or material support: Albarqouni, Hoffmann, Straus, Ilic, Shaneyfelt, Glasziou.

    Supervision: Hoffmann, Glasziou.

    Conflict of Interest Disclosures: Dr Albarqouni reported grants from the Australian Government Research Training Program Scholarship during the conduct of the study. Dr Hoffmann reported personal fees from Elsevier outside the submitted work. Dr Glasziou reported membership on the board of the International Society for Evidence-Based Health Care. No other disclosures were reported.

    Funding/Support: Dr Albarqouni is supported by an Australian Government Research Training Program Scholarship.

    Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

    Additional Contributions: We gratefully acknowledge Andy Oxman, MD, and the International Society for Evidence-Based Health Care board members for their feedback on the set and the Delphi participants for their contribution. Dr Oxman did not receive any compensation for his assistance.

    References
    1. Straus S, Glasziou P, Richardson WS, Haynes B. Evidence-Based Medicine: How to Practice and Teach It. London, UK: Churchill Livingstone; 2010.
    2. Haynes RB, Devereaux PJ, Guyatt GH. Clinical expertise in the era of evidence-based medicine and patient choice. ACP J Club. 2002;136(2):A11-A14.
    3. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376(9756):1923-1958.
    4. Institute of Medicine (US) Committee on the Health Professions Education Summit. The core competencies needed for health care professionals. In: Greiner AC, Knebel E, eds. Health Professions Education: A Bridge to Quality. Washington, DC: National Academies Press; 2003:chap 3.
    5. Accreditation Council of Graduate Medical Education website. Program and institutional guidelines. http://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements. Accessed April 20, 2018.
    6. Institute of Medicine. Evidence-based medicine and the changing nature of healthcare: 2007 IOM annual meeting summary. Washington, DC: Institute of Medicine; 2008.
    7. Sadeghi-Bazargani H, Tabrizi JS, Azami-Aghdash S. Barriers to evidence-based medicine: a systematic review. J Eval Clin Pract. 2014;20(6):793-802.
    8. Zwolsman S, te Pas E, Hooft L, Wieringa-de Waard M, van Dijk N. Barriers to GPs’ use of evidence-based medicine: a systematic review. Br J Gen Pract. 2012;62(600):e511-e521.
    9. Glasziou P, Burls A, Gilbert R. Evidence based medicine and the medical curriculum. BMJ. 2008;337:a1253.
    10. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002;288(9):1110-1112.
    11. Moynihan S, Paakkari L, Välimaa R, Jourdan D, Mannix-McNamara P. Teacher competencies in health education: results of a Delphi study. PLoS One. 2015;10(12):e0143703.
    12. Smith SR. AMEE guide No. 14: outcome-based education: part 2—planning, implementing and evaluating a competency-based curriculum. Med Teach. 1999;21(1):15-22.
    13. Harden RM, Crosby JR, Davis MH, Friedman M. AMEE guide No. 14: outcome-based education: part 5—from competency to meta-competency: a model for the specification of learning outcomes. Med Teach. 1999;21(6):546-552.
    14. Carraccio C, Englander R, Van Melle E, et al; International Competency-Based Medical Education Collaborators. Advancing competency-based medical education: a charter for clinician-educators. Acad Med. 2016;91(5):645-649.
    15. Martin M, Vashisht B, Frezza E, et al. Competency-based instruction in critical invasive skills improves both resident performance and patient safety. Surgery. 1998;124(2):313-317.
    16. Antonoff MB, Swanson JA, Green CA, Mann BD, Maddaus MA, D’Cunha J. The significant impact of a competency-based preparatory course for senior medical students entering surgical residency. Acad Med. 2012;87(3):308-319.
    17. Long DM. Competency-based residency training: the next advance in graduate medical education. Acad Med. 2000;75(12):1178-1183.
    18. Calhoun JG, Ramiah K, Weist EM, Shortell SM. Development of a core competency model for the master of public health degree. Am J Public Health. 2008;98(9):1598-1607.
    19. Penciner R, Langhan T, Lee R, McEwen J, Woods RA, Bandiera G. Using a Delphi process to establish consensus on emergency medicine clerkship competencies. Med Teach. 2011;33(6):e333-e339.
    20. Kilroy DA, Mooney JS. Determination of required pharmacological knowledge for clinical practice in emergency medicine using a modified Delphi technique. Emerg Med J. 2007;24(9):645-647.
    21. Moser JM. Core academic competencies for master of public health students: one health department practitioner’s perspective. Am J Public Health. 2008;98(9):1559-1561.
    22. Lalloo D, Demou E, Kiran S, Cloeren M, Mendes R, Macdonald EB. International perspective on common core competencies for occupational physicians: a modified Delphi study. Occup Environ Med. 2016;73(7):452-458.
    23. Albarqouni L, Glasziou P, Hoffmann T. Completeness of the reporting of evidence-based practice educational interventions: a review. Med Educ. 2018;52(2):161-170.
    24. Dawes M, Summerskill W, Glasziou P, et al; Second International Conference of Evidence-Based Health Care Teachers and Developers. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1.
    25. Austvoll-Dahlgren A, Oxman AD, Chalmers I, et al. Key concepts that people need to understand to assess claims about treatment effects. J Evid Based Med. 2015;8(3):112-125.
    26. Chalmers I, Oxman AD, Austvoll-Dahlgren A, et al. Key concepts for informed health choices: a framework for helping people learn how to assess treatment claims and make informed choices. BMJ Evid Based Med. 2018;23(1):29-33.
    27. Dunn WR, Hamilton DD, Harden RM. Techniques of identifying competencies needed of doctors. Med Teach. 1985;7(1):15-25.
    28. Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74(9):979-983.
    29. Humphrey-Murto S, Varpio L, Wood TJ, et al. The use of the Delphi and other consensus group methods in medical education research: a review. Acad Med. 2017;92(10):1491-1498.
    30. Diamond IR, Grant RC, Feldman BM, et al. Defining consensus: a systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol. 2014;67(4):401-409.
    31. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9)(suppl):S63-S67.
    32. Melnyk BM, Gallagher-Ford L, Long LE, Fineout-Overholt E. The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews Evid Based Nurs. 2014;11(1):5-15.
    33. Guyatt GH, Meade MO, Jaeschke RZ, Cook DJ, Haynes RB. Practitioners of evidence based care: not all clinicians need to appraise evidence from scratch but all need some skills. BMJ. 2000;320(7240):954-955.
    34. Straus SE, Green ML, Bell DS, et al; Society of General Internal Medicine Evidence-Based Medicine Task Force. Evaluating the teaching of evidence based medicine: conceptual framework. BMJ. 2004;329(7473):1029-1032.
    35. Castle JC, Chalmers I, Atkinson P, et al. Establishing a library of resources to help people understand key concepts in assessing treatment claims—the “Critical thinking and Appraisal Resource Library” (CARL). PLoS One. 2017;12(7):e0178666.