Viewpoint
March 6, 2020

Medical Education Takes a Step in the Right Direction: Where Does That Leave Students?

Author Affiliations
  • 1Johns Hopkins University School of Medicine, Baltimore, Maryland
JAMA. Published online March 6, 2020. doi:10.1001/jama.2020.2950

With guidance from the Federation of State Medical Boards (FSMB) and the National Board of Medical Examiners (NBME), the United States Medical Licensing Examination (USMLE) announced on February 12, 2020, that beginning at the earliest on January 1, 2022, the Step 1 licensure examination would change from reporting a 3-digit score to reporting results as pass/fail. The potential consequences of this decision are wide ranging, and the implementation of this policy will have substantial implications for training a new generation of physicians. This modification in score reporting requires careful consideration of the options moving forward, including the potential benefits and challenges the change may create. While this change is part of a concerted effort to improve student wellness and clinically focused education, the announcement, perhaps paradoxically, also will generate uncertainty and anxiety for the cohort of students caught in the transition period and for future medical students.

The stated purpose of the Step 1 examination is to ensure eligibility for medical licensure.1 However, given the increasing proportion of US medical schools that have transitioned to pass/fail preclerkship grades (76.8% of Liaison Committee on Medical Education–accredited schools in 2018-2019, up from 61.7% in 2014-2015),2 many residency program directors across the country have prioritized the Step 1 examination as an objective, quantitative measurement of medical knowledge when deciding which applicants to consider for interview. The numerical score has been used as a cutoff for evaluating residency applicants without evidence that small differences in score predict clinical success.3 According to the 2018 National Resident Matching Program (NRMP) Program Director Survey, an applicant’s Step 1 score is the most widely cited factor in determining which candidates to interview, with 94% of programs reporting its use.4 Despite this widespread use, relying on scores as a cutoff for residency applications is contrary to the purpose and design of the Step 1 examination.1,5 In the context of this increasingly competitive examination, test preparation companies have found a niche, producing “high-yield” study resources that distill a massive preclinical education into the material most likely to appear on the Step 1 examination.5,6

Some students, especially those seeking more competitive residencies, have not focused on their schools’ preclinical curricula and instead have concentrated on studying for the Step 1 examination.6 Combined with the pass/fail preclinical curriculum now present at many schools, it is the impression of some educators that students are incentivized to rely on Step 1 test preparation material rather than lectures and laboratory sessions to prepare for in-house curriculum assessments. This perspective may be valid: average Step 1 scores have increased by more than a point per year, from a mean score of 200 in 1992 to 233 in 2018 among medical students who successfully matched.5

The primary focus of medical education should be to obtain the knowledge and skills required to be effective, knowledgeable, and empathetic physicians, and much of this comes from the education provided by medical schools. The pursuit of a competitive Step 1 score may come at the expense of these overarching objectives, with students placing an emphasis on test preparation strategies rather than the pursuit of clinical excellence in less easily tested skills such as communication and empathy.6 Given these realities, the pass/fail change for Step 1 may foster a new environment in which students will learn more for the sake of becoming excellent physicians rather than performing well on a high-stakes examination.

But this change comes with challenges. The evolution of medical school curricula has led to differences in the length of preclinical training among institutions. While students customarily take the Step 1 examination at the end of preclinical training (usually in year 2 of medical school), the actual timing varies considerably. This variation is amplified by the many students who pursue additional degrees or interrupt their medical education with research years. Therefore, beginning at the earliest in 2022, residency program directors will face the challenge of evaluating an applicant pool that is heterogeneous with respect to Step 1 reporting; that is, some applicants will have 3-digit scores while others will have only pass/fail designations. Individual programs will develop internal guidelines for handling this situation, but barring external oversight, such guidelines will vary among institutions and residency programs. This ambiguity could add uncertainty to the already complex NRMP process. Students are likely unsure how a numerical vs a pass/fail Step 1 result will affect the relative success of their applications to residency programs. On this point, greater clarification is needed.

To understand how to prepare for residency success, it would be helpful if the NBME and FSMB announced how Step 1 scores (for examinations taken both before and after January 2022) will be reported to program directors. As a possible approach, after January 2022 the NBME could consider not issuing 3-digit scores regardless of the actual testing date. The fundamental impetus for the score change (ie, the limited utility of Step 1 as a stratifying tool) exists now and will continue to exist beyond 2022. Current students applying to residency in the future should not remain subject to these well-described limitations of examination scores.

Eliminating Step 1 numerical scores may alleviate an immediate pressure to “test well” for competitive residencies, but it also leaves a challenge in differentiating applicants. This void may be filled by a combination of factors that could create problems of their own. One likely factor is the USMLE Step 2 Clinical Knowledge examination, the second board examination that medical students complete, typically in their fourth year of medical school. To ease the transition to pass/fail reporting for Step 1, the USMLE has determined that Step 2 Clinical Knowledge will remain numerically scored.1 Very likely, given its quantitative nature, Step 2 Clinical Knowledge will take on a more prominent role in resident selection, and more programs may require scores from this examination as part of a residency application. Although Step 2 Clinical Knowledge has a larger clinical focus, it could create the same challenges as Step 1 by assigning undue weight in application success to a numerically scored examination intended for licensure, not applicant stratification. Furthermore, there are potential adverse effects on students and medical school culture if subjective (and nonstandardized) grading criteria, such as school-specific clerkship evaluations, become more influential in the NRMP.

Most important, residency program directors may increasingly favor medical school prestige as they make decisions about applicants. Medical students from some schools may end up applying to more programs than in the past, a trend that is already a major concern in medical education.7 This presents a worrisome scenario: when students from highly ranked academic institutions apply to residencies with a slate of “pass” grades and a “pass” Step 1 examination score, they may be seen as uniformly more qualified than students from lower-ranked academic institutions with the same denotations on their applications. Should medical school ranking become the next most important stratifying feature, this shift would only exacerbate existing concerns about how medical students are selected for residencies. The potential effects on premedical students could be substantial as well. If institutional prestige becomes a determining factor in future residency options, additional stress may accompany the medical school application process. The stratifying nature of Step 1 should not transfer forward to Step 2 Clinical Knowledge, but neither should it transfer backward to the Medical College Admission Test. Despite its shortcomings, Step 1 provides an opportunity for comparison of students across medical institutions, allowing performance rather than pedigree to be an important determinant of a student’s competitiveness for the match. By removing one of the few universal and objective preclinical measures without an immediate replacement, more opportunities are inevitably being introduced for disparities and biases to manifest.

The priority now should be to provide an objective and fair opportunity for medical students to distinguish themselves without relying on subjective assessments or transferring focus onto another single high-stakes examination such as Step 2 Clinical Knowledge. One proposal is to use NBME Subject Examinations (“Shelf Exams”), which are administered during clerkships, as a measure of objectively assessable clinical knowledge. Developing specialty-specific standardized clerkship evaluations with national standards could also improve objectivity in evaluating medical students. Taken together, a series of standardized assessments of knowledge and objective evaluations of clinical skills could provide program directors with sufficient information to make preliminary decisions regarding possible interviews. These measures could prevent any individual assessment from having a disparate effect on a candidate and could help avoid using licensing examinations for purposes they were not designed to serve.

Removing numerical scoring from the USMLE Step 1 is a welcome change to the residency application process. But without other concurrent changes to the system, the stress and uncertainty relieved by no longer having to excel on a high-stakes Step 1 examination may simply be transferred to other aspects of medical education, and there is a risk of conflating institutional prestige with individual ability. A potential solution is to provide clear guidelines and methods of evaluation that avoid transferring Step 1 anxiety to less controllable or similarly flawed evaluation metrics. One immediately helpful measure could be to change the manner of reporting Step 1 scores after January 2022. More important, perhaps, is to recognize that the current tools for assessing residency candidates are limited and remain in need of improvement.

(This Viewpoint is available for online commenting.)

Article Information

Corresponding Author: Matthew A. Crane, BS, Johns Hopkins University School of Medicine, 733 N Broadway, Edward D. Miller Research Bldg, Ste 137, Baltimore, MD 21205-2196 (mcrane9@jhmi.edu).

Published Online: March 6, 2020. doi:10.1001/jama.2020.2950

Conflict of Interest Disclosures: None reported.

Additional Information: Messrs Crane, Chang, and Azamfirei are first-year medical students and have not taken the Step 1 examination. Prior to the announcement, Messrs Crane and Chang were planning to take the USMLE Step 1 examination in 2021 and Mr Azamfirei was planning to take the USMLE Step 1 examination in 2022, after the completion of a research year.

References
1. Change to pass/fail score reporting for Step 1. United States Medical Licensing Examination. Accessed February 19, 2020. https://www.usmle.org/inCus/
2. Grading systems use by US medical schools. Association of American Medical Colleges. Accessed February 19, 2020. https://www.aamc.org/data-reports/curriculum-reports/interactive-data/grading-systems-use-us-medical-schools
3. Swails JL, Aibana O, Stoll BJ. The conundrum of the United States Medical Licensing Examination score reporting structure. JAMA. 2019;322(7):605. doi:10.1001/jama.2019.9669
4. Results of the 2018 NRMP Program Director Survey. National Resident Matching Program Data Release and Research Committee. Accessed February 19, 2020. https://www.nrmp.org/wp-content/uploads/2018/07/NRMP-2018-Program-Director-Survey-for-WWW.pdf
5. Carmody JB, Sarkany D, Heitkamp DE. The USMLE Step 1 pass/fail reporting proposal: another view. Acad Radiol. 2019;26(10):1403-1406. doi:10.1016/j.acra.2019.06.002
6. Chen DR, Priest KC, Batten JN, Fragoso LE, Reinfeld BI, Laitman BM. Student perspectives on the “Step 1 climate” in preclinical medical education. Acad Med. 2019;94(3):302-304. doi:10.1097/ACM.0000000000002565
7. Hammoud MM, Andrews J, Skochelak SE. Improving the residency application and selection process: an optional early result acceptance program. JAMA. 2020;323(6):503-504. doi:10.1001/jama.2019.21212
    2 Comments for this article
    USMLE Scoring
    James Campbell, MD | Prior academic and VA
    Scores were helpful but were only a limited part of the evaluation for dermatology residency. This looks like a way to blind selection in order to get more diversity instead of quality.
    CONFLICT OF INTEREST: None Reported
    One Less Objective Data Point
    Matthew Snyder, BS | UVA
    "Despite its shortcomings, Step 1 provides an opportunity for comparison of students across medical institutions, allowing performance rather than pedigree to be an important determinant of a student’s competitiveness for the match. By removing one of the few universal and objective preclinical measures without an immediate replacement, more opportunities are inevitably being introduced for disparities and biases to manifest."

    I couldn't have stated this better. To be frank, applying to medical school is a wild card, with a lot of factors outside an applicant's control determining which specific schools they might receive an acceptance at. Removal of another objective data point from the residency application will make which school you end up at much more important.

    I do think that it's unfortunate that so much weight ended up on one test... but it's also unfortunate that clerkships are graded in the way that they are (i.e. subjectively), which contributed to the weight placed on this exam in the first place. It seems that reporting shelf scores might be the best next option, as they are standardized, but also greater in number. This distributes the risk across multiple data points, such that "one bad test day" wouldn't hurt a student the same way a similar bad day would on Step 1.
    CONFLICT OF INTEREST: None Reported