Ward M, Pingree C, Laury AM, Bowe SN. Applicant Perspectives on the Otolaryngology Residency Application Process. JAMA Otolaryngol Head Neck Surg. 2017;143(8):782–787. doi:10.1001/jamaoto.2017.0231
Question
What do otolaryngology applicants think about the current residency application process, including suggestions for reform?
Findings
In this cross-sectional, anonymous survey of 2016 otolaryngology residency applicants, several themes emerged from the data, including (1) the need for consistency of mentorship and guidance, (2) development of a uniform set of criteria for residency program websites, and (3) consideration of opportunities to adjust or depart from the current system (eg, hard caps, flagging, and wave application cycles).
Meaning
Understanding the perspective of the otolaryngology residency applicant is essential to improving the current application process.
Importance
It has been nearly 25 years since medical students were queried regarding their perspectives on otolaryngology–head and neck surgery (OHNS) residency selection. Understanding this viewpoint is critical to improving the current application process.
Objective
To evaluate the perceptions of 2016 OHNS residency applicants regarding the application process and offer suggestions for reform.
Design, Setting, and Participants
In this cross-sectional study of anonymous online survey data, a 14-question survey was designed based on resources obtained from a computerized PubMed, Ovid, and Google Scholar search of the English-language literature from January 1, 1990, through December 31, 2015, using the following search terms: (medical student OR applicant) AND (application OR match) AND otolaryngology. The survey was administered to 2016 OHNS residency applicants to examine 4 primary areas: current attitudes toward the match, effect of the new Otolaryngology Program Directors Organization personal statement mandate, sources of advice and information, and suggestions for improvement. In January 2016, an email was sent to 100 program directors asking them to distribute the survey to current OHNS applicants at their institution. One follow-up reminder email was sent in February 2016. A link to the survey was posted on the Otomatch.com homepage on January 28, 2016, with the last response received on March 28, 2016.
Main Outcomes and Measures
Survey responses regarding the residency application process.
Results
A total of 150 of 370 residency applicants (40.5%) responded to the survey. Of these, 125 respondents (90.6%) noted applying to programs in which they had no specific interest simply to improve their chances of matching. Applicants intended to apply to more programs than they actually did (63.6 vs 60.8; r = 0.19; 95% CI, −0.03 to 0.40). Program directors advised fewer applications than other sources; however, 58 respondents (38.7%) did not receive advice from a program director. A total of 121 respondents (80.7%) found online program information to be insufficient. Finally, 90 of 140 respondents (64.3%) noted that they would agree to a hard cap on applications, among other suggestions for improvement.
Conclusions and Relevance
Several main themes emerged from the data, providing a foundation for process improvement opportunities: careful consideration to applicant mentorship, including peers; uniform set of criteria for residency program websites; and investigating alternative match platforms, which may allow hard caps, flagging programs of higher interest, or wave application cycles. Overall, the otolaryngology applicant provides a unique perspective regarding the current state of the match and potential opportunities for system-wide improvement.
It has been nearly 25 years, to our knowledge, since medical students were queried regarding their perspectives on the otolaryngology–head and neck surgery (OHNS) residency application process. At that time, the average student submitted 26 applications, which was considered an inordinate number of applications per student, and a call was made for reevaluation and reform of the process.1 However, more than 2 decades later, OHNS programs are still seeking a universally adoptable solution to improve the efficacy of the application process. In 2008, Baroody et al2 urged OHNS faculty to advise thoughtful restriction of applications to only 10 to 20 programs. A few years later, Christophel and Levine3 further endorsed this proposal but suggested a more realistic upper limit of 45 applications per medical student. Other recommendations to help curtail the application inundation have included a program-specific secondary essay and an Electronic Residency Application Service (ERAS) cap on the total number of applications allowed.4-7 This secondary essay suggestion was adapted and implemented by the Otolaryngology Program Directors Organization (OPDO) in the 2015 to 2016 match in the form of a single program-specific paragraph at the end of each applicant’s personal statement.
Despite these thoughtful suggestions and actions from OHNS leadership and program directors, application numbers per applicant have continued to steadily increase, reaching an all-time high of 64.5 applications per medical student in 2015.8 Thus far, the otolaryngology community has only evaluated this challenge from a singular viewpoint, yet the applicants themselves largely contribute to this conundrum. We present the results of our survey that evaluated applicant perceptions of the 2016 OHNS application process and offer suggestions for reform.
Methods
The literature relating to the burden of excessive applications in the OHNS match process during the past 25 years was reviewed. A computerized PubMed, Ovid, and Google Scholar database search of the English language from January 1, 1990, through December 31, 2015, was conducted using the following search terms: (medical student OR applicant) AND (application OR match) AND otolaryngology. Additional articles were identified from the references in these articles. From these sources as well as informal discussions with multiple OHNS program directors and current applicants, a 14-question survey was developed (eAppendix in the Supplement). The survey focused on 4 primary areas: current applicant attitudes toward the OHNS application process, effect of the new OPDO personal statement mandate, applicant sources of advice and information, and suggestions for improvement. The research protocol was determined to be exempt by the Brooke Army Medical Center Institutional Review Board, and participant informed consent was not required.
FREIDA Online, a database of residency and fellowship programs that is sponsored by the American Medical Association, was used to identify current residency programs and obtain program director email addresses. The 6 military residency programs in the database were excluded from consideration because of their lack of participation in the National Resident Matching Program (NRMP), leaving 100 OHNS residency programs. In January 2016, an email including a brief background on the survey and a link to the survey on SurveyMonkey was sent to these 100 program directors, asking them to distribute the survey to current OHNS applicants at their institution. One follow-up reminder email was sent in February 2016. In addition, given the popularity of Otomatch.com among OHNS applicants, a link to the survey on SurveyMonkey was posted on the Otomatch.com homepage on January 28, 2016, with the last response received on March 28, 2016.
Descriptive statistics with relative frequencies were used to assess the distribution of the survey responses. The median number of applications for each response range was used to calculate weighted means. These means were then used to calculate Pearson correlations, effect sizes, and 95% CIs using the Wilson approximation. Calculations were performed using SPSS statistical software, version 22 (IBM Corp).
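As a rough illustration of this approach (a sketch, not the authors' SPSS procedure), the code below computes a weighted mean from binned survey responses using range midpoints, and a 95% Wilson score interval for a proportion; the bin counts shown are hypothetical.

```python
import math

def weighted_mean(range_counts):
    """Weighted mean of responses reported as ranges, using each
    range's midpoint as its representative value."""
    total = sum(range_counts.values())
    return sum(((lo + hi) / 2) * n
               for (lo, hi), n in range_counts.items()) / total

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (z = 1.96)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical binned responses: (low, high) range -> respondent count
intended = {(31, 40): 20, (41, 50): 35, (51, 60): 40,
            (61, 70): 30, (71, 80): 25}
print(round(weighted_mean(intended), 1))
print(wilson_ci(90, 140))  # eg, 90 of 140 agreeing to a hard cap
```

The midpoint convention here is an assumption; any summary computed from binned responses inherits whatever rounding the original answer ranges imposed.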
Results
Of the 370 applicants for OHNS residency in 2016, a total of 150 applicants responded to the survey, representing 40.5% of the applicant population.8 To assess current attitudes, applicants were asked whether they had applied to programs in which they did not have high interest to improve their chances of matching to any program. Of the respondents, 125 (90.6%) answered yes, whereas 13 (9.4%) responded no. Respondents who answered no applied to the same mean number of programs as respondents who replied yes.
The survey also assessed the difference between the number of programs to which the applicants intended to apply at the beginning of the application cycle vs the number to which they actually applied. Results were notable for a 10.9% decrease in the number of respondents applying to more than 70 programs and a 9.2% increase in the number of respondents applying to 41 to 50 programs. Figure 1 depicts response rates by category, indicating that applicants applied to a fewer mean number of programs than they originally intended (63.6 vs 60.8; r = 0.19; 95% CI, −0.03 to 0.40).
In light of the differences between intended and actual number of applications per applicant, 96 respondents (64.0%) indicated that the OPDO-mandated, program-specific paragraph did not affect the number of programs to which they applied. Of those respondents who reported being affected, 3 (2.0%) applied to more programs and 51 (34.0%) applied to fewer. Of those respondents who applied to fewer, 27 of 49 (55.1%) reported doing so because they were not interested in applying to particular programs after performing further research to write personal statements, whereas 33 of 49 (67.4%) cited not having enough time to complete program-specific personal statements (respondents could select more than one reason for submitting fewer applications).
When deciding on the number of programs to which to apply, OHNS applicants may receive advice from a number of sources, including medical school deans, mentors, or advisers; program directors; residents; or other medical students. The percentages of respondents who received advice from these sources are presented in Figure 2. Of note, nearly one-third of all respondents (deans, mentors, and advisers, 29.3%; program directors, 38.7%; residents, 28.7%; and medical students, 28.8%) did not receive advice from any of these sources. Program directors recommended submitting fewer applications compared with deans, mentors, or advisers; residents; or other medical students (weighted mean number of recommended applications: deans, mentors, and advisers, 54.2; program directors, 49.7; residents, 62.5; and medical students, 67.2). Apart from these members of the OHNS community, commonly used resources for application advice included previously published NRMP match data (18 respondents [12.0%]), Otomatch.com (15 respondents [10.0%]), match data from their own medical schools (3 respondents [2.0%]), and StudentDoctor.net (2 respondents [1.3%]). Other resources less commonly cited were Headmirror.com, Doximity.com, medical school alumni, and Iserson’s Getting Into a Residency.9
A total of 121 respondents (81.4%) found that less than half of OHNS training program websites had enough information to identify personally appealing aspects of that program. Only 4 of 150 respondents (2.7%) thought that 76% to 100% of OHNS websites had enough information.
Thirty-nine respondents (26.0%) expected application reviewers to spend 0 to 5 minutes reviewing each application, 53 (35.3%) expected 6 to 10 minutes, 35 (23.3%) expected 11 to 15 minutes, and 23 (15.3%) expected more than 15 minutes per application.
Ninety-eight (65.3%) of the 150 respondents gave suggestions on how to improve the current application system to allow more time to be spent on each application. The top suggestions put forth by respondents are listed in the Table. Other less common suggestions included requiring preinterview teleconference or personality tests, offering and conducting interviews in 2 waves, and increasing the cost of applications.
When asked whether they would agree with a standardized limit to the number of applications that each applicant could submit, 90 respondents (64.3%) answered that they would agree, whereas 50 (35.8%) would not agree. The applicants suggested limits of 40 or fewer (47 [57.3%]), 41 to 50 (26 [31.7%]), 51 to 60 (7 [8.5%]), and more than 60 (2 [2.4%]).
Discussion
During the past 2 decades, the mean number of applications per applicant for OHNS has increased by approximately 250%.8 This increase has been more extreme in recent years, leading some OHNS residency programs to feel that they are “drowning in applications.”5(p695) Early proposed solutions focused on improving communication and guidance for medical students by suggesting an upper limit for the number of applications.2,3 However, in subsequent years, this number continued to increase, prompting multiple stakeholders to propose adding a step to the process that requires a program-specific statement of interest. As reported by Puscas and Esclamado,6 the addition of a secondary essay to Duke University’s application process decreased the number of applications by 25%. However, as Chang and Erhardt4 observed, if this requirement were applied uniformly to all programs, applicants would likely increase their effort to meet the hurdle. As one respondent to our survey stated, “In general, otolaryngology residency applicants are resilient and won’t be stymied by something as minuscule as a few extra sentences.” Indeed, more than 90% of our respondents indicated that they applied to programs in which they did not have a high or specific interest just to ensure they had the best chance of matching.
The 2016 preliminary ERAS match data also support that sentiment. Although the mean number of applications received per residency program decreased from 275 in 2015 to 225 in 2016, the number of applicants also decreased from 443 to 370 during that time. In addition, the mean number of applications per applicant decreased by only 3.7 (64.5 in 2015 to 60.8 in 2016).8 This finding may indicate that the OPDO mandate for program-specific personal statements may have prevented an additional increase in the number of applications per applicant, but it did not greatly reduce this number and certainly not by the 25% seen at Duke University.6
Timing of the release of the OPDO mandate possibly influenced the lack of increase in the number of applications per applicant. Of the 34.0% of respondents who applied to fewer programs secondary to the OPDO mandate, more than two-thirds cited not having enough time to complete the statements. In contrast, future applicants will have the benefit of advance warning, which may limit the mandate's effectiveness. Still, the OPDO mandate appeared to have some effect in pushing applicants to make program selections on the front end of the application process, as evidenced by the 27 respondents (18.0% of total respondents) who applied to fewer programs because they lost interest in applying to some programs after performing further research to write their personal statements.
Previous publications2,3 have suggested that faculty and program directors in OHNS should advise students to apply to fewer programs. Analysis of the data indicated that program directors advised significantly fewer applications per student than any other source (dean or faculty, residents, and students). However, 45.6% of program directors still advised students that they should apply to more than 40 programs.
A trend was also noticed toward advising more applications when the adviser was temporally closer to the match process (residents and students). Because residents have more recent experience and success applying in the OHNS match, their opinion carries particular weight. This inconsistency in advice given to students may help to explain why application numbers continued to increase.
Although most students are receiving advice from various sources, there are still significant disparities. Program directors may have been adhering to the recommendations; however, medical students and residents may actually exert the greatest influence on the number of programs to which an applicant applies. This finding highlights the need to encourage appropriate guidance from peers and advisers.
Apart from residents and other fourth-year students, the survey also shows that many applicants used online resources, ranging from previously published NRMP match data (12.0%) to discussion boards on Otomatch.com (10.0%). Kozin et al10 analyzed Otomatch.com traffic data and found that the most commonly viewed discussion threads concerned program-specific information and interviews. Their work suggested that applicants were mostly interested in relatively basic questions regarding “operating case volume, faculty depth, and research opportunities.”10(p463)
In 2014, Svider et al11 found that less than half of OHNS program websites displayed data for case numbers, call schedules, surrounding location, and career paths of past graduates. This finding was exemplified by one applicant who noted, “Some websites were completely inadequate, leaving the only thing to write about in a paragraph is: (1) generic things about ENT [ear, nose, and throat], (2) the city in which the program is located, and (3) if you happen to know anybody. Nobody wants to read that.” In the absence of official website information, it appeared that students often have relied on information recorded on Otomatch.com.
We recommend identifying a uniform set of criteria that should be included on every program website to provide more comprehensive information to applicants, not only for use in the final paragraph but also in narrowing their programs of interest earlier in the application process. The otolaryngology-focused website Headmirror.com provides OHNS residents and interested medical students a centralized resource for information, including subspecialty choices, tips on the match process, and educational links. This site currently provides direct links to all OHNS program websites. Such a website could offer a consistent resource for residency programs to ensure that the link to their own site was accurate and provides thorough and pertinent information for applicants.
The sharp increase in application numbers during the past 10 years has caused many in the OHNS community to consider the idea of placing a hard cap on the number of applications per applicant. As Naclerio et al5 observed in 2014, the current system lacks the ability to identify applicants who are genuinely interested. By limiting the number of applications to 20 per applicant, they argue that applicants will be forced to decide their preferred programs earlier in the application process. Chang and Erhardt4 also discussed the possibility of an application cap but argued that this would require a departure from ERAS to an independent match system and may also stifle diversity of applicants because many applicants will not be willing to stray far from programs of greatest familiarity.
The idea of an application cap is not exclusive to OHNS faculty or residents. Before even being queried on this subject (question 13), 43 of 98 respondents (43.8%) suggested a hard cap on applications. Although there does not appear to be a clear consensus on this hard limit, 70.7% of those who agreed to the idea of a cap suggested between 31 and 50 applications per applicant. If these limits were applied to the 2016 match, programs would have required 15.2 to 24.5 hours to review applications if each application was reviewed for 8 minutes. Put another way, by using such a cap, programs could review an application for a mean of 11.6 to 18.7 minutes, spending the same total amount of time that Chang and Erhardt4 suggested was spent in the 2015 match (35.9 hours). Thus, an application limit clearly provides an opportunity for programs to invest more time per application.

A few respondents to the survey also suggested a modification to the current system: a limited number of application flags allowed per applicant. By flagging programs during the application process, applicants can indicate programs of particular interest, helping programs differentiate between applicants with a sincere interest and those merely applying to a large number of programs. The option of flagging programs will also push applicants to make thoughtful decisions early in the application process. This option, however, would likely also require a departure from ERAS.

Another potential modification to the current system could capitalize on a wave application process. During an initial application window, students may be able to apply to a limited number of programs (eg, 20 or 30 programs). Programs would then receive these applications, which would be smaller in number, and make an initial set of interview offers. Then, an additional application window would open, with or without a cap on the number of programs.
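The cap arithmetic above can be checked with a quick back-of-the-envelope sketch. The inputs are assumptions drawn from the text (about 370 applicants, 100 programs, 8 minutes per application, and the 35.9-hour total from Chang and Erhardt4); the outputs land near, though not exactly on, the quoted figures, suggesting the original calculation used slightly different inputs.

```python
# Back-of-the-envelope check of the application-cap review-time arithmetic.
# Assumed inputs: ~370 applicants, ~100 programs, 8 minutes of review per
# application, and 35.9 hours of total review time per program.
APPLICANTS = 370
PROGRAMS = 100
REVIEW_MIN_PER_APP = 8
TOTAL_BUDGET_MIN = 35.9 * 60  # total review time per program, in minutes

for cap in (31, 50):  # the 31-to-50 cap range favored by respondents
    apps_per_program = cap * APPLICANTS / PROGRAMS
    hours_needed = apps_per_program * REVIEW_MIN_PER_APP / 60
    minutes_per_app = TOTAL_BUDGET_MIN / apps_per_program
    print(f"cap={cap}: ~{apps_per_program:.0f} applications/program, "
          f"{hours_needed:.1f} h at 8 min each, "
          f"or {minutes_per_app:.1f} min/application within 35.9 h")
```

This kind of check is only an estimate: it assumes applications distribute evenly across programs, which a cap would not guarantee.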
Conceivably, some students may be satisfied with the number of programs to which they have received interviews and would not submit any further applications during this time. Again, this may help applicants focus on those programs with which they have a genuine interest earlier in the process. It may also limit the overall number of applications that a given program receives. This opportunity is unlikely to be possible using the ERAS and NRMP but is a possibility with the SF Match (Dennis Thomatos, BS, SF Matching Program, oral communication, May 27, 2016).
Limitations
There are several limitations to this study. The survey instrument was developed from an informal literature review and was not validated before administration. In addition, the applicant response rate was only 40.5%, which, although fairly robust for a survey, did not capture most of the applicants. One possible cause of this limitation is reliance on programs and Otomatch.com for distribution of the survey, which may have unintentionally excluded students who did not use Otomatch.com as well as reapplicants. Finally, applicants who were strongly pleased or displeased with the application process or OPDO paragraph may have been more likely to complete the survey, limiting the generalizability of this study. However, the survey was closed before the 2016 match day, which may have helped to reduce this source of bias.
Conclusions
This is the first study in 25 years, to our knowledge, to survey applicants regarding their perspectives on the OHNS residency application process. Several main themes emerged from the data. Significant inconsistencies still exist in the advising of applicants regarding the number of programs to which they should apply, and careful consideration must be given to the effect of peer advisers on the residency application process. The OPDO paragraph requirement had a subtle effect on applications; however, there are obvious deficiencies in the program-specific information that is available. Development of a universal set of criteria to be provided on each residency program's website should be encouraged, and a specialty-focused website, such as Headmirror.com, could serve as an accessible and consistent point of reference for prospective applicants to locate this material. Alternative match platforms, which may allow hard caps, flagging, or wave application cycles, could also be investigated. The intent of this study was to identify opportunities for improvement of the OHNS application process by taking into account the perspective of the applicants. Numerous suggestions have been provided that may reduce the number of applications received by a given program, allowing more time for review of each application, with the ultimate goal of assisting the student and program in obtaining a match of the best fit.
Corresponding Author: Sarah N. Bowe, MD, Department of Otolaryngology–Head and Neck Surgery, Massachusetts Eye & Ear Infirmary, 243 Charles St, Boston, MA 02114 (email@example.com).
Accepted for Publication: February 19, 2017.
Published Online: May 25, 2017. doi:10.1001/jamaoto.2017.0231
Author Contributions: Drs Laury and Bowe had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: All authors.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: All authors.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Ward, Pingree.
Administrative, technical, or material support: Pingree, Laury.
Study supervision: Laury, Bowe.
Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.
Disclaimer: The views expressed herein are those of the authors and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the US Surgeon General, the US Department of the Army, the US Department of Defense, or the US government.
Additional Contributions: Mark Wax, MD, Oregon Health & Science University, Portland, provided his insights and input during the development of the survey. Justin Golub, MD, Columbia University Medical Center, New York, New York, and Sam Reyes, MD, Buffalo ENT Specialists LLP, Buffalo, New York, posted the survey link on Otomatch.com, which was incredibly helpful for additional recruitment.