Table 1. Proportion of Residents at Programs Affiliated With Their Medical School
Table 2. Proportion of Residents at Programs in the Same Region as Their Medical School vs a Different Region
Original Investigation
May 2015

An Evaluation of Geographic Trends in the Otolaryngology Residency Match: Home Is Where the Heart Is

Author Affiliations
  • 1Department of Otolaryngology–Head and Neck Surgery, Wayne State University School of Medicine, Detroit, Michigan
  • 2Section of Otolaryngology, Department of Surgery, John D. Dingell VA Medical Center, Detroit, Michigan
  • 3Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
JAMA Otolaryngol Head Neck Surg. 2015;141(5):424-428. doi:10.1001/jamaoto.2015.0219
Abstract

Importance  Securing an otolaryngology residency position has become an increasingly competitive endeavor in recent years. Recent studies have investigated the applicant criteria used by residency programs as part of the ranking process. However, to our knowledge, no studies have comprehensively investigated the role of geographic location in the match process.

Objective  To evaluate geographic trends in the otolaryngology national residency match process.

Design, Setting, and Participants  We conducted a cross-sectional examination of 56 otolaryngology residency programs including 810 residents to determine resident demographic information, including matriculated medical schools.

Main Outcomes and Measures  The geographic locations of residency programs and the residents’ matriculated medical schools were evaluated for trends. Residents’ program locations were compared with the locations of their medical schools of matriculation, and the numbers of residents attending a program affiliated with their medical schools were also identified.

Results  Overall, 810 residents were identified from the 56 programs included in our study. Of these, 169 residents (20.9%) attended the program affiliated with their medical school. The Midwest had the highest proportion of residents graduating from the affiliated medical school (25.7%), and the West had the lowest proportion (12.5%) (P = .008). A total of 473 residents attended a program within the same region as their medical school (58.4%). The South had the highest proportion of residents from the same region (68.2%), and the West had the lowest proportion (31.3%) (P < .001).

Conclusions and Relevance  While it is not clear why a geographic bias was identified, a significant proportion of residents in our study attended a program in the same region as their medical school. This geographic association was strongest in the Midwest and South. Furthermore, a significant proportion of residents attended the program affiliated with their medical school. This information is valuable to future applicants as they choose where to apply and to residency programs as they consider how geographic location factors into their interview decisions.

Introduction

The otolaryngology residency matching process has become increasingly competitive in recent years. In 2008, the first year matching data were made available by the National Resident Matching Program (NRMP), there were 105 programs in the country offering 273 first-year residency positions. That same year, 313 US medical school seniors applied, and only 253 matched into a program (80.8% match rate).1 In 2013, there were 107 programs with a total of 292 available residency positions: 387 US medical school seniors applied, and 276 matched into a residency program (71.3% match rate).2 Over this 5-year period, there has been a 23.6% increase in the number of US medical school seniors applying for residency in otolaryngology, with only a 7% increase in the number of residency positions.1,2
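For transparency, the figures cited above can be re-derived directly from the published NRMP counts. The short Python sketch below is our illustration, not part of the original analysis; it simply reproduces the arithmetic.

```python
# Re-derivation of the NRMP match statistics quoted in the text.
# All counts come from the cited NRMP reports (refs 1 and 2).

applicants = {2008: 313, 2013: 387}  # US seniors applying in otolaryngology
positions = {2008: 273, 2013: 292}   # first-year positions offered
matched = {2008: 253, 2013: 276}     # US seniors who matched

for year in (2008, 2013):
    print(f"{year}: match rate {100 * matched[year] / applicants[year]:.1f}%")
# 2008: match rate 80.8%
# 2013: match rate 71.3%

applicant_growth = 100 * (applicants[2013] - applicants[2008]) / applicants[2008]
position_growth = 100 * (positions[2013] - positions[2008]) / positions[2008]
print(f"Applicants: +{applicant_growth:.1f}%; positions: +{position_growth:.1f}%")
# Applicants: +23.6%; positions: +7.0%
```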

With this yearly increase in the number of applicants, several studies have sought to identify factors that may lead to successfully obtaining an interview and possibly matching at a program. Numerous evaluations have investigated whether any correlation exists between select applicant criteria and subsequent success as an otolaryngology resident.3-7 Fewer analyses have examined factors affecting decisions about whom to interview for residency positions. In a survey targeting otolaryngology program directors,8 the NRMP queried which specific criteria were utilized in selecting applicants to interview. Criteria reported to be significant included US Medical Licensing Examination scores and extracurricular activities, but no role of geographic location was noted. In fact, to our knowledge, there have been no studies focused on the role of geographic location in the otolaryngology residency application process.

The general surgery literature includes several investigations exploring the importance of geographic location in the match process.9,10 One study demonstrated that a significant proportion of applicants (24.6%) matched at a program affiliated with their medical school, a phenomenon that was especially pronounced in states with 2 or fewer medical schools.9 Another recent study in ophthalmology also revealed geographic bias within the match process, with a significant proportion of residents remaining in the same region as their medical schools.11 In light of these findings and the relative paucity of data relevant to otolaryngology applicants, the primary goal of the present analysis was to evaluate geographic trends among matriculated otolaryngology residents. Specifically, we examined whether residents tend to match at the program affiliated with their medical school or at a program within the same region as their medical school.

Methods

Data collection for this cross-sectional study was completed in March 2014, covering the resident classes at all programs for the 2013-2014 academic year. FREIDA online (http://www.ama-assn.org/ama/pub/education-careers/graduate-medical-education/freida-online.page), a residency and fellowship database sponsored by the American Medical Association, was used to identify current otolaryngology residency programs. The departmental websites were then accessed through the FREIDA online system or, if not available, through an internet search. This study qualified as nonhuman subject research and was thus exempted from institutional review board (IRB) approval per the standing policy of the Rutgers New Jersey Medical School IRB.

From this database, 106 otolaryngology residency programs were identified, 56 of which were included in the study. The inclusion criteria were (1) availability of a roster of the current otolaryngology residents and (2) a listing of the medical schools the residents attended. Of the 50 programs excluded, 8 had no affiliation with a medical school, while the other 42 were missing either a roster of current residents (n = 13) or the medical schools the residents attended (n = 29). Research fellowships and other fellowship positions were not included in the data. If a program had an incomplete listing of the medical schools attended, a Google search was performed to identify the medical school of each resident not listed on the program website; if this information could not be found, the program was not utilized in the study. The medical school affiliation for each program was obtained through the FREIDA online database or through departmental websites. Additionally, the number of medical schools within each program's state was determined through the Association of American Medical Colleges Electronic Residency Application Service (ERAS) website (https://www.aamc.org/services/eras/).

For the 56 programs included, the region of the program location was identified through the FREIDA online database (New England, Mid Atlantic, East North Central, West North Central, South Atlantic, East South Central, West South Central, Mountain, and Pacific). These subregions were then grouped together into the regions as defined by the United States Census Bureau (Northeast, Midwest, South, and West). The same categories were applied to the associated medical schools. This led to the following data being collected: (1) residency programs, (2) regions of residency programs, (3) numbers of current residents, (4) numbers of residents from the affiliated medical schools, and (5) numbers of residents from schools in all regions.
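Conceptually, this grouping is a simple lookup from the nine FREIDA subregions to the four Census regions, applied to both programs and medical schools. A minimal sketch follows; it reflects the published US Census Bureau mapping, not the authors' actual tooling.

```python
# Mapping of the nine FREIDA/Census subregions to the four US Census
# Bureau regions used throughout the analysis.
SUBREGION_TO_REGION = {
    "New England": "Northeast",
    "Mid Atlantic": "Northeast",
    "East North Central": "Midwest",
    "West North Central": "Midwest",
    "South Atlantic": "South",
    "East South Central": "South",
    "West South Central": "South",
    "Mountain": "West",
    "Pacific": "West",
}

def census_region(subregion: str) -> str:
    """Return the Census region for a given FREIDA subregion."""
    return SUBREGION_TO_REGION[subregion]

# Applied to both a program's location and a resident's medical school, e.g.:
assert census_region("Mid Atlantic") == "Northeast"
```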

Mann-Whitney U and Fisher exact tests were used for comparisons of continuous and categorical data, respectively. The Pearson correlation coefficient was used to assess the strength of linear relationships. The threshold for statistical significance was set at P < .05, and SPSS software, version 20 (IBM Corp), was used for all statistical analyses.
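For readers wishing to reproduce comparable analyses outside SPSS, the sketch below shows equivalent SciPy calls for the three tests named above. All input values are hypothetical placeholders, not study data.

```python
from scipy.stats import fisher_exact, mannwhitneyu, pearsonr

# Mann-Whitney U: continuous outcome between two groups
# (e.g., per-program counts of "home" residents; hypothetical values).
u_stat, p_mw = mannwhitneyu([1, 3, 4, 2, 5], [2, 2, 6, 1, 3],
                            alternative="two-sided")

# Fisher exact: 2x2 table of categorical counts
# (e.g., home vs non-home residents in two program groups; hypothetical).
odds_ratio, p_fisher = fisher_exact([[30, 120],
                                     [20, 140]])

# Pearson correlation: linear association
# (e.g., program size vs number of home residents; hypothetical).
r, p_r = pearsonr([10, 12, 15, 18, 20], [2, 3, 2, 4, 3])

print(f"Mann-Whitney P={p_mw:.2f}; Fisher P={p_fisher:.2f}; R^2={r**2:.3f}")
```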

Results

The 56 otolaryngology residency programs in the study included a total of 810 residents; 169 of these residents (20.9%) graduated from their residency program's affiliated medical school. The mean (SD) percentage of "home" residents per residency program was 14.5% (13.2%), with a mean (SD) of 3.0 (2.1) residents graduating from the affiliated medical school per program. No correlation was found between residency program size and the number of residents who graduated from the program's affiliated medical school (R² = 0.005).

Programs were further organized into 2 additional categories: (1) those in states with 2 or fewer medical schools and (2) those in states with 3 or more medical schools. Fourteen residency programs were in states with 2 or fewer medical schools, while the other 42 were in states with 3 or more. The numbers of residents who graduated from the programs' affiliated medical schools and those who graduated from other schools were compared between the 2 groups: 21.6% of residents in states with 3 or more medical schools graduated from their program's affiliated medical school, compared with 18.5% of residents in states with 2 or fewer medical schools (P = .41).

Programs were also organized by their regional locations. There were 8 programs located in the West with a total of 112 residents, 12 in the Midwest with 194 residents, 21 in the South with 289 residents, and 15 in the Northeast with 215 residents. The programs in the Midwest had the greatest proportion of residents graduating from their affiliated medical schools (25.7%), and the West had the lowest (12.5%) (P = .01). Direct comparisons among the other regions revealed no statistically significant differences (P = .09 to P = .63). When the regions were compared by the numbers of residents who graduated from a program's affiliated medical school vs another school, significant differences were found between the Midwest and West (P = .01) and between the South and West (P = .046) (Table 1).
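The Midwest-vs-West comparison can be approximated from the reported totals and proportions. The sketch below back-calculates the implied 2 × 2 table and reruns a Fisher exact test; the rounded counts are our reconstruction, not the authors' raw data, so the resulting P value is only expected to be of the same order as the one reported.

```python
from scipy.stats import fisher_exact

# Counts implied by the reported figures: 194 Midwest residents, 25.7% from
# the affiliated ("home") school; 112 West residents, 12.5% from home.
midwest_home = round(0.257 * 194)  # ~50 residents
west_home = round(0.125 * 112)     # 14 residents

table = [[midwest_home, 194 - midwest_home],
         [west_home, 112 - west_home]]
odds_ratio, p_value = fisher_exact(table)
print(f"P = {p_value:.3f}")  # on the order of .01, consistent with the text
```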

The medical schools attended by residents were categorized by geographic region as well. When the regional location of each program was compared with the regional locations of its residents' medical schools, the programs in the South had the highest percentage of residents who graduated from a medical school in the same region (68.2%), and the West had the lowest (31.3%) (Table 2). Comparisons between the regions of the program locations and the numbers of residents who attended medical school in the same region were performed. Statistically significant differences were found between the Midwest and West (P < .001), South and Northeast (P = .003), South and West (P < .001), and Northeast and West (P < .001).

Discussion

In 2013, the NRMP conducted a residency applicant survey to assess which factors contributed to applicants' decisions about where to apply for residency and how to construct their rank lists.12 This information was collected for every specialty that participates in the NRMP. Within otolaryngology, 92% of the US medical school seniors responding reported that geographic location was a factor in determining where to apply. This percentage was higher than that for the reputation of the program (90%), the quality of faculty (85%), and even preparation for fellowship training (67%).12 Similar results were obtained for factors influential in the ranking process, with 85% of US seniors reporting geographic location as a factor. This percentage was just below that for the quality of the faculty and reputation of the program (86%) but was reported more frequently than any other factor in the study.

It is clear that geographic location plays a major role in determining where prospective residents interview and how a rank list is ultimately formed. This information highlights the importance of our analysis, especially combined with the data from general surgery residency programs revealing that an average of 24.6% of general surgery residents end up at the program associated with their medical school.9,10 To our knowledge, the present study is the first analysis specific to otolaryngology assessing how geographic location affects where applicants will perform their residency. Our study suggests that a significant proportion of residents within each program graduated from their program's affiliated medical school.

We found no statistically significant difference in the number of residents who graduated from their program’s affiliated medical school when programs were organized by the number of medical schools within each state (P = .41). This differs from findings described in general surgery,9 where a greater average number of residents were found to have come from an affiliated program in those states with 2 or fewer medical schools per state. This is likely owing to the greater number of general surgery residency programs within each state compared with the number of otolaryngology residency positions. In the study of general surgery residency programs,9 no comparisons were made between different regions to assess for statistically significant differences in the percentages of residents who graduated from their affiliated medical schools within each region.

Evaluating residency programs by the regions in which the medical schools and programs were located, we found that a significant proportion of the residents attended schools in the same region as their current program (31.3%-68.2%) (Table 2). Four of the 6 comparisons between regions revealed statistically significant differences between the number of residents from a medical school in that region and those who were from another region. It is unknown whether this geographic bias is due to residency applicant selection of schools or if there is a bias in the residency programs themselves.

While this analysis reveals statistically significant differences in the numbers of residents who graduated from medical school within the same region, several limitations should be emphasized. The primary limitation is that this is an internet-based study: only 56 programs could be included, which highlights the importance of each program maintaining a complete and thorough website to inform future applicants. With 50 programs excluded from our analysis, the total number of residents in the study was reduced, and the number of programs available in certain regions was limited.

Another limitation is that the source of the geographic bias could not be determined. Are residency programs interviewing more applicants within the same region, potentially believing that those applicants would be more likely to rank their program highly? Are the applicants more likely to interview with and rank programs within the same region based on location alone? Answers to these questions would potentially help medical students and residency program directors during the application process. For example, if residency programs are interviewing applicants from only certain regions, there would be a substantial cost benefit to the applicant to apply only to programs within the same region as his or her medical school.

Finally, there are several confounding variables. The applicant survey revealed several factors that play a role in determining where to apply and how programs will be ranked, including (but not limited to) perceived quality of faculty, house staff morale, and the importance of an away rotation.12 These likely affect the process in several ways; however, at this point it is uncertain how much these factors influence decisions and the subsequent data in our analysis.

Conclusions

As the otolaryngology residency application process has become increasingly competitive, assessing which factors play a role in determining where applicants ultimately end up for residency is invaluable for individual applicants and programs alike. The geographic associations reported herein provide a critical piece of information for future otolaryngology applicants. Overall, a majority of residents (58.4%) attended a program within the same region as their medical school. It is unclear whether this trend reflects applicants selectively applying to certain regions or programs regionally limiting their interviewees. Although a survey asking program directors, residency selection committees, and applicants additional detailed questions would be invaluable in exploring this issue, further analysis of future NRMP ranking data may also provide insight into the extent to which a "home field advantage" comes into play. The information presented herein is nonetheless helpful for both prospective residents and residency programs: equipped with these findings, applicants might sharpen their focus and reduce the number of applications they send, and residency program administrators can better determine which applicants are truly interested in attending their program.

Article Information

Submitted for Publication: December 13, 2014; final revision received December 22, 2014; accepted December 28, 2014.

Corresponding Author: Andrew P. Johnson, MD, Department of Otolaryngology–Head and Neck Surgery, Wayne State University School of Medicine, 4201 St Antoine, 5E-UHC, Detroit, MI 48201 (apjohnso@med.wayne.edu).

Published Online: March 12, 2015. doi:10.1001/jamaoto.2015.0219.

Author Contributions: Drs Johnson and Svider had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Johnson, Svider, Folbe, Shkoukani, Eloy, Zuliani.

Acquisition, analysis, or interpretation of data: Johnson, Svider, Raza.

Drafting of the manuscript: Johnson, Svider.

Critical revision of the manuscript for important intellectual content: Johnson, Svider, Folbe, Raza, Shkoukani, Eloy, Zuliani.

Statistical analysis: Johnson, Svider.

Obtained funding: Svider.

Administrative, technical, or material support: Svider, Eloy.

Study supervision: Svider, Folbe, Raza, Shkoukani, Eloy, Zuliani.

Conflict of Interest Disclosures: None reported.

References
1. National Resident Matching Program. Results and Data, 2008: Main Residency Match®. Washington, DC: National Resident Matching Program; 2008.
2. National Resident Matching Program. Results and Data, 2013: Main Residency Match®. Washington, DC: National Resident Matching Program; 2013.
3. Tang CG, Hilsinger RL Jr, Cruz RM, Schloegel LJ, Byl FM Jr, Rasgon BM. Manual dexterity aptitude testing: a soap carving study. JAMA Otolaryngol Head Neck Surg. 2014;140(3):243-249.
4. Chole RA, Ogden MA. Predictors of future success in otolaryngology residency applicants. Arch Otolaryngol Head Neck Surg. 2012;138(8):707-712.
5. Bent JP, Colley PM, Zahtz GD, et al. Otolaryngology resident selection: do rank lists matter? Otolaryngol Head Neck Surg. 2011;144(4):537-541.
6. Prager JD, Myer CM IV, Hayes KM, Myer CM III, Pensak ML. Improving methods of resident selection. Laryngoscope. 2010;120(12):2391-2398.
7. Daly KA, Levine SC, Adams GL. Predictors for resident success in otolaryngology. J Am Coll Surg. 2006;202(4):649-654.
8. National Resident Matching Program, Data Release and Research Committee. Results of the 2012 NRMP Program Director Survey. Washington, DC: National Resident Matching Program; 2012.
9. Falcone JL. Home-field advantage: the role of selection bias in the general surgery national residency matching program. J Surg Educ. 2013;70(4):461-465.
10. Makdisi G, Takeuchi T, Rodriguez J, Rucinski J, Wise L. How we select our residents: a survey of selection criteria in general surgery residents. J Surg Educ. 2011;68(1):67-72.
11. Driver TH, Loh AR, Joseph D, Keenan JD, Naseri A. Predictors of matching in ophthalmology residency for international medical graduates. Ophthalmology. 2014;121(4):974-975.e2.
12. National Resident Matching Program, Data Release and Research Committee. Results of the 2013 NRMP Applicant Survey by Preferred Specialty and Applicant Type. Washington, DC: National Resident Matching Program; 2013.