Key Points
Question
How common are unverifiable publications among ophthalmology residency applications?
Findings
In this cross-sectional study of 322 applicants invited to interview for the ophthalmology residency program at Vanderbilt University School of Medicine for entering classes 2012 to 2017, 7% had an unverifiable publication. Of students listing any full-length publications, 9% had at least 1 unverifiable scholarly work.
Meaning
Unverifiable publications are not rare among ophthalmology residency applications; modifying the San Francisco Match application may help ensure that the most ethical ophthalmology residents are recruited.
Importance
Unverifiable publications in applications for ophthalmology residencies could be a serious concern if they represent publication dishonesty.
Objective
To determine the rate of unverifiable publications among applicants offered an interview.
Design
Retrospective review of 322 ophthalmology residency applications for entering classes 2012 to 2017 at Vanderbilt University School of Medicine, Nashville, Tennessee.
Interventions
Full-length publications reported in the applications were searched in PubMed, Google, Google Scholar, and directly on the journal’s website. Publications were deemed unverifiable if there was no record of the publication by any of these means or if substantial discrepancies existed, such as incorrect authorship, incorrect journal, or a meaningful discrepancy in title or length (full-length article vs abstract).
Main Outcomes and Measures
Inability to locate publication with search, incorrect author position, applicant not listed as an author, article being an abstract and not a published paper, substantial title discrepancy suggesting an alternative project, and incorrect journal.
Results
Of the 322 applicants offered interviews during the 6-year study period, 22 (6.8%) had 24 unverifiable publications. Two hundred thirty-nine of these applicants (74.2%) reported at least 1 qualifying publication; of this group, 22 (9.2%) had an unverifiable publication. The applications with unverifiable publications were evenly distributed across the years of the study (range, 2-6 per cycle; Pearson χ²₅ = 3.65; P = .60). Two applicants had 2 unverifiable publications each. Two of the 22 applicants (9.1%) with unverifiable publications were graduates of medical schools outside the United States. Among the unverifiable publications, the most common reason was inability to locate the publication (13 [54%]). Additional issues included abstract rather than full-length publication (5 [20.8%]), incorrect author position (4 [16.7%]), applicant not listed as an author on the publication (1 [4.2%]), and substantial title discrepancy (1 [4.2%]). One listed publication had an incorrect author position and incorrect journal (1 [4.2%]).
Conclusions and Relevance
Unverifiable publications among ophthalmology residency applicants are a persistent problem. Possible strategies to modify the review process include asking applicants to provide copies of their full-length works or the relevant PMCID (PubMed Central reference number) or DOI (digital object identifier) for their publications.
Misrepresentation of self-reported scholarly work at the residency and fellowship application level has been documented in the literature for more than 20 years and across numerous medical subspecialties. Previous articles have reported widely divergent results.1-18 In one of the earliest studies exploring the misrepresentation of academic accomplishments, Sekas and Hutson15 in 1995 reported that research activity could not be confirmed in 47 of 138 applications (34.1%) to a gastroenterology fellowship program.15 Subsequent investigations of misrepresentation of academic publications in neurosurgery,1,4 otolaryngology,2 plastic surgery,3,13 pediatric surgery,6 orthopedic surgery,5,11 emergency medicine,7,14 internal medicine,8 urology,9 pathology,10 dermatology,12 pain medicine,16 and ophthalmology17 have reported rates from as low as 1.9% in ophthalmology applications submitted from 2000 to 200417 to as high as 45% in neurosurgery applications submitted in 2012.1
In a previous study of verification of publications of ophthalmology residency applicants, Wiggins17 reviewed all 821 applications to a single ophthalmology residency program across 5 application cycles from 2000 to 2004. That study found a 1.9% rate of unverifiable publications; however, among applicants who reported at least 1 publication, the rate of unverifiable publications was 8.1%. Because this outcome could be a serious concern if these unverifiable works represent publication dishonesty, we sought to determine the current rate of unverifiable scholarly work among ophthalmology residency applicants. We studied applications to the ophthalmology residency program at our institution, the Vanderbilt Eye Institute, Vanderbilt University Medical Center, in Nashville, Tennessee. To avoid introducing bias into the percentage of unverifiable publications across the entire applicant pool, we elected to limit our examination to applicants invited to interview, who represent the physicians most likely to ultimately practice ophthalmology.
We selected applicants who submitted a San Francisco Match (SF Match) application to our ophthalmology residency program for entering class years 2012 to 2017 and who were subsequently invited to interview. Approval for this study was obtained through the Vanderbilt University Institutional Review Board, which also waived the need to obtain informed consent from the applicants. Data were deidentified.
Eligible applications were taken from the application years for the entering postgraduate year 2 ophthalmology residency classes 2012 to 2017. For the 6 application cycles available, 3013 records were present in the archived database. From the 3013 applications, 322 candidates were offered an interview; these applications were examined for accuracy in reporting of publications.
All 322 applications were read completely. A standardized data collection form was used to record the data from the applications. The metrics recorded included year of application, sex, medical school, US News and World Report Medical School Rank for 2016,19 undergraduate major, undergraduate grade point average, undergraduate institution, advanced degrees other than MD (eg, PhD, MPH, MBA, or MS), US Medical Licensing Examination (USMLE) Step 1 score, Alpha Omega Alpha (AOA) status, medical school class rank (if provided), and number of full-length scholarly works listed as “published,” “accepted,” or “in press.” Publications listed were searched in PubMed, Google, Google Scholar, and via a literature search directly on the journal’s website. Small discrepancies, such as incorrect page numbers or journal abbreviations, were not considered to represent dishonesty. Two observers, 1 ophthalmology resident (H.M.T.) and 1 medical student (R.T.), independently extracted data and reviewed qualifying publications from each application.
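The verification rules described above can be sketched as a small decision function. This is a hypothetical illustration only: the function name, field names, and record structure are invented, and the study performed these checks manually rather than with software.

```python
# Hypothetical sketch of the verification rules described above.
# A claimed citation is compared with the best-matching record found
# in PubMed/Google/Google Scholar (None if nothing was found).

def classify_publication(claimed, found):
    """Return 'verified' or the reason the claimed publication is unverifiable."""
    if found is None:
        return "unable to locate"
    if claimed["author"] not in found["authors"]:
        return "applicant not an author"
    if found["type"] == "abstract" and claimed["type"] == "full-length":
        return "abstract, not full-length publication"
    if found["authors"].index(claimed["author"]) != claimed["position"] - 1:
        return "incorrect author position"
    if found["journal"].lower() != claimed["journal"].lower():
        return "incorrect journal"
    # Minor typos (page numbers, journal abbreviations) are ignored,
    # mirroring the study's decision not to count small discrepancies.
    return "verified"
```

A claim of first authorship on a paper where the applicant is listed second, for example, would be flagged as "incorrect author position".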
The statistical analysis of the data was performed with R statistical software (version 3.3.2; R Foundation for Statistical Computing).20 Primary end points included any published works and unverifiable scholarly works. Descriptive statistics for continuous variables were represented as median (interquartile range), and categorical variables were represented as percentages. Categorical data were compared using the Pearson χ² test, and continuous data were analyzed using the Wilcoxon rank sum test. Statistical significance was set a priori at P < .05. All P values are 2-sided unless otherwise stated.
Logistic regression using the covariates of sex, advanced degree in addition to MD, and the USMLE Step 1 score was performed to investigate the likelihood of an applicant having any published works. Among applicants with at least 1 published work, a similar logistic regression was used to identify any associations between these factors and the likelihood of identifying unverifiable publications. A Fisher exact test was used to determine if the rates of unverifiable publications were different for graduates of medical schools outside the United States compared with US medical students. The nonlinear association of USMLE Step 1 score with the likelihood of an applicant producing any published works was modeled via a restricted cubic spline with 4 df.
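Although the study's analysis was run in R, the two categorical tests named above can be illustrated in a few lines of Python with SciPy. The per-year counts and the 2×2 table below are invented for demonstration and are not the study's actual data.

```python
# Illustrative sketch of the study's categorical tests (made-up counts).
from scipy.stats import chisquare, fisher_exact

# Pearson chi-square goodness-of-fit: are unverifiable publications
# evenly distributed across the 6 application cycles?
per_year = [2, 3, 4, 6, 4, 3]        # hypothetical counts summing to 22
chi2, p = chisquare(per_year)        # df = 6 - 1 = 5
print(f"chi2(5) = {chi2:.2f}, P = {p:.2f}")

# Fisher exact test: unverifiable-publication rate for graduates of
# non-US vs US medical schools
# (rows: non-US / US; columns: unverifiable / verified).
table = [[2, 20], [20, 197]]         # hypothetical 2x2 counts
odds_ratio, p_fisher = fisher_exact(table)
print(f"Fisher exact P = {p_fisher:.2f}")
```

With similar row proportions, as reported in the study, the Fisher exact P value is large, consistent with no detectable difference between the groups.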
Between October 2010 and October 2015, a total of 3013 applications to our ophthalmology residency program were received. The data from 2691 applications were excluded because these applicants were not offered an interview at our academic institution. Three hundred twenty-two applications of the candidates invited to interview were studied. A mean of 53 applicants (range, 49-57) were interviewed in each of the 6 application cycles studied. Of the 322 candidates, 239 (74.2%) reported at least 1 full-length published or accepted scholarly work, and several reported multiple publications, whereas 83 (26%) did not. Twenty-two of the 322 applicants (6.8%) who were offered interviews had 24 unverifiable publications. Of the 239 candidates listing at least 1 full-length publication, 22 applicants (9.2%) had unverifiable publications. The applications with unverifiable publications were evenly distributed across the years of the study (range, 2-6 per cycle; Pearson χ²₅ = 3.65; P = .60).
The SF Match application did not include key identifiers, such as age and race. The demographic characteristics of the applicants are shown in the Table. The 322 candidates comprised 181 (56%) men and 141 (44%) women. Of the 22 applicants with unverifiable publications, 13 (59%) were male. Two applicants (9%) with unverifiable publications were graduates of medical schools outside the United States; this rate was not different from the rate among US medical school graduates (9%; P > .99). Most medical students invited to interview had an undergraduate degree and were expected to obtain a medical degree. Ninety-four applicants (29.2%) invited to interview had an additional advanced degree other than an MD, and those students were more likely to have published scholarly works (odds ratio, 2.74; 95% CI, 1.38-5.45; P = .004) (Figure 1). Among applicants with unverifiable publications, 4 (18.2%) had additional degrees (Pearson χ²₁ = 2.8; P = .09). The mean USMLE Step 1 score of applicants invited to interview was 244 (range, 188-273). Of those with unverifiable publications, the mean score was 242 (range, 188-258; F₁,₂₃₇ = 0; P = .94). Applicants with USMLE Step 1 scores closest to the mean were more likely to publish compared with those candidates with higher or lower scores. One hundred seventy-eight of the invited applicants (55.3%) reported AOA status on their application. Of these students, 61 (34%) had been awarded AOA membership. Four of the 11 candidates with unverifiable publications and verified AOA status (36.4%) were elected members (Pearson χ²₁ = 0.26; P = .61). In summary, the presence of an advanced degree other than an MD and the USMLE Step 1 score were significantly associated with having any published scholarly work. None of the covariates, including application year, sex, an additional advanced degree, the USMLE Step 1 score, or AOA status, was significantly associated with an applicant producing unverifiable publications (Figure 2).
Types of Unverifiable Publications
The types of unverifiable publications are summarized in Figure 3. Two applicants had 2 unverifiable publications each. The most common reason was our inability to locate the publication (13 of 24 publications [54%]). Additional issues included the work being an abstract rather than a full-length publication (5 [20.8%]), incorrect author position (4 [16.7%]), applicant not listed as an author at all on the publication in question (1 [4.2%]), and substantial title discrepancy such that a different project was suggested by the name (1 [4.2%]). One listed publication (4.2%) had an incorrect author position (not first author) and was also listed in an incorrect journal.
Previously reported rates of unverifiable publications diverge widely across multiple specialties.1-18 Rates as high as 45% among neurosurgery residency applicants1 in 2012 and as low as 1.9% among ophthalmology applicants17 from 2000 to 2004 have been reported. Our study endeavored to examine the prevalence of unverifiable publications among future physicians most likely to practice ophthalmology—those invited to interview rather than the entire applicant pool—in an effort to increase relevance in our field, which interviews a small proportion of total applicants.
Our observation that 6.8% of all interviewed ophthalmology residency applicants for entering classes 2012 to 2017 had at least 1 unverifiable publication on their applications is largely consistent with the disparate previous reports among multiple medical specialties; however, it is higher than the Wiggins report17 of 1.9% among all ophthalmology resident applicants to his institution from 2000 to 2004. Wiggins found a rate of unverifiable publications of 8.1% among applicants who reported at least 1 published scholarly work; our rate for the same metric was remarkably similar at 9.2%. The major difference between the 2 studies is that, in the Wiggins cohort, 24.5% of applicants reported at least 1 publication, whereas in our study, 74.2% of applicants had published.
Consistent with the similarity between our rates of unverifiable publications among applicants reporting published scholarly work and those of the Wiggins cohort from 2000 to 2004, our study showed no change over time in rates of unverifiable publications for entering classes 2012 to 2017. This finding suggests relatively stable rates of unverifiable publications specific to the field of ophthalmology, at around 8% to 9% among applicants reporting at least 1 published scholarly work, for the past 17 years. The absolute number of applicants with unverifiable citations has risen in concert with more applicants reporting publications.
Although it is somewhat reassuring that the rate of unverifiable publications has not increased over time, it is still alarming that this issue persists in our field. Although we can only speculate as to the motives of the applicants in our study, previous studies have offered some ideas. Maverakis et al12 published on the widespread belief that it is an advantage for an applicant to have multiple scholarly works; anecdotally, we also believe this to be true. Medical students at our institution who are interested in pursuing a career in ophthalmology are strongly encouraged to become involved with ophthalmology research to strengthen their applications in addition to their knowledge. It is also possible that students perceive small falsifications or embellishments to curricula vitae to be a widespread practice; if this is the case, they may believe that they are at a disadvantage if they do not also make their resumes appear as competitive as possible, even to the detriment of the truth.
Previous authors have postulated that some discrepancies in reporting could be accounted for by rigid categories on the Electronic Residency Application Service application. If applicants did not understand terminology such as “peer reviewed,” they could inadvertently categorize their work incorrectly in an honest mistake.1 This possibility is inapplicable to the field of ophthalmology, which instead relies on the SF Match. The SF Match application lists a section entitled “Research activities, papers, and/or additional information” and relies on self-report. Therefore, it is difficult to imagine how an applicant could unknowingly miscategorize his or her work.
Our study had several limitations. It was retrospective in design. Demographic information was limited: the SF Match application did not include key identifiers such as age and race. Other potentially important variables, such as AOA status and medical school class rank, were not reported by most applicants. Both of these variables were also self-reported, as were scholarly works, leaving open the possibility of dishonesty in these parts of the application as well. Using the variables that we were able to reliably identify, we were not able to establish any correlation between applicant characteristics and the likelihood of unverifiable publications. It is possible that associations do exist, but there were too few instances of unverifiable publications in our study to detect such a relationship. We were able to show that an advanced degree and the USMLE Step 1 score were significantly related to an applicant producing any published works.
We chose to scrutinize only scholarly work listed as “published,” “accepted,” or “in press,” as these works should be readily accessible provided that the citation is accurate. An applicant could have been confused by this nomenclature and mistakenly classified his or her work as accepted when in fact it was accepted with revisions and as such would not yet be accessible via our search techniques. We did not scrutinize the unverifiable publications again at a later time, although this will be an area of further investigation. We did not consider small misspellings or inaccuracies such as page-number errors to represent dishonesty. In the small number of these minimal errors, the works in question were easily identified through our search methods. Although such carelessness is not ideal in a future resident, we did not believe that these errors warranted categorization as unverifiable. Finally, it is possible that the pool of interviewed applicants at our institution is not generalizable to all ophthalmology residency programs.
Our study underscores the persistent problem of unverifiable publications among applicants to ophthalmology residency programs. Although unverifiable publications cannot be assumed to represent academic dishonesty without due process, it is difficult to imagine scenarios in which these works would be unverifiable in the age of readily searchable databases, especially because all works in question were self-reported on the candidates’ applications. This situation is particularly concerning given that trustworthiness, a sound ethical background, and attention to detail are considered by most to be prerequisite to a career in medicine. It is imperative that physicians mentoring our medical students and residents uphold these values. Based on our findings, the Vanderbilt Eye Institute will be altering its ophthalmology applicant review process. We are considering asking applicants to bring copies of their published work with them to the interview or asking the applicants to provide the PMCID (PubMed Central reference number) or DOI (digital object identifier) of their scholarly work in a brief supplemental application. These strategies aim to resolve any honest discrepancies on the candidate’s application in a timely fashion and would also remove the burden of verifying all cited works from our staff. We also advocate for the SF Match to modify the “Research activities, papers, and/or additional information” section of the application to strongly recommend or require that applicants supply these identifiers along with their full-length scholarly works. This change would benefit all ophthalmology programs and help address the problem of unverifiable publications before inviting the candidate for an interview. These changes will help ensure that we recruit to our field not only the most intelligent, hard-working future physicians but also those with a sound ethical base.
Accepted for Publication: February 17, 2018.
Corresponding Author: Louise A. Mawn, MD, Vanderbilt Eye Institute, Vanderbilt University Medical Center, 2311 Pierce Ave, Nashville, TN 37232-8808 (louise.a.mawn@vanderbilt.edu).
Published Online: April 19, 2018. doi:10.1001/jamaophthalmol.2018.0846
Author Contributions: Drs Tamez and Brown had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Tamez, Wayman, Mawn.
Acquisition, analysis, or interpretation of data: Tamez, Tauscher, Brown.
Drafting of the manuscript: Tamez, Brown, Mawn.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Tamez, Tauscher, Brown.
Administrative, technical, or material support: Wayman, Mawn.
Study supervision: Mawn.
Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.
Funding/Support: Supported in part by a Physician Scientist Award (Dr Mawn) and an unrestricted grant to the Vanderbilt Eye Institute from Research to Prevent Blindness.
Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
References
5. Dale JA, Schmitt CM, Crosby LA. Misrepresentation of research criteria by orthopaedic residency applicants. J Bone Joint Surg Am. 1999;81(12):1679-1681.
11. Konstantakos EK, Laughlin RT, Markert RJ, Crosby LA. Follow-up on misrepresentation of research activity by orthopaedic residency applicants: has anything changed? J Bone Joint Surg Am. 2007;89(9):2084-2088. doi:10.2106/JBJS.G.00567
12. Maverakis E, Li CS, Alikhan A, Lin TC, Idriss N, Armstrong AW. The effect of academic “misrepresentation” on residency match outcomes. Dermatol Online J. 2012;18(1):1.
14. Roellig MS, Katz ED. Inaccuracies on applications for emergency medicine residency training. Acad Emerg Med. 2004;11(9):992-994.
15. Sekas G, Hutson WR. Misrepresentation of academic accomplishments by applicants for gastroenterology fellowships. Ann Intern Med. 1995;123(1):38-41.
20. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2013. http://www.R-project.org/. Accessed December 21, 2016.