Figure 1.
Likelihood of Reported Publications Among Ophthalmology Residency Applicants Based on US Medical Licensing Examination (USMLE) Step 1 Score, Sex, and Advanced Degree Other Than MD

Odds ratios (ORs) for the likelihood of an applicant reporting any full-length scholarly works are statistically significant for the presence of an additional advanced degree (OR, 2.74; 95% CI, 1.38-5.45; P = .004). The 3 bars represent 90%, 95%, and 99% CIs.

Figure 2.
Likelihood of Unverifiable Publications Among Ophthalmology Residency Applicants With Reported Publications

Odds ratios (ORs) for the likelihood of an applicant reporting unverifiable scholarly works. The 3 bars represent 90%, 95%, and 99% CIs. USMLE indicates US Medical Licensing Examination.

Figure 3.
Types of Unverifiable Publications Among Ophthalmology Residency Applicants

The absolute number of occurrences of each type of error among the 24 unverifiable publications.

Table.  
Demographic and Academic Characteristics of 322 Applicants to Vanderbilt University School of Medicine Ophthalmology Residency Program
Original Investigation
June 2018

Rate of Unverifiable Publications Among Ophthalmology Residency Applicants Invited to Interview

Author Affiliations
  • 1Vanderbilt Eye Institute, Vanderbilt University Medical Center, Nashville, Tennessee
  • 2Medical student, Vanderbilt University School of Medicine, Nashville, Tennessee
JAMA Ophthalmol. 2018;136(6):630-635. doi:10.1001/jamaophthalmol.2018.0846
Key Points

Question  How common are unverifiable publications among ophthalmology residency applications?

Findings  In this cross-sectional study of 322 applicants invited to interview for the ophthalmology residency program at Vanderbilt University School of Medicine for entering classes 2012 to 2017, 7% had an unverifiable publication. Of students listing any full-length publications, 9% had at least 1 unverifiable scholarly work.

Meaning  Unverifiable publications are not rare among ophthalmology residency applications; modifying the San Francisco Match application may help ensure that the most ethical ophthalmology residents are recruited.

Abstract

Importance  Unverifiable publications in applications for ophthalmology residencies could be a serious concern if they represent publication dishonesty.

Objective  To determine the rate of unverifiable publications among applicants offered an interview.

Design  Retrospective review of 322 ophthalmology residency applications for entering classes 2012 to 2017 at Vanderbilt University School of Medicine, Nashville, Tennessee.

Interventions  Full-length publications reported in the applications were searched in PubMed, Google, Google Scholar, and directly on the journal’s website. A publication was deemed unverifiable if there was no record of it by any of these means or if substantial discrepancies existed, such as incorrect authorship, incorrect journal, or a meaningful discrepancy in title or length (full-length article vs abstract).

Main Outcomes and Measures  Inability to locate publication with search, incorrect author position, applicant not listed as an author, article being an abstract and not a published paper, substantial title discrepancy suggesting an alternative project, and incorrect journal.

Results  Of the 322 applicants offered interviews during the 6-year study period, 22 (6.8%) had 24 unverifiable publications. Two hundred thirty-nine of these applicants (74.2%) reported at least 1 qualifying publication; of this group, 22 (9.2%) had an unverifiable publication. The applications with unverifiable publications were evenly distributed across the years of the study (range, 2-6 per cycle; Pearson χ²₅ = 3.65; P = .60). Two applicants had 2 unverifiable publications each. Two of the 22 applicants (9.1%) with unverifiable publications were graduates of medical schools outside the United States. Among the unverifiable publications, the most common reason was inability to locate the publication (13 [54%]). Additional issues included abstract rather than full-length publication (5 [20.8%]), incorrect author position (4 [16.7%]), applicant not listed as an author on the publication (1 [4.2%]), and substantial title discrepancy (1 [4.2%]). One listed publication had an incorrect author position and incorrect journal (1 [4.2%]).

Conclusions and Relevance  Unverifiable publications among ophthalmology residency applicants are a persistent problem. Possible strategies to modify the review process include asking applicants to provide copies of their full-length works or the relevant PMCID (PubMed Central reference number) or DOI (digital object identifier) for their publications.

Introduction

Misrepresentation of self-reported scholarly work at the residency and fellowship application level has been documented in the literature for more than 20 years and across numerous medical subspecialties. Previous articles have reported widely divergent results.1-18 In one of the earliest studies exploring the misrepresentation of academic accomplishments, Sekas and Hutson15 in 1995 reported that research activity could not be confirmed in 47 of 138 applications (34.1%) to a gastroenterology fellowship program. Subsequent investigations of misrepresentation of academic publications in neurosurgery,1,4 otolaryngology,2 plastic surgery,3,13 pediatric surgery,6 orthopedic surgery,5,11 emergency medicine,7,14 internal medicine,8 urology,9 pathology,10 dermatology,12 pain medicine,16 and ophthalmology17 have reported rates from as low as 1.9% in ophthalmology applications submitted from 2000 to 2004,17 to as high as 45% in neurosurgery applications submitted in 2012.1

In a previous study of verification of publications of ophthalmology residency applicants, Wiggins17 reviewed all 821 applications to a single ophthalmology residency program across 5 application cycles from 2000 to 2004. That study found a 1.9% rate of unverifiable publications; however, among applicants who reported at least 1 publication, the rate of unverifiable publications was 8.1%. Because this outcome could be a serious concern if these unverifiable works represent publication dishonesty, we sought to determine the current rate of unverifiable scholarly work among ophthalmology residency applicants. We proposed to study the applications to train in ophthalmology at our institution, the Vanderbilt Eye Institute, Vanderbilt University Medical Center, in Nashville, Tennessee. To avoid introducing bias into the percentage of unverifiable publications across the entire applicant pool, we elected to limit our examination to applicants invited to interview, who represent the physicians most likely to ultimately practice ophthalmology.

Methods
Subjects

We selected applicants who submitted a San Francisco Match (SF Match) application to our ophthalmology residency program for entering class years 2012 to 2017 and who were subsequently invited to interview. Approval for this study was obtained through the Vanderbilt University Institutional Review Board, which also waived the need to obtain informed consent from the applicants. Data were deidentified.

Data Collection

Eligible applications were drawn from the 6 application cycles for the entering postgraduate year 2 ophthalmology residency classes of 2012 to 2017, comprising 3013 records in the archived database. Of these 3013 applicants, 322 were offered an interview; their applications were examined for accuracy in the reporting of publications.

All 322 applications were read completely. A standardized data collection form was used to record the data from the applications. The metrics recorded included year of application, sex, medical school, US News and World Report Medical School Rank for 2016,19 undergraduate major, undergraduate grade point average, undergraduate institution, advanced degrees other than MD (eg, PhD, MPH, MBA, or MS), US Medical Licensing Examination (USMLE) Step 1 score, Alpha Omega Alpha (AOA) status, medical school class rank (if provided), and number of full-length scholarly works listed as “published,” “accepted,” or “in press.” Publications listed were searched in PubMed, Google, Google Scholar, and via a literature search directly on the journal’s website. Small discrepancies, such as incorrect page numbers or journal abbreviations, were not considered to represent dishonesty. Two observers, 1 ophthalmology resident (H.M.T.) and 1 medical student (R.T.), independently extracted data and reviewed qualifying publications from each application.
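The verification workflow in this study was manual. As a rough illustration of how part of it could be pre-screened programmatically, the sketch below queries PubMed for a reported title and author using the rentrez R package; the citation shown is hypothetical, and any hit would still require manual confirmation of journal, title, and author position.

# Illustrative pre-screen only; the study's searches were performed manually in
# PubMed, Google, Google Scholar, and on journal websites.
library(rentrez)  # client for the NCBI E-utilities API

check_pubmed <- function(title, author_last) {
  # Search PubMed for the reported title restricted to the reported author
  query <- sprintf("%s[Title] AND %s[Author]", title, author_last)
  hits <- entrez_search(db = "pubmed", term = query, retmax = 5)
  if (hits$count == 0) {
    return("No match found: flag the citation for manual review")
  }
  # Return summaries so a reviewer can confirm journal and author position
  entrez_summary(db = "pubmed", id = hits$ids)
}

# Hypothetical reported citation (not from the study data)
check_pubmed("Outcomes of strabismus surgery in adults", "Smith")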

Statistical Analysis

The statistical analysis of the data was performed with R statistical software (version 3.3.2; R Foundation for Statistical Computing).20 Primary end points included any published works and unverifiable scholarly works. Descriptive statistics for continuous variables were represented as median (interquartile range), and categorical variables were represented as percentages. Categorical data were compared using the Pearson χ² test, and continuous data were analyzed using the Wilcoxon rank sum test. Statistical significance was set a priori at P < .05. All P values are 2-sided unless otherwise stated.
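As a minimal sketch of these univariate analyses in R, assuming a hypothetical deidentified data frame apps with one row per interviewed applicant (variable names such as step1, adv_degree, any_pub, and unverifiable are our own, not taken from the study):

# apps is a hypothetical deidentified data frame; column names are illustrative:
#   step1        USMLE Step 1 score
#   sex          applicant sex
#   adv_degree   TRUE if an advanced degree other than MD was reported
#   any_pub      TRUE if >= 1 full-length publication was reported
#   unverifiable TRUE if >= 1 reported publication could not be verified

# Descriptive statistics: median (interquartile range) for continuous variables
quantile(apps$step1, probs = c(0.25, 0.50, 0.75), na.rm = TRUE)
# Percentages for categorical variables
prop.table(table(apps$sex)) * 100

# Pearson chi-square test for categorical comparisons
chisq.test(table(apps$adv_degree, apps$any_pub))

# Wilcoxon rank sum test for continuous comparisons (2-sided by default)
wilcox.test(step1 ~ unverifiable, data = apps)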

Logistic regression using the covariates of sex, advanced degree in addition to MD, and the USMLE Step 1 score was performed to investigate the likelihood of an applicant having any published works. Among applicants with at least 1 published work, a similar logistic regression was used to identify any associations between these factors and the likelihood of identifying unverifiable publications. A Fisher exact test was used to determine if the rates of unverifiable publications were different for graduates of medical schools outside the United States compared with US medical students. The nonlinear association of USMLE Step 1 score with the likelihood of an applicant producing any published works was modeled via a restricted cubic spline with 4 df.
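A minimal sketch of this modeling, again using the hypothetical apps data frame: the rms package is assumed for the restricted cubic spline (5 knots yield the 4 spline df described above), and entering the Step 1 score linearly in the second model is our simplification, not a detail reported by the authors.

library(rms)  # provides lrm() and rcs() for restricted cubic splines

# Likelihood of reporting any published works; rcs(step1, 5) places 5 knots,
# giving the 4 spline degrees of freedom described in the Methods.
fit_any <- lrm(any_pub ~ sex + adv_degree + rcs(step1, 5), data = apps)

# Among applicants with at least 1 published work: likelihood of having an
# unverifiable publication (Step 1 entered linearly here as a simplification).
with_pubs <- subset(apps, any_pub)
fit_unver <- glm(unverifiable ~ sex + adv_degree + step1,
                 family = binomial, data = with_pubs)
exp(cbind(OR = coef(fit_unver), confint(fit_unver)))  # odds ratios with 95% CIs

# Fisher exact test: unverifiable publications, international vs US graduates
# (intl_grad is another illustrative column name)
fisher.test(table(with_pubs$intl_grad, with_pubs$unverifiable))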

Results

Between October 2010 and October 2015, a total of 3013 applications to our ophthalmology residency program were received. The data from 2691 applications were excluded because these applicants were not offered an interview at our academic institution. Three hundred twenty-two applications of the candidates invited to interview were studied. A mean of 53 applicants (range, 49-57) were interviewed in each of the 6 application cycles studied. Of the 322 candidates, 239 (74.2%) reported at least 1 full-length published or accepted scholarly work, and several reported multiple publications, whereas 83 (26%) did not. Twenty-two of the 322 applicants (6.8%) who were offered interviews had 24 unverifiable publications. Of the 239 candidates listing at least 1 full-length publication, 22 applicants (9.2%) had unverifiable publications. The applications with unverifiable publications were evenly distributed across the years of the study (range, 2-6 per cycle; Pearson χ²₅ = 3.65; P = .60).
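As a quick numeric check of these headline figures, and one way the across-year comparison could be run in R (the per-cycle counts below are illustrative only; the paper reports the range per cycle but not the exact breakdown):

# Headline proportions reported above
22 / 322   # ~0.068 -> 6.8% of all interviewed applicants
22 / 239   # ~0.092 -> 9.2% of applicants listing >= 1 full-length publication

# Pearson chi-square goodness-of-fit test for an even distribution of
# applicants with unverifiable publications across the 6 cycles.
# These counts are hypothetical (range 2-6 per cycle, summing to 22).
per_cycle <- c(2, 3, 4, 4, 3, 6)
chisq.test(per_cycle)  # small expected counts: interpret with caution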

Demographics

The SF Match application did not include key identifiers, such as age and race. The demographic characteristics of the applicants are shown in the Table. The 322 candidates comprised 181 (56%) men and 141 (44%) women. Of the 22 applicants with unverifiable publications, 13 (59%) were male. Two applicants (9%) with unverifiable publications were graduates of medical schools outside the United States; this rate was not different from the rate among US medical school graduates (9%; P > .99). Most medical students invited to interview had an undergraduate degree and were expected to obtain a medical degree. Ninety-four applicants (29.2%) invited to interview had an additional advanced degree other than an MD, and those students were more likely to have published scholarly works (odds ratio, 2.74; 95% CI, 1.38-5.45; P = .004) (Figure 1). Among applicants with unverifiable publications, 4 (18.2%) had additional degrees (Pearson χ²₁ = 2.8; P = .09). The mean USMLE Step 1 score of applicants invited to interview was 244 (range, 188-273). Of those with unverifiable publications, the mean score was 242 (range, 188-258; F₁,₂₃₇ = 0; P = .94). Applicants with USMLE Step 1 scores closest to the mean were more likely to publish compared with those candidates with higher or lower scores. One hundred seventy-eight of the invited applicants (55.3%) reported AOA status on their application. Of these students, 61 (34%) had been awarded AOA membership. Four of the 11 candidates with unverifiable publications and verified AOA status (36.4%) were elected members (Pearson χ²₁ = 0.26; P = .61). In summary, the presence of an advanced degree other than an MD and the USMLE Step 1 score were significantly associated with having any published scholarly work. None of the covariates, including application year, sex, an additional advanced degree, the USMLE Step 1 score, or AOA status, were significantly associated with an applicant producing unverifiable publications (Figure 2).

Types of Unverifiable Publications

The types of unverifiable publications are summarized in Figure 3. Two applicants had 2 unverifiable publications each. The most common reason was our inability to locate the publication (13 of 24 publications [54%]). Additional issues included the work being an abstract rather than a full-length publication (5 [20.8%]), incorrect author position (4 [16.7%]), applicant not listed as an author at all on the publication in question (1 [4.2%]), and substantial title discrepancy such that a different project was suggested by the name (1 [4.2%]). One listed publication (4.2%) had an incorrect author position (not first author) and was also listed in an incorrect journal.

Discussion

Previously reported rates of unverifiable publications diverge widely across multiple specialties.1-18 Rates as high as 45% among neurosurgery residency applicants1 in 2012 and as low as 1.9% among ophthalmology applicants17 from 2000 to 2004 have been reported. Our study endeavored to examine the prevalence of unverifiable publications among future physicians most likely to practice ophthalmology—those invited to interview rather than the entire applicant pool—in an effort to increase relevance in our field, which interviews a small proportion of total applicants.

Our observation that 6.8% of all interviewed ophthalmology residency applicants for entering classes 2012 to 2017 had at least 1 unverifiable publication on their applications is largely consistent with the disparate previous reports among multiple medical specialties; however, it is higher than the Wiggins report17 of 1.9% among all ophthalmology resident applicants to his institution from 2000 to 2004. Wiggins found a rate of unverifiable publications of 8.1% among applicants who reported at least 1 published scholarly work; our rate for the same metric was remarkably similar at 9.2%. The major difference between the 2 studies is that, in the Wiggins cohort, 24.5% of applicants reported at least 1 publication, whereas in our study, 74.2% of applicants had published.

Consistent with the similarity between our rate of unverifiable publications among applicants reporting published scholarly work and that of the Wiggins cohort from 2000 to 2004, our study did not show a change over time in the rate of unverifiable publications for entering classes 2012 to 2017. This finding suggests a relatively stable rate of unverifiable publications specific to the field of ophthalmology, at around 8% to 9% among applicants reporting at least 1 published scholarly work, for the past 17 years. The absolute number of applicants with unverifiable citations has nonetheless risen as more applicants report publications.

Although it is somewhat reassuring that the rate of unverifiable publications has not increased over time, it is still alarming that this issue persists in our field. Although we can only speculate as to the motives of the applicants in our study, previous studies have offered some ideas. Maverakis et al12 reported on the widespread belief that having multiple scholarly works is an advantage for an applicant; anecdotally, we also believe this to be true. Medical students at our institution who are interested in pursuing a career in ophthalmology are strongly encouraged to become involved with ophthalmology research to strengthen their applications as well as their knowledge. It is also possible that students perceive small falsifications or embellishments of curricula vitae to be widespread practice; if so, they may believe that they are at a disadvantage if they do not also make their resumes appear as competitive as possible, even to the detriment of the truth.

Previous authors have postulated that some discrepancies in reporting could be accounted for by the rigid categories of the Electronic Residency Application Service application: if applicants did not understand terminology such as “peer reviewed,” they could inadvertently categorize their work incorrectly in an honest mistake.1 This possibility does not apply to ophthalmology, which instead relies on the SF Match. The SF Match application lists a section entitled “Research activities, papers, and/or additional information” and relies on self-report. It is therefore difficult to imagine how an applicant could unknowingly miscategorize his or her work.

Limitations

Our study had several limitations. It was retrospective in design. Demographic information was limited: the SF Match application did not include key identifiers such as age and race. Other potentially important variables, such as AOA status and medical school class rank, were not reported by most applicants. Both of these variables were also self-reported, as were scholarly works, leaving open the possibility of dishonesty in these parts of the application as well. Using the variables that we could reliably identify, we were unable to establish any association between applicant characteristics and the likelihood of unverifiable publications. Such associations may exist, but there were too few instances of unverifiable publications in our study to detect them. We were able to show that an advanced degree and the USMLE Step 1 score were significantly related to an applicant producing any published works.

We chose to scrutinize only scholarly work listed as “published,” “accepted,” or “in press,” as these works should be readily accessible provided that the citation is accurate. An applicant could have been confused by this nomenclature and mistakenly classified his or her work as accepted when in fact it was accepted with revisions and as such would not yet be accessible via our search techniques. We did not scrutinize the unverifiable publications again at a later time, although this will be an area of further investigation. We did not consider small misspellings or inaccuracies such as page-number errors to represent dishonesty. In the small number of these minimal errors, the works in question were easily identified through our search methods. Although such carelessness is not ideal in a future resident, we did not believe that these errors warranted categorization as unverifiable. Finally, it is possible that the pool of interviewed applicants at our institution is not generalizable to all ophthalmology residency programs.

Conclusions

Our study underscores the persistent problem of unverifiable publications among applicants to ophthalmology residency programs. Although unverifiable publications cannot be assumed to represent academic dishonesty without due process, it is difficult to imagine scenarios in which these works would be unverifiable in the age of readily searchable databases, especially because all works in question were self-reported on the candidates’ applications. This situation is particularly concerning given that trustworthiness, a sound ethical background, and attention to detail are considered by most to be prerequisite to a career in medicine. It is imperative that physicians mentoring our medical students and residents uphold these values. Based on our findings, the Vanderbilt Eye Institute will be altering its ophthalmology applicant review process. We are considering asking applicants to bring copies of their published work with them to the interview or to provide the PMCID (PubMed Central reference number) or DOI (digital object identifier) of their scholarly work in a brief supplemental application. These strategies aim to resolve any honest discrepancies on the candidate’s application in a timely fashion and would also remove the burden of verifying all cited works from our staff. We also advocate for the SF Match to modify the “Research activities, papers, and/or additional information” section of the application to strongly recommend or require that applicants supply these identifiers alongside their full-length scholarly works. This change would benefit all ophthalmology programs and help address the problem of unverifiable publications before a candidate is invited to interview. These changes will help ensure that our field recruits future physicians who are not only intelligent and hard working but also ethically sound.
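As an illustration of how a supplied DOI could lighten that verification burden, the sketch below checks whether a DOI resolves at doi.org using the httr R package. The DOI shown is this article's own; publishers that reject HEAD requests would need a GET fallback, and a resolving DOI would still require confirmation of title and author position.

library(httr)  # HTTP client

doi_resolves <- function(doi) {
  # doi.org redirects valid DOIs to the publisher's landing page
  resp <- tryCatch(HEAD(paste0("https://doi.org/", doi), timeout(10)),
                   error = function(e) NULL)
  # Some publishers reject HEAD requests; fall back to GET if needed
  !is.null(resp) && status_code(resp) < 400
}

doi_resolves("10.1001/jamaophthalmol.2018.0846")  # this article's DOI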

Article Information

Accepted for Publication: February 17, 2018.

Corresponding Author: Louise A. Mawn, MD, Vanderbilt Eye Institute, Vanderbilt University Medical Center, 2311 Pierce Ave, Nashville, TN 37232-8808 (louise.a.mawn@vanderbilt.edu).

Published Online: April 19, 2018. doi:10.1001/jamaophthalmol.2018.0846

Author Contributions: Drs Tamez and Brown had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Tamez, Wayman, Mawn.

Acquisition, analysis, or interpretation of data: Tamez, Tauscher, Brown.

Drafting of the manuscript: Tamez, Brown, Mawn.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Tamez, Tauscher, Brown.

Administrative, technical, or material support: Wayman, Mawn.

Study supervision: Mawn.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

Funding/Support: Supported in part by a Physician Scientist Award (Dr Mawn) and an unrestricted grant to the Vanderbilt Eye Institute from Research to Prevent Blindness.

Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1. Kistka HM, Nayeri A, Wang L, Dow J, Chandrasekhar R, Chambless LB. Publication misrepresentation among neurosurgery residency applicants: an increasing problem. J Neurosurg. 2016;124(1):193-198. doi:10.3171/2014.12.JNS141990
2. Beswick DM, Man L-X, Johnston BA, Johnson JT, Schaitkin BM. Publication misrepresentation among otolaryngology residency applicants. Otolaryngol Head Neck Surg. 2010;143(6):815-819. doi:10.1016/j.otohns.2010.08.054
3. Chung CK, Hernandez-Boussard T, Lee GK. “Phantom” publications among plastic surgery residency applicants. Ann Plast Surg. 2012;68(4):391-395. doi:10.1097/SAP.0b013e31823d2c4e
4. Cohen-Gadol AA, Koch CA, Raffel C, Spinner RJ. Confirmation of research publications reported by neurological surgery residency applicants. Surg Neurol. 2003;60(4):280-283. doi:10.1016/S0090-3019(03)00429-4
5. Dale JA, Schmitt CM, Crosby LA. Misrepresentation of research criteria by orthopaedic residency applicants. J Bone Joint Surg Am. 1999;81(12):1679-1681.
6. Gasior AC, Knott EM, Fike FB, et al. Ghost publications in the pediatric surgery match. J Surg Res. 2013;184(1):37-41. doi:10.1016/j.jss.2013.04.031
7. Gurudevan SV, Mower WR. Misrepresentation of research publications among emergency medicine residency applicants. Ann Emerg Med. 1996;27(3):327-330. doi:10.1016/S0196-0644(96)70268-8
8. Hebert RS, Smith CG, Wright SM. Minimal prevalence of authorship misrepresentation among internal medicine residency applicants: do previous estimates of “misrepresentation” represent insufficient case finding? Ann Intern Med. 2003;138(5):390-392. doi:10.7326/0003-4819-138-5-200303040-00008
9. Hsi RS, Hotaling JM, Moore TN, Joyner BD. Publication misrepresentation among urology residency applicants. World J Urol. 2013;31(3):697-702. doi:10.1007/s00345-012-0895-0
10. Kaley JR, Bornhorst J, Wiggins M, Yared M. Prevalence and types of misrepresentation of publication record by pathology residency applicants. Arch Pathol Lab Med. 2013;137(7):979-982. doi:10.5858/arpa.2012-0253-OA
11. Konstantakos EK, Laughlin RT, Markert RJ, Crosby LA. Follow-up on misrepresentation of research activity by orthopaedic residency applicants: has anything changed? J Bone Joint Surg Am. 2007;89(9):2084-2088. doi:10.2106/JBJS.G.00567
12. Maverakis E, Li CS, Alikhan A, Lin TC, Idriss N, Armstrong AW. The effect of academic “misrepresentation” on residency match outcomes. Dermatol Online J. 2012;18(1):1.
13. Phillips JP, Sugg KB, Murphy MA, Kasten SJ. Misrepresentation of scholarly works by integrated plastic surgery applicants. Plast Reconstr Surg. 2012;130(3):731-735. doi:10.1097/PRS.0b013e31825dc3f2
14. Roellig MS, Katz ED. Inaccuracies on applications for emergency medicine residency training. Acad Emerg Med. 2004;11(9):992-994.
15. Sekas G, Hutson WR. Misrepresentation of academic accomplishments by applicants for gastroenterology fellowships. Ann Intern Med. 1995;123(1):38-41.
16. Thompson KM, Neuman S, Schroeder DR, et al. Misrepresentation in multidisciplinary pain medicine fellowship applications to a single academic program. Pain Med. 2015;16(2):274-279. doi:10.1111/pme.12322
17. Wiggins MN. Misrepresentation by ophthalmology residency applicants. Arch Ophthalmol. 2010;128(7):906-910. doi:10.1001/archophthalmol.2010.123
18. Wiggins MN. A meta-analysis of studies of publication misrepresentation by applicants to residency and fellowship programs. Acad Med. 2010;85(9):1470-1474. doi:10.1097/ACM.0b013e3181e2cf2b
19. Graduate school search—medicine programs. U.S. News & World Report. 2018. https://www.usnews.com/best-graduate-schools/search?program=top-medical-schools&name=. Accessed March 20, 2018.
20. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2013. http://www.R-project.org/. Accessed December 21, 2016.