Institutions have a central role in protecting the integrity of research. They employ researchers, own the facilities where the work is conducted, receive grant funding, and teach many students about the research process. When questions arise about research misconduct associated with published articles, scientists and journal editors usually first ask the researchers’ institution to investigate the allegations and then report the outcomes, under defined circumstances, to federal oversight agencies and other entities, including journals.1
Depending on institutions to investigate their own faculty presents significant challenges. Misconduct reports, the mandated product of institutional investigations for which US federal dollars have been spent, have a wide range of problems. These include lack of standardization, inherent conflicts of interest that must be directly addressed to ensure credibility, little quality control or peer review, and limited oversight. Even when institutions act, the information they release to the public is often limited and unhelpful.
As a result, as with most elements of research misconduct, little is known about institutions’ responses to potential misconduct by their own members. The community that relies on the integrity of university research does not have access to information about how often such claims arise or how they are resolved. Nonetheless, there are some indications that many internal reviews are deficient.
Three recent reports from the National Academies of Sciences, Engineering, and Medicine (NASEM) underscore this phenomenon. In 2016, the Optimizing the Nation’s Investment in Academic Research: A New Regulatory Framework for the 21st Century panel concluded that “Some academic research institutions have failed to respond appropriately to investigators’ transgressions or failed to use effectively the range of tools available to create an environment that strongly discourages, at both the institutional and the individual level, behaviors in conflict with the standards and norms of the scientific community.”2 In 2017, the Committee on Responsible Science noted that “significant gaps exist in the information available to institutions as well as to the rest of the research enterprise about how allegations are handled, what challenges arise, and how successful institutions are able to ensure effective performance.”3 A third NASEM group, the Committee on the Review of Omics-Based Tests for Predicting Patient Outcomes in Clinical Trials, reported in 2012 that “institutions can be influenced by secondary interest beyond financial interests, such as factors that impact an institution’s reputation. In research, such reputational factors can be quite prominent and difficult to manage, including deference to esteemed and well-funded investigators and the importance to both investigators and institutions of faculty publications in high-impact journals.”4
The Office of Inspector General (OIG) of the National Science Foundation (NSF) has found that the reports of some institutions do not meet reasonable standards. For instance, some reports from universities do not ask relevant research questions, or they fail to appropriately expand the investigation beyond a particular allegation; other reports focus on finding fault with an individual when many were involved in the research; and for some other reports, committee members have lacked relevant expertise (email from Alan Boehm, MFS; James T. Kroll, PhD; and Aaron S. Manka, PhD; December 2017). These are not idiosyncratic or 1-time problems. A partial list of shortcomings that the OIG staff has compiled and shared at conferences includes the following:
Investigative reports that lack supporting evidence and fail to address the elements of a research misconduct finding, particularly intent;
Individuals who are the subjects of the investigation blaming students or postdoctoral researchers, but the investigative committee never interviewing those individuals;
Investigative committees accepting, without question, excuses offered by the subjects of the investigation; and
Investigative committees relying only on the information contained in allegations, without checking for patterns or other misconduct.
Social psychology illuminates why so many institutional responses to allegations of research misconduct are flawed. The work of Valdesolo and DeSteno documents how “individuals’ evaluations of their own moral transgressions differ substantially from their evaluations of the same transgressions enacted by others. To the extent that the group stands as an important source of self-definition, one may have an interest in protecting the sanctity of that entity.”5 Some institutional responses to allegations of research misconduct, even from sophisticated and well-resourced universities, seem distorted by inexperience, inefficiency, or symptoms of in-group thinking. Understanding and counteracting these human inclinations are critical to reinforce research reproducibility, integrity, and not least, the credibility of institutions and the research community.
Universities must strengthen plans and reports for investigations into research integrity before they are finalized. Checklists are one potential approach for moving in this direction, according to Davidoff, an early proponent: “standard-setting checklists are emerging as particularly valuable tools in high-stakes and high-pressure situations—not only in medicine but also in other disciplines.”6
As an essential first step, the research community should agree on standards that institutional reports should meet. In December 2017, a meeting of experts was convened (hereinafter “Experts Meeting”) to consider this issue. Represented were a wide range of individuals who deal with scientific misconduct, including a former university provost and president, other institutional leaders, federal officials, researchers, a journal editor, journalists, NASEM panel participants, and attorneys representing respondents, whistle-blowers, and institutions. The group developed a proposed checklist for research integrity investigation (eAppendix in the Supplement). The checklist, or its improved successors, is designed to address whether an investigation follows reasonable standards and if the subsequent report is appropriate and complete. For example, it addresses the following issues:
Whether a specific institutional investigation plan or report is generated that identifies appropriate questions to pursue and proposes a meaningful approach to securing the answers;
Whether the correct individuals are interviewed;
Whether the relevant data are secured and reviewed by appropriate experts;
Whether the report provides factual basis and data; and
Whether the report supports its conclusions.
A first step is for internal investigative committees, institutional officials who receive misconduct investigation reports, and attorneys responsible for signing off on these reports to use the checklist. Second, institutions should experiment with forming consortia to provide external peer review of internal investigative reports before they are made final, revising the reports if needed. Implementing such an approach on a widespread scale has clear logistical challenges, not least among them confidentiality and efficiency. Given the stakes, and the existence of such mechanisms in other sectors, these challenges must be overcome.
Although not all those who attended the Experts Meeting to develop this checklist agree, institutional reports of research misconduct investigations should ultimately be released and made available for scrutiny if the scientific enterprise is to earn greater trust. Efforts by one of our organizations (Retraction Watch) to use public records laws to obtain such reports have met with some success, but this approach has limitations.
The scientific community relies on reports of research, and the journals that publish those reports rely on the institutions of investigators to ensure integrity in research. These institutions can and should do better.
Corresponding Author: Ivan Oransky, MD, Arthur Carter Journalism Institute, New York University, 20 Cooper Square, Sixth Floor, New York, NY 10003 (email@example.com).
Published Online: March 12, 2018. doi:10.1001/jama.2018.0358
Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Mr Marcus and Dr Oransky are paid partial salaries by The Center for Scientific Integrity. Mr Marcus is employed by McMahon Publishing, and Dr Oransky is employed by New York University. Ms Gunsalus is a professor emerita at the University of Illinois and operates a higher education consulting firm, C. K. Gunsalus & Associates.
Additional Contributions: We thank the following persons for participating in the Experts Meeting: Alan Boehm, MFS (NSF-OIG); Laura Clower, JD (University of Illinois); David DeMets, MS, PhD (University of Wisconsin-Madison); Larry Faulkner, PhD (University of Texas at Austin); Annette Flanagin, RN, MA (JAMA and JAMA Network); Sylvie Khan, MUP (University of Illinois at Urbana–Champaign); Iekuni Ichikawa, MD, PhD (Association for the Promotion of Research Integrity, Japan Medical Science Federation, Shinshu University, Vanderbilt University); Aaron Manka, PhD (NSF-OIG); Lisa McShane, MS, PhD (National Institutes of Health); Michael Morisy (MuckRock); Robert Nerem, MSc, PhD (Georgia Institute of Technology); Lauran Qualkenbush (Northwestern University); David F. Ransohoff, MD (University of North Carolina); Paul S. Thaler, JD (Cohen Seglias).
Additional Information: The Experts Meeting was jointly sponsored by the National Center for Professional & Research Ethics at Coordinated Science Laboratory, University of Illinois at Urbana–Champaign; and The Center for Scientific Integrity, the parent nonprofit organization of Retraction Watch, and involved no outside funding.
eAppendix. Peer Review Form for Research Integrity Investigation Reports
Gunsalus CK, Marcus AR, Oransky I. Institutional Research Misconduct Reports Need More Credibility. JAMA. 2018;319(13):1315–1316. doi:10.1001/jama.2018.0358