Editorial
September 22, 2023

Peer Review and Scientific Publication at a Crossroads: Call for Research for the 10th International Congress on Peer Review and Scientific Publication

Author Affiliations
  • 1Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California
  • 2Department of Medicine, Stanford University School of Medicine, Stanford, California
  • 3JAMA and the JAMA Network, Chicago, Illinois
  • 4The BMJ, London, England
JAMA. 2023;330(13):1232-1235. doi:10.1001/jama.2023.17607

The way science is assessed, published, and disseminated has markedly changed since 1986, when the launch of a new Congress focused on the science of peer review was first announced. There have been 9 International Peer Review Congresses since 1989, typically held on a 4-year cycle and most recently in 2022 after a 1-year delay due to the COVID-19 pandemic.1 Here, we announce that the 10th International Congress on Peer Review and Scientific Publication will be held in Chicago, Illinois, on September 3-5, 2025.

The congresses have been enormously productive, incentivizing and publicizing important empirical work on how science is produced, evaluated, published, and disseminated.2-4 However, peer review and scientific publication are currently at a crossroads, and their future is more difficult than ever to predict. After decades of experience and research in these fields, we have learned a great deal about many aspects of peer review and scientific publication.2-5 We have accumulated a large body of empirical evidence on how systems function and how they can malfunction. There is also growing evidence on how to make peer review, publication, and dissemination processes more efficient, fair, open, transparent, reliable, and equitable.6-15 Experimental randomized evaluations of peer review practices are only a small part of the literature, but their numbers have been growing since the early trials of anonymized peer review.16-22 Research has revealed a rapidly growing list of biases, inefficiencies, and threats to the trustworthiness of published research, some now well recognized, others deserving of more attention.2,3 Moreover, practices continue to change and diversify in response to new needs, tools, and technologies, as well as the persistent “publish or perish” pressures on scientists-as-authors.

With the continued evolution of electronic platforms and tools—most recently the emergence and use of large language models and artificial intelligence (AI)—peer review and scientific publication are rapidly evolving to address new opportunities and threats.23,24 Moreover, a lot of money is at stake; scientific publishing is a huge market with one of the highest profit margins among all business enterprises, and it supports a massive biomedical and broader science economy. Many stakeholders try to profit from or influence the scientific literature in ways that do not necessarily serve science or enhance its benefits to society. The number of science journal titles and articles is steadily increasing25; many millions of scientists coauthor scientific papers, and perverse reward systems do not help improve the quality of this burgeoning corpus. Furthermore, principled mandates for immediate and open access to research and data may not be fully understood, accepted, or funded. Many other new, often disruptive, ideas abound on how to improve dissemination of and access to science, some more speculative, utopian, or self-serving than others. In addition, deceptive, rogue actors, such as predatory and pirate publishers, fake reviewers, and paper mills, continue to threaten the integrity of peer review and scientific publication. Careful and timely testing of the many proposals to improve peer review and publication, and of interventions and processes to address threats to their integrity, is essential to the future of science and the scholarly publishing enterprise.

Proposed remedies for several of the problems and biases have been evaluated,4 but many are untested or have inconclusive evidence for or against their use. New biases continue to appear (or at least to be recognized). In addition, there is tension about how exactly to correct the scientific literature, in which a large share of what is published may not be replicable or may be obviously false.26 Even outright fraud may be becoming more common—or may simply be recognized and reported more frequently than before.27,28

By their very nature, peer review and scientific publication practices are in a state of flux and may be unstable as they struggle to serve rapidly changing circumstances, technologies, and stakeholder needs and goals. Therefore, some unease would exist even in the absence of major perturbations, even if all the main stakeholders (authors, journals, publishers, funders) simply wanted to continue business as usual. However, the emergence of additional rapid changes further exacerbates the challenges, while also providing opportunities to improve the system at large. The COVID-19 crisis was one major quake that shook the way research is designed, conducted, evaluated, published, disseminated, and accessed.29,30 Advances in AI and large language models may be another, potentially even larger, seismic force, with some viewing the challenge posed by these new developments as another hyped tempest in a teapot and others believing them to be an existential threat to truth and all of humanity. Scientific publication should fruitfully absorb this energy.23,24 Research has never been needed more urgently to properly examine, test, and correct (in essence: peer review) scientific and nonscientific claims for the sake of humanity’s best interests. The premise of all Peer Review Congresses is that peer review and scientific publication must be properly examined, tested, and corrected in the same way the scientific method and its products are applied, vetted, weighted, and interpreted.2

The range of topics on which we encourage research to be conducted, presented, and discussed at the 10th International Congress on Peer Review and Scientific Publication expands on what was covered by the 9 previous iterations of the congress (Box).1,2,4 We understand that new topics may yet emerge; 2 years until September 2025 is a relatively long period, during which major changes are possible, and even likely. Therefore, we encourage research in any area of work that may be relevant to peer review and scientific publication, including novel empirical investigations of processes, biases, policies, and innovations. The congress has the ambitious goal of covering all branches and disciplines of science. It is increasingly recognized that much can be learned by comparing experiences in research and review practices across different disciplines. Although the biomedical sciences have contributed the lion’s share of empirical research on peer review in the past, we want to help correct this imbalance. Therefore, we strongly encourage the contribution of work from all scientific disciplines, including the natural and physical sciences, social sciences, psychological sciences, economics, computer science, mathematics, and new emerging disciplines. Interdisciplinary work is particularly welcome.

Box. Topics of Interest for the 10th International Congress on Peer Review and Scientific Publication

Bias
  • Efforts to avoid, manage, or account for bias in research methods, design, conduct, reporting, and interpretation

  • Publication and reporting bias

  • Bias on the part of researchers, authors, reviewers, editors, funders, commentators, influencers, disseminators, and consumers of scientific information

  • Interventions to address gender, race and ethnicity, geographic location, career stage, and discipline biases in peer review, publication, research dissemination, and impact

  • Improving and measuring diversity, equity, and inclusion of authors, reviewers, editors, and editorial board members

  • Motivational factors for bias related to rewards and incentives

  • New forms of bias introduced by wider use of large language models and other forms of artificial intelligence (AI)

Editorial and Peer Review Decision-Making
  • Assessment and testing of models of peer review and editorial decision-making and workflows used by journals, publishers, funders, and research disseminators

  • Evaluations of the quality, validity, and practicality of peer review and editorial decision-making

  • Challenges, new biases, and opportunities with mega-journals

  • Assessment of practices related to publication of special issues with guest editors

  • Economic and systemic evaluations of the peer review machinery and the related publishing business sector

  • Methods for ascertaining use of large language models and other forms of AI in authoring and peer review of scientific papers

  • AI in peer review and editorial decision-making

  • Quality assurance for reviewers, editors, and funders

  • Editorial policies and responsibilities

  • Editorial freedom and integrity

  • Peer review of grant proposals

  • Peer review of content for meetings

  • Editorial handling of science journalism

  • Role of journals as publishing venues vs peer review venues

  • COVID-19 pandemic and postpandemic effects

Research and Publication Ethics
  • Ethical concerns for researchers, authors, reviewers, editors, publishers, and funders

  • Authorship, contributorship, accountability, and responsibility for published material

  • Conflicts of interest (financial and nonfinancial)

  • Research and publication misconduct

  • Editorial nepotism or favoritism

  • Paper mills

  • Citation cartels, citejacking, and other manipulation of citations

  • Conflicts of interest among those who critique or criticize published research and researchers

  • Ethical review and approval of studies

  • Confidentiality considerations

  • Rights of research participants in scientific publication

  • Effects of funding and sponsorship on research and publication

  • Influence of external stakeholders: funders, journal owners, advertisers/sponsors, libraries, legal representatives, news media, social media, fact-checkers, technology companies, and others

  • Tools and software to detect wrongdoing, such as duplication, fraudulent manuscripts and reviewers, image manipulation, and submissions from paper mills

  • Corrections and retractions

  • Legal issues in peer review and correction of the literature

  • Evaluations of censorship in science

  • Intrusion of political and ideological agendas in scientific publishing

  • Science and scientific publication under authoritarian regimes

Improving Research Design, Conduct, and Reporting
  • Effectiveness of guidelines and standards designed to improve the design, conduct, and reporting of scientific studies

  • Evaluations of the methodological rigor of published information

  • Data sharing, transparency, reliability, and access

  • Research reanalysis, reproducibility, and replicability

  • Approaches for efficient and effective correction of errors

  • Curtailing citation and continued spread of retracted science

  • Innovations in best, fit-for-purpose methods and statistics, and ways to improve their appropriate use

  • Implementations of AI and related tools to improve research design, conduct, and reporting

  • Innovations to improve data and scientific display

  • Quality and reliability of data presentation and scientific images

  • Standards for multimedia and new content models for dissemination of science

  • Quality and effectiveness of new formats for scientific articles

  • Fixed articles vs evolving versions and innovations to support updating of scientific articles and reviews

Models for Peer Review and Scientific Publication
  • Single-anonymous, double-anonymous, collaborative, and open peer review

  • Pre–study conduct peer review

  • Open and public access

  • Embargoes

  • Preprints and prepublication posting and release of information

  • Prospective registration of research

  • Postpublication review, communications, and influence

  • Engaging statistical and other technical expertise in peer review

  • Evaluations of reward systems for authors, reviewers, and editors

  • Approaches to improve diversity, equity, and inclusion in peer review and publication

  • Innovations to address reviewer fatigue

  • Scientific information in multimedia and new media

  • Publication and performance metrics and usage statistics

  • Financial and economic models of peer-reviewed publication

  • Quality and influence of advertising and sponsored publication

  • Quality and effectiveness of content tagging, markup, and linking

  • Use of AI and software to improve peer review, decision-making, and dissemination of science

  • Practices of opportunistic, predatory, and pirate operators

  • Threats to scientific publication

  • The future of scientific publication

Dissemination of Scientific and Scholarly Information
  • New technologies and methods for improving the quality and efficiency of, and equitable access to, scientific information

  • Novel mechanisms, formats, and platforms to disseminate science

  • Funding and reward systems for science and scientific publication

  • Use of bibliometrics and alternative metrics to evaluate the quality and equitable dissemination of published science

  • Best practices for corrections and retracting fraudulent articles

  • Comparisons of and lessons from various scientific disciplines

  • Mapping of scientific methods and reporting practices and of meta-research across disciplines

  • Use and effects of social media

  • Misinformation and disinformation

  • Reporting, publishing, disseminating, and accessing science in emergency situations (pandemics, natural disasters, political turmoil, wars)

The congress is organized under the auspices of JAMA and the JAMA Network, The BMJ, and the Meta-Research Innovation Center at Stanford (METRICS) and is guided by an international panel of advisors who represent diverse areas of science and of activities relevant to peer review and scientific publication.4 The abstract submission site is expected to open on December 1, 2024, with an anticipated abstract submission deadline of January 31, 2025. Announcements will appear on the congress website (https://peerreviewcongress.org/).4

Article Information

Corresponding Author: John P. A. Ioannidis, MD, DSc, Stanford Prevention Research Center, Stanford University, 1265 Welch Rd, MSOB X306, Stanford, CA 94305 (jioannid@stanford.edu).

Published Online: September 22, 2023. doi:10.1001/jama.2023.17607

Conflict of Interest Disclosures: All authors serve as directors or coordinators of the Peer Review Congress. Ms Flanagin reports serving as an unpaid board member for STM: International Association of Scientific, Technical, and Medical Publishers. Dr Bloom reports being a founder of medRxiv and a member of the Board of Managers of American Institute of Physics Publishing.

Additional Information: Drs Ioannidis and Berkwits are directors; Ms Flanagin, executive director; and Dr Bloom, European director and coordinator for the International Congress on Peer Review and Scientific Publication.

Note: This article is being published simultaneously in The BMJ and JAMA.

References
1. Ioannidis JPA, Berkwits M, Flanagin A, Godlee F, Bloom T. Ninth international congress on peer review and scientific publication: call for abstracts. BMJ. 2021;374(2252):n2252. doi:10.1136/bmj.n2252
2. Rennie D, Flanagin A. Three decades of Peer Review Congresses. JAMA. 2018;319(4):350-353. doi:10.1001/jama.2017.20606
3. Rennie D. Let’s make peer review scientific. Nature. 2016;535(7610):31-33. doi:10.1038/535031a
4. International Congress on Peer Review and Scientific Publication. Past congresses. Accessed July 5, 2023. https://peerreviewcongress.org/past-congresses/
5. Grimaldo F, Marušić A, Squazzoni F. Fragments of peer review: a quantitative analysis of the literature (1969-2015). PLoS One. 2018;13(2):e0193148. doi:10.1371/journal.pone.0193148
6. Hardwicke TE, Salholz-Hillel M, Malički M, Szűcs D, Bendixen T, Ioannidis JPA. Statistical guidance to authors at top-ranked journals across scientific disciplines. Am Stat. 2022;77:239-247. doi:10.1080/00031305.2022.2143897
7. Shi X, Ross JS, Amancharla N, Niforatos JD, Krumholz HM, Wallach JD. Assessment of concordance and discordance among clinical studies posted as preprints and subsequently published in high-impact journals. JAMA Netw Open. 2021;4(3):e212110. doi:10.1001/jamanetworkopen.2021.2110
8. Spungen H, Burton J, Schenkel S, Schriger DL. Completeness and spin of medRxiv preprint and associated published abstracts of COVID-19 randomized clinical trials. JAMA. 2023;329(15):1310-1312. doi:10.1001/jama.2023.1784
9. Hamilton DG, Hong K, Fraser H, Rowhani-Farid A, Fidler F, Page MJ. Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data. BMJ. 2023;382:e075767. doi:10.1136/bmj-2023-075767
10. Nelson JT, Tse T, Puplampu-Dove Y, Golfinopoulos E, Zarin DA. Comparison of availability of trial results in ClinicalTrials.gov and PubMed by data source and funder type. JAMA. 2023;329(16):1404-1406. doi:10.1001/jama.2023.2351
11. Rowhani-Farid A, Hong K, Grewal M, et al. Consistency between trials presented at conferences, their subsequent publications and press releases. BMJ Evid Based Med. 2023;28(2):95-102. doi:10.1136/bmjebm-2022-111989
12. Shi X, Abritis A, Patel RP, et al. Characteristics of retracted research articles about COVID-19 vs other topics. JAMA Netw Open. 2022;5(10):e2234585. doi:10.1001/jamanetworkopen.2022.34585
13. Malički M, Aalbersberg IJ, Bouter L, Mulligan A, Ter Riet G. Transparency in conducting and reporting research: a survey of authors, reviewers, and editors across scholarly disciplines. PLoS One. 2023;18(3):e0270054. doi:10.1371/journal.pone.0270054
14. Ben Messaoud K, Schroter S, Richards M, Gayet-Ageron A. Analysis of peer reviewers’ response to invitations by gender and geographical region: cohort study of manuscripts reviewed at 21 biomedical journals before and during covid-19 pandemic. BMJ. 2023;381:e075719. doi:10.1136/bmj-2023-075719
15. Flanagin A, Cintron MY, Christiansen SL, et al. Comparison of reporting race and ethnicity in medical journals before and after implementation of reporting guidance, 2019-2022. JAMA Netw Open. 2023;6(3):e231706. doi:10.1001/jamanetworkopen.2023.1706
16. McNutt RA, Evans AT, Fletcher RH, Fletcher SW. The effects of blinding on the quality of peer review: a randomized trial. JAMA. 1990;263(10):1371-1376. doi:10.1001/jama.1990.03440100079012
17. Godlee F, Gale CR, Martyn CN. Effect on the quality of peer review of blinding reviewers and asking them to sign their reports: a randomized controlled trial. JAMA. 1998;280(3):237-240. doi:10.1001/jama.280.3.237
18. Bruce R, Chauvin A, Trinquart L, Ravaud P, Boutron I. Impact of interventions to improve the quality of peer review of biomedical journals: a systematic review and meta-analysis. BMC Med. 2016;14(1):85. doi:10.1186/s12916-016-0631-5
19. Speich B, Mann E, Schönenberger CM, et al. Reminding peer reviewers of reporting guideline items to improve completeness in published articles: primary results of 2 randomized trials. JAMA Netw Open. 2023;6(6):e2317651. doi:10.1001/jamanetworkopen.2023.17651
20. Fox CW, Meyer J, Aime E. Double-blind peer review affects reviewer ratings and editor decisions at an ecology journal. Funct Ecol. 2023;37(5):1144-1157. doi:10.1111/1365-2435.14259
21. Ghannad M, Yang B, Leeflang M, et al. A randomized trial of an editorial intervention to reduce spin in the abstract’s conclusion of manuscripts showed no significant effect. J Clin Epidemiol. 2021;130:69-77. doi:10.1016/j.jclinepi.2020.10.014
22. Stelmakh I, Rastogi C, Shah NB, Singh A, Daumé H III. A large scale randomized controlled trial on herding in peer-review discussions. PLoS One. 2023;18(7):e0287443. doi:10.1371/journal.pone.0287443
23. Flanagin A, Bibbins-Domingo K, Berkwits M, Christiansen SL. Nonhuman “authors” and implications for the integrity of scientific publication and medical knowledge. JAMA. 2023;329(8):637-639. doi:10.1001/jama.2023.1344
24. Flanagin A, Kendall-Taylor J, Bibbins-Domingo K. Guidance for authors, peer reviewers, and editors on use of AI, language models, and chatbots. JAMA. 2023;330(8):702-703. Published online July 27, 2023. doi:10.1001/jama.2023.12500
25. Ioannidis JPA, Pezzullo AM, Boccia S. The rapid growth of mega-journals: threats and opportunities. JAMA. 2023;329(15):1253-1254. doi:10.1001/jama.2023.3212
26. Ioannidis JPA. Why most published research findings are false. PLoS Med. 2022;19(8):e1004085. doi:10.1371/journal.pmed.1004085
27. Mol BW, Ioannidis JPA. How do we increase the trustworthiness of medical publications? Fertil Steril. 2023;24:S0015-S0282. doi:10.1016/j.fertnstert.2023.02.023
28. Carlisle JB. False individual patient data and zombie randomised controlled trials submitted to Anaesthesia. Anaesthesia. 2021;76(4):472-479. doi:10.1111/anae.15263
29. Else H. How a torrent of COVID science changed research publishing–in seven charts. Nature. 2020;588(7839):553. doi:10.1038/d41586-020-03564-y
30. Ioannidis JPA, Bendavid E, Salholz-Hillel M, Boyack KW, Baas J. Massive covidization of research citations and the citation elite. Proc Natl Acad Sci U S A. 2022;119(28):e2204074119. doi:10.1073/pnas.2204074119