Preprint servers offer a means to disseminate research reports before they undergo peer review and are relatively new to clinical research.1-4 medRxiv is an independent, not-for-profit preprint server for clinical and health science researchers that was introduced in June 2019.4 A central question was whether clinical researchers would adopt this new approach to disseminating pre–peer-review science. Now, a year after its establishment, we report medRxiv’s submissions, posts, and downloads.
We used data from the medRxiv website,5 internal data, and Altmetric.com from launch on June 11, 2019, through June 30, 2020. We assessed submissions, postings, abstract views, downloads, comments, and withdrawals. We also compared submissions and postings before the emergence of coronavirus disease 2019 (COVID-19) (July 1 through December 31, 2019) with those after its emergence (January 1 through June 30, 2020). We calculated the posting rate as the percentage of submissions that were posted after passing screening criteria. These criteria include the requirement that the manuscript is a full scientific research report (not a narrative review, commentary, or case report); the absence of obscenity, plagiarism, or patient identifiers; and confirmation by an affiliate (a member of the scientific community who voluntarily screens submissions) that posting would not pose potential risk to patients or public health.
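For illustration only, the minimal sketch below (in Python, using made-up records and hypothetical field names rather than medRxiv’s internal data) shows how a posting rate and the pre– versus post–COVID-19 median daily submission counts described above could be computed; it is not the analysis code used for this report.

```python
from collections import Counter
from datetime import date
from statistics import median

# Hypothetical submission log: (submission_date, was_posted) pairs standing in
# for medRxiv's internal records.
submissions = [
    (date(2019, 7, 3), True),
    (date(2019, 7, 3), False),
    (date(2019, 8, 10), True),
    (date(2020, 3, 15), True),
    (date(2020, 3, 15), True),
    (date(2020, 3, 16), True),
]

# Posting rate: percentage of submissions that passed screening and were posted.
posting_rate = 100 * sum(posted for _, posted in submissions) / len(submissions)

# Median submissions per day, split at January 1, 2020 (pre- vs post-COVID-19 periods).
# Days with no submissions are ignored in this simplified sketch.
cutoff = date(2020, 1, 1)
pre_counts = Counter(d for d, _ in submissions if d < cutoff)
post_counts = Counter(d for d, _ in submissions if d >= cutoff)

print(f"Posting rate: {posting_rate:.0f}%")
print(f"Median daily submissions, pre-COVID-19: {median(pre_counts.values())}")
print(f"Median daily submissions, post-COVID-19: {median(post_counts.values())}")
```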
We also identified the medRxiv preprints with the top Altmetric scores. We searched PubMed and CrossRef on June 30, 2020, to determine whether any posted preprints had subsequently been published in peer-reviewed journals.
In its first complete month (July 2019), medRxiv had 176 submissions, of which 116 (66%) passed screening and were posted. During the most recent month (June 2020), there were 1866 submissions, of which 1615 (87%) were posted. As of June 30, 2020, medRxiv had received 11 052 submissions, of which 7695 (70%) were posted (Figure 1), from 57 096 unique authors in 124 countries. Thus far, 22% of submissions have been revised at least once. In July 2019, there were 6800 article downloads and 25 300 abstract views, whereas in June 2020 there were 2 356 900 downloads and 5 853 600 abstract views (Figure 2).
In the pre–COVID-19 period, the median number of submissions per day was 6 (interquartile range, 4-8), in contrast to 51 (interquartile range, 23-83) in the post–COVID-19 period. COVID-19–related submissions accounted for 73% of the total posted from February through June 2020. Overall, 31% of COVID-19 submissions were not posted because they did not meet the screening criteria.
Among preprints posted through June 30, 2020, there were 28 with Altmetric scores of 3000 or greater and 90 with scores of 1000 or greater. The 20 highest Altmetric scores ranged from 3727 to 20 607. In total, 698 preprints (9%) have received comments on the medRxiv site. Overall, 14% of the preprints posted through June 30, 2020, have been published in 532 peer-reviewed journals. The median interval between posting and journal publication was 141 days for non–COVID-19 articles and 46 days for COVID-19 articles. A total of 18 preprints (0.2%) were withdrawn after posting on medRxiv, including 13 that were pandemic related.
In its first year of operation, medRxiv had 11 052 submissions. While submissions increased steadily from launch to December 2019, COVID-19–related submissions contributed to the rapid growth in 2020. To date, 14% of preprints have been published in scientific journals, but the time available for the peer-reviewed publication cycle to be completed has been short. Nine percent of preprints received comments on the medRxiv site, but these represent only a fraction of the interactions, which also occur elsewhere, such as on social media. A small percentage of preprints have been withdrawn (n = 18; 0.2%), including 13 COVID-19–related reports withdrawn since January 2020. The screening resulted in a 31% denial rate, but data on the reasons for failing the screening criteria were not collected. Future studies should evaluate medRxiv after the COVID-19 pandemic, including the extent to which preprints that are subsequently published change in response to feedback from the scientific community and peer review, and the potential influence that preprints posted to the server have had on clinical research.
Corresponding Author: Harlan M. Krumholz, MD, SM, Section of Cardiovascular Medicine, Yale School of Medicine, One Church St, Ste 200, New Haven, CT 06510 (harlan.krumholz@yale.edu).
Accepted for Publication: August 26, 2020.
Author Contributions: Drs Inglis and Ross had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: Krumholz, Bloom, Sever, Ross.
Acquisition, analysis, or interpretation of data: Krumholz, Sever, Rawlinson, Inglis, Ross.
Drafting of the manuscript: Krumholz, Bloom, Rawlinson, Inglis.
Critical revision of the manuscript for important intellectual content: Bloom, Sever, Inglis, Ross.
Statistical analysis: Inglis.
Obtained funding: Inglis.
Administrative, technical, or material support: Bloom, Sever, Inglis.
Supervision: Inglis.
Conflict of Interest Disclosures: All authors reported being cofounders of medRxiv. Dr Krumholz reported that he has contracts through Yale New Haven Hospital with the Centers for Medicare & Medicaid Services to support quality measurement programs and through Yale with UnitedHealth Group to engage in collaborative research. He was a recipient of a research grant, through Yale, from Medtronic for data sharing, from the US Food and Drug Administration to develop methods for postmarket surveillance of medical devices, from Johnson & Johnson to support data sharing, and from the Shenzhen Center for Health Information for work to advance intelligent disease prevention and health promotion. He is an advisor to the National Center for Cardiovascular Diseases in Beijing, was an expert witness for the Arnold & Porter Law Firm for work related to the Sanofi clopidogrel litigation, and is an expert witness for the Martin/Baughman Law Firm for work related to the Cook Celect IVC filter litigation and the C. R. Bard Recovery IVC filter litigation and for the Siegfried and Jensen Law Firm for work related to Vioxx litigation. He chairs a cardiac scientific advisory board for UnitedHealth; was a member of the IBM Watson Health Life Sciences Board; and is a member of the advisory board for Element Science, the healthcare advisory board for Facebook, and the physician advisory board for Aetna. He is the cofounder of HugoHealth, a personal health information platform, and cofounder of Refactor Health, an enterprise health care artificial intelligence–augmented data management company. He is a venture partner at F-Prime. Dr Bloom reported chairing the scientific advisory board of EMBL-EBI Literature Services, being on the Board of Managers of AIP Publishing, and being European coordinator for the Peer Review Congress. Dr Bloom and Ms Rawlinson are employed full time by BMJ. Dr Sever reported being the assistant director of Cold Spring Harbor Laboratory Press and a director of Life Science Alliance LLC. Dr Inglis reported being the executive director and publisher of Cold Spring Harbor Laboratory Press, a director of Life Science Alliance LLC, and a member of the advisory board of MIT Press. Dr Ross reported receiving research support through Yale University from Johnson & Johnson and from the US Food and Drug Administration, Medical Devices Innovation Consortium/National Evaluation System for Health Technology, Centers for Medicare & Medicaid Services, Agency for Healthcare Research and Quality, National Heart, Lung, and Blood Institute, and Laura and John Arnold Foundation.
Funding/Support: This work was supported by funds from Cold Spring Harbor Laboratory.
Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.
Additional Contributions: We thank Ted Roeder and Mala Mazzullo, Cold Spring Harbor Laboratory, for data collection and analysis; Samantha Hindle, PhD, Cold Spring Harbor Laboratory, Emma Ganley, PhD, Cold Spring Harbor Laboratory, John Fletcher, MBBCh, MA, MPH, BMJ, and the medRxiv team for making the project possible; and the medRxiv affiliates for their expertise in helping screen manuscripts. These individuals were not compensated for their role in the study.