Figure. Primary listed outcome of 664 RCTs, organized by intended enrollment size. Most studies have intended enrollment fewer than 500 patients, and the most common primary end points address (1) clinical course, (2) infection rates, or (3) persistence of viral detection.
Pundi K, Perino AC, Harrington RA, Krumholz HM, Turakhia MP. Characteristics and Strength of Evidence of COVID-19 Studies Registered on ClinicalTrials.gov. JAMA Intern Med. 2020;180(10):1398–1400. doi:10.1001/jamainternmed.2020.2904
The coronavirus disease 2019 (COVID-19) pandemic has led to a massive activation of clinical research. The methodological strength of these studies is not well characterized but has implications for the quality of evidence produced. We evaluated the characteristics and expected strength of evidence of COVID-19 studies registered on ClinicalTrials.gov.
For this cross-sectional analysis, we searched ClinicalTrials.gov on May 19, 2020, using the terms COVID-19, SARS-CoV-2, 2019-nCoV, 2019 novel coronavirus, and severe acute respiratory syndrome coronavirus 2 and extracted all structured data fields.1 We excluded withdrawn, suspended, terminated, or expanded-access studies. We categorized reported outcomes and graded studies using the 2011 Oxford Centre for Evidence-Based Medicine (OCEBM) level of evidence framework.2 A single reviewer (K.P.) verified studies for inclusion and removed duplicates, and 2 reviewers (A.C.P. and M.P.T.) audited results. This study followed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.
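The screening step above can be sketched in code. This is an illustrative sketch, not the authors' pipeline: the field names (`NCTId`, `StudyType`, `OverallStatus`) mirror ClinicalTrials.gov's structured data fields, but the sample records and helper names are hypothetical.

```python
"""Sketch of the registry search-and-screen step described in the Methods."""
from collections import Counter

# Search terms listed in the Methods.
SEARCH_TERMS = [
    "COVID-19",
    "SARS-CoV-2",
    "2019-nCoV",
    "2019 novel coronavirus",
    "severe acute respiratory syndrome coronavirus 2",
]

# Registration statuses excluded from the analysis, per the Methods.
EXCLUDED_STATUSES = {"Withdrawn", "Suspended", "Terminated"}


def build_query(terms):
    """Combine search terms into a single OR'd registry search expression."""
    return " OR ".join(f'"{t}"' for t in terms)


def screen(records):
    """Drop excluded statuses, deduplicate by NCT number, tally study type.

    Each record is a dict with hypothetical keys NCTId, StudyType,
    and OverallStatus.
    """
    seen, kept = set(), []
    for rec in records:
        if rec["OverallStatus"] in EXCLUDED_STATUSES:
            continue
        if rec["NCTId"] in seen:  # duplicate registration of the same study
            continue
        seen.add(rec["NCTId"])
        kept.append(rec)
    return kept, Counter(r["StudyType"] for r in kept)


if __name__ == "__main__":
    sample = [
        {"NCTId": "NCT00000001", "StudyType": "Interventional", "OverallStatus": "Recruiting"},
        {"NCTId": "NCT00000001", "StudyType": "Interventional", "OverallStatus": "Recruiting"},  # duplicate
        {"NCTId": "NCT00000002", "StudyType": "Observational", "OverallStatus": "Recruiting"},
        {"NCTId": "NCT00000003", "StudyType": "Interventional", "OverallStatus": "Withdrawn"},
    ]
    kept, tally = screen(sample)
    print(len(kept), dict(tally))
```

In the study itself this screening was performed and audited manually by the named reviewers; the sketch only shows the deterministic parts (status exclusion and NCT-number deduplication).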
We identified 1551 studies registered from March 1, 2020, to May 19, 2020, meeting inclusion criteria: 911 (58.7%) interventional (including 664 randomized clinical trials [RCTs]) and 640 (41.3%) observational studies (Table); 1180 (76.1%) were single center. Frequently reported primary and secondary outcomes included mortality (526 [33.9%]), ventilation requirement (413 [26.6%]), and treatment complications (359 [23.1%]). Of the 1551 studies, 451 (29.1%) could potentially yield OCEBM level 2 evidence, the highest level of individual study evidence.2
Across 664 RCTs, the primary outcome most frequently pertained to clinical course (323 [48.6%]); 51 (7.7%) had a primary outcome of mortality, and 42 (6.3%) had a composite end point including mortality (Figure). Blinding (required for OCEBM level 2 evidence) was reported for 364 RCTs; 195 (29.3%) were placebo-controlled, 238 (35.8%) planned enrollment of more than 100 participants, and 113 (17.0%) reported at least 2 study centers or sites. Only 75 RCTs (11.3%) were placebo-controlled and blinded with at least 2 study centers (60 with enrollment >100 participants; 24 with >500 participants). Most RCTs evaluated drugs and biologic compounds (486 [73.2%]); 155 (23.3%), hydroxychloroquine or chloroquine; 7 (1.1%), remdesivir; 48 (7.2%), other antivirals; 21 (3.2%), tocilizumab; and 20 (3.0%), corticosteroids.
Of the 640 observational studies, 517 (80.8%) were single center and 123 (19.2%) were multicenter, 36 of which had 10 or more centers. Eighty-seven studies (13.6%) were prospective cohort studies that could yield level 2 evidence.
Although a few large multicenter trials may generate high-quality evidence, the large proportion of studies with an expected low level of evidence is concerning. Rapid dissemination of studies with low-quality evidence can influence public opinion, government actions, and clinical practice in potentially harmful ways,3 especially with a rising tide of COVID-19 study dissemination via preprint or other strategies ahead of peer review.
A number of measures can mitigate these issues. Preprint results could be accompanied by transparent data sharing ahead of peer review. Rapidly deployable systems for multicenter registries and trials should be created but with an emphasis on quality, not just speed. These systems could be activated for global health crises, leading to streamlined operations for international study coordination, data sharing, and central institutional review boards. These systems can also combine and harmonize similar observational studies into large multicenter studies or embed randomization or pragmatic features when possible. The World Health Organization’s 100-country, adaptive Solidarity trial comparing 4 treatment arms uses a common data platform and operations, but results may not be available for months.4,5 Finally, we urge institutional review boards to work with investigators to ensure that experimental research involving human participants is sufficiently well designed to achieve the goal of generating clinically meaningful evidence.
Our study has important limitations. Current regulations only require drug, device, or biological studies to register with ClinicalTrials.gov. Half of non-US studies are estimated to not be registered with ClinicalTrials.gov,6 and OCEBM is most accurately applied to completed studies.
This cross-sectional study found that despite the marked rise in COVID-19 studies, only 29.1% of those registered in ClinicalTrials.gov have the potential to result in OCEBM level 2 evidence. Of the RCTs, only 29.3% are placebo-controlled, blinded studies. Global decline in new cases could also stall enrollment. Even before results are known, most studies likely will not yield meaningful scientific evidence at a time when rapid generation of high-quality knowledge is critical.
Accepted for Publication: May 27, 2020.
Corresponding Author: Mintu Turakhia, MD, MAS, Department of Medicine, Stanford University School of Medicine, 3801 Miranda Ave, Room 111C, Palo Alto, CA 94304 (email@example.com).
Published Online: July 27, 2020. doi:10.1001/jamainternmed.2020.2904
Author Contributions: Drs Pundi and Turakhia had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: Pundi, Perino, Turakhia.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Pundi, Perino, Turakhia.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Pundi, Turakhia.
Obtained funding: Turakhia.
Administrative, technical, or material support: Harrington, Turakhia.
Supervision: Perino, Harrington, Turakhia.
Conflict of Interest Disclosures: Dr Pundi reported receiving research grants from the American Heart Association (AHA) outside the submitted work. Dr Perino reported receiving research support from the AHA and Bristol-Myers Squibb/Pfizer. Dr Harrington reported serving on the AHA Board of Directors (unpaid) and on the Stanford Healthcare Board of Directors from 2016 to 2018 (unpaid). Dr Krumholz reported working under contract with the Centers for Medicare & Medicaid Services to support quality measurement programs; receiving research grants/agreements through Yale University from Medtronic plc and the US Food and Drug Administration to develop methods for postmarket surveillance of medical devices, from Johnson & Johnson to support clinical trial data sharing, and from the Shenzhen Center for Health Information for work to advance intelligent disease prevention and health promotion; collaborating with the National Center for Cardiovascular Diseases in Beijing; receiving payments from the Arnold & Porter law firm for work related to the Sanofi SA clopidogrel litigation, from the Martin Baughman law firm for work related to the Cook Celect inferior vena cava filter litigation, and from the Siegfried & Jensen law firm for work related to rofecoxib (Vioxx) litigation; chairing a cardiac scientific advisory board for UnitedHealth; serving as a member of the IBM Watson Health Life Sciences Board; serving as a member of the Advisory Board for Element Science, the Advisory Board for Facebook, and the Physician Advisory Board for Aetna; and being the cofounder of HugoHealth, a personal health information platform, and Refactor Health, an enterprise health care artificial intelligence–augmented data management company. 
Dr Turakhia reported receiving grants from Apple Inc, Janssen Pharmaceuticals, Boehringer Ingelheim, Bristol-Myers Squibb, the AHA, and SentreHeart, personal fees from Medtronic plc, Abbott Laboratories, iBeat Inc, iRhythm Technologies, Novartis International AG, Biotronik, Sanofi-Aventis, and Pfizer, Inc, and equity from AliveCor outside the submitted work. No other disclosures were reported.