Patsopoulos NA, Analatos AA, Ioannidis JPA. Relative Citation Impact of Various Study Designs in the Health Sciences. JAMA. 2005;293(19):2362-2366. doi:10.1001/jama.293.19.2362
Author Affiliations: Clinical and Molecular Epidemiology Unit and Clinical Trials and Evidence Based Medicine Unit, Department of Hygiene and Epidemiology, University of Ioannina School of Medicine, Ioannina, Greece (Drs Patsopoulos and Ioannidis and Mr Analatos); Department of Immunology and Histocompatibility, Faculty of Medicine, University of Thessaly, Larissa, Greece (Dr Patsopoulos and Mr Analatos); Institute for Clinical Research and Health Policy Studies, Department of Medicine, Tufts-New England Medical Center, Tufts University School of Medicine, Boston, Mass (Dr Ioannidis).
Context The relative merits of various study designs and their placement in
hierarchies of evidence are often discussed. However, there is limited knowledge
about the relative citation impact of articles using various study designs.
Objective To determine whether the type of study design affects the rate of citation
in subsequent articles.
Design and Setting We measured the citation impact of a sample of 2646 articles published in 1991 and in 2001 that used various study designs—including meta-analyses, randomized controlled trials, cohort studies, case-control studies, case reports, nonsystematic reviews, and decision analysis or cost-effectiveness analysis.
Main Outcome Measure The citation count through the end of the second year after the year of publication, and the total number of citations received.
Results Meta-analyses received more citations than any other study design both
in 1991 (P<.05 for all comparisons) and in 2001
(P<.001 for all comparisons) and both in the first
2 years and in the longer term. More than 10 citations in the first 2 years
were received by 32.4% of meta-analyses published in 1991 and 43.6% of meta-analyses
published in 2001. Randomized controlled trials did not differ significantly
from epidemiological studies and nonsystematic review articles in 1991 but
clearly became the second most cited study design in 2001. Epidemiological studies,
nonsystematic review articles, and decision and cost-effectiveness analyses
had relatively similar impact; case reports received negligible citations.
Meta-analyses were cited significantly more often than all other designs after
adjusting for year of publication, high journal impact factor, and country
of origin. When limited to studies addressing treatment effects, meta-analyses
received more citations than randomized trials.
Conclusion Overall, the citation impact of various study designs is commensurate
with most proposed hierarchies of evidence.