Editorial
April 28, 2015

Researchers, Readers, and Reporting Guidelines: Writing Between the Lines

Author Affiliations
  • 1Deputy Editor, JAMA
  • 2Executive Editor, JAMA
JAMA. 2015;313(16):1625-1626. doi:10.1001/jama.2015.3837

Scientific research is often a result of a spark of creativity. However, formal scientific reporting and creative writing have differing goals that require different approaches. The purpose of a report of a research study is to communicate the design, execution, and findings of the study with precision and accuracy. From the perspective of a journal editor, the writing should be invisible to an informed reader, without byzantine phrasing or ambiguity of meaning. If someone needs to read a sentence multiple times to understand it, the authors and editors have failed.

The scientific report should be concise but should also provide transparency and present all of the key information required for researchers, clinicians, and other readers to be able to assess the validity of the study and its findings. Recognizing deficiencies in the quality of reports of randomized clinical trials (RCTs), a group met in 1993 to establish reporting standards for these studies, with the subsequent publication of the Standards of Reporting Trials (SORT) proposed checklist in 1994,1 followed by the Consolidated Standards of Reporting Trials (CONSORT) statement in 1996.2 Researchers and editors recognized the value of using elements of these standardized approaches, and the CONSORT guidelines became more widely embraced and are now the de facto standard for reporting RCTs. Since then, guidelines have been produced for virtually the entire range of research designs, including observational studies (STROBE), diagnostic test assessment (STARD), systematic reviews and meta-analyses (PRISMA and MOOSE), tumor marker studies (REMARK), cost-effectiveness analyses (CHEERS), and preclinical animal studies (ARRIVE), and some, like CONSORT, have been updated. The EQUATOR Network was formed to help with the development and dissemination of these guidelines and currently includes 256 reporting guidelines on its website.3

Most reporting guidelines include checklists for authors to provide to journals at the time of submission; they are often copublished with practical examples and sometimes a standalone explanation and elaboration document. While the hope is that adherence to these guidelines will result in a published article that is precise and therefore allows readers to make informed judgments, at the least such adherence helps editors and external reviewers conduct effective peer review by increasing the likelihood that critical information is included in the submitted manuscript.

Recognizing the potential value of such guidelines for the peer review and scientific publication processes, JAMA published several of the initial reporting guidelines, including CONSORT2 and MOOSE.4 While the proliferation of guidelines means that many deal with fairly narrow niches, JAMA continues to be interested in publishing guidelines that address common or emerging and important study designs. Recent examples include extensions to the CONSORT guidelines for noninferiority and equivalence designs5 and for patient-reported outcomes6; both of these reflect study designs of increasing prevalence in the clinical literature and hence progressively more important for researchers and clinicians.

In this issue of JAMA, Stewart and colleagues7 provide an extension of the PRISMA guidelines for individual participant data (IPD) meta-analyses: the PRISMA-IPD Statement. Modifications of PRISMA relate to structural elements such as the abstract, but also to issues particular to this study design, such as methods of obtaining the individual participant data, exploring data integrity, handling trials for which individual data were unavailable, and methods for data synthesis. At present, the IPD meta-analysis design represents a small percentage of all systematic reviews and meta-analyses. However, the increasing interest in data sharing8,9 is likely to lead to a proliferation of IPD meta-analyses, so this study design will be increasingly relevant to authors and readers. While meta-analyses based on either aggregate or individual participant data are susceptible to important and sometimes critical limitations,10 having standards for their reporting will help ensure clarity and transparency around these limitations so that readers’ interpretations are fully informed.

Even though reporting guidelines like PRISMA-IPD provide consensus-based expert opinion for communicating the methods and results of studies, these recommendations are guidelines, not rules, and journals may selectively not adhere to all points. For instance, JAMA does not ordinarily differentiate subtypes of meta-analyses in the article title or subtitle, as recommended by the PRISMA-IPD statement (checklist, item 1). Also, because of the uncertainty that may arise when combining data from RCTs from various sources (such as differences in the patient populations, effectiveness of randomization, application of interventions, and assessment of outcomes), the ability to draw valid causal inferences, even from pooled individual participant data, may be limited. Accordingly, JAMA considers meta-analysis to represent an observational design, such that outcomes, inferences, and interpretations should be described as associations rather than reported using causal terms such as “size of effect,” as suggested by the PRISMA-IPD statement (checklist, item 21).

In addition, reporting guidelines have an important limitation. The guidelines are generally developed through a consensus process and, while often derived from solid epidemiologic and statistical principles, they might not be seen as being evidence-based. Moreover, even though research has established improvement in the adherence to reporting guidelines,11,12 there does not appear to be evidence that such adherence has led to meaningful improvements in the engagement of readers with the research, or in patient care. Nevertheless, it is clear that use of a guideline checklist assists editors in their assessment of submitted manuscripts; makes it more likely that a published article will include the information to allow a researcher to potentially reproduce a study; and perhaps most importantly helps ensure that the article will include all of the key elements necessary for a reader to conduct a thorough critical appraisal.

Another important consideration is that a study is completed before the manuscript is submitted to a journal, so all that reporting guidelines can do for poorly designed studies is highlight their flaws. Because the guidelines act so late in the research process, it is usually too late to correct these flaws. Although there is hope that researchers who use the reporting guidelines will keep them in mind when designing future studies, that is not the overt purpose of the guidelines.

Improving the quality of study designs requires proximal intervention. To that end, the SPIRIT guidelines have been developed,13 directed toward RCT protocols. If each of the points in that guideline is followed, researchers are more likely to consider them at the inception of the study, with the potential for studies that are more likely to achieve meaningful and valid results. At the least, the SPIRIT guidelines may help granting organizations be more discerning in the projects that are funded, potentially resulting in more efficient use of limited research funds.

Ultimately, the goal of reporting guidelines, including the PRISMA-IPD Statement published in this issue of JAMA, is to help improve the quality of research and advance patient care. We encourage authors to use reporting guidelines in the planning and execution of research and in the writing of manuscripts, and also encourage journals to use them in the manuscript submission and peer review process.

Article Information
Editorials represent the opinions of the authors and JAMA and not those of the American Medical Association.

Corresponding Author: Robert M. Golub, MD, JAMA, 330 N Wabash Ave, Chicago, IL 60611 (robert.golub@jamanetwork.org).

Conflict of Interest Disclosures: The authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

References
1.
The Standards of Reporting Trials Group.  A proposal for structured reporting of randomized controlled trials. JAMA. 1994;272(24):1926-1931.
2.
Begg  C, Cho  M, Eastwood  S,  et al.  Improving the quality of reporting of randomized controlled trials: the CONSORT statement. JAMA. 1996;276(8):637-639.
3.
EQUATOR Network. http://www.equator-network.org. Accessed March 31, 2015.
4.
Stroup  DF, Berlin  JA, Morton  SC,  et al; Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Group.  Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA. 2000;283(15):2008-2012.
5.
Piaggio  G, Elbourne  DR, Pocock  SJ, Evans  SJ, Altman  DG; CONSORT Group.  Reporting of noninferiority and equivalence randomized trials: extension of the CONSORT 2010 statement. JAMA. 2012;308(24):2594-2604.
6.
Calvert  M, Blazeby  J, Altman  DG, Revicki  DA, Moher  D, Brundage  MD; CONSORT PRO Group.  Reporting of patient-reported outcomes in randomized trials: the CONSORT PRO extension. JAMA. 2013;309(8):814-822.
7.
Stewart  LA, Clarke  M, Rovers  M,  et al.  Preferred reporting items for a systematic review and meta-analysis of individual participant data: the PRISMA-IPD statement. JAMA. doi:10.1001/jama.2015.3656.
8.
Lo  B.  Sharing clinical trial data: maximizing benefits, minimizing risk. JAMA. 2015;313(8):793-794.
9.
Institute of Medicine (IOM) Committee on Strategies for Responsible Sharing of Clinical Trial Data. IOM website. http://www.iom.edu/activities/research/sharingclinicaltrialdata.aspx. 2015. Accessed March 31, 2015.
10.
Berlin  JA, Golub  RM.  Meta-analysis as evidence: building a better pyramid. JAMA. 2014;312(6):603-605.
11.
Hopewell  S, Ravaud  P, Baron  G, Boutron  I.  Effect of editors’ implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.
12.
Korevaar  DA, van Enst  WA, Spijker  R, Bossuyt  PM, Hooft  L.  Reporting quality of diagnostic accuracy studies: a systematic review and meta-analysis of investigations on adherence to STARD. Evid Based Med. 2014;19(2):47-54.
13.
Chan  A-W, Tetzlaff  JM, Altman  DG,  et al.  SPIRIT 2013 Statement: defining standard protocol items for clinical trials. Ann Intern Med. 2013;158(3):200-207.