Peer Review Congress
July 15, 1998

Evaluating the BMJ Guidelines for Economic Submissions: Prospective Audit of Economic Submissions to BMJ and The Lancet

Author Affiliations

From the Ministry of Defence, United Kingdom (Dr Jefferson and Mr Pratt), and BMJ (Drs Smith and Gale), London, England; and the Centre for Health Economics, University of York, York, England (Ms Yee and Dr Drummond).

JAMA. 1998;280(3):275-277. doi:10.1001/jama.280.3.275

Context.— Editorial management of articles on health economics may benefit from guidelines for peer review and revision.

Objective.— To assess whether publication (in August 1996) of the BMJ guidelines on peer review of economics submissions made any difference to editorial and peer review processes, quality of submitted manuscripts, and quality of published manuscripts.

Design and Setting.— Before-after study conducted in the editorial offices of BMJ and The Lancet of the effect of the BMJ guidelines on review and revision of economics submissions, defined as those making explicit comments about resource allocation and/or costs of interventions.

Main Outcome Measures.— Editorial fate and changes in the quality of submissions.

Results.— A total of 2982 manuscripts were submitted to the 2 journals during the before periods, 105 (3.5%) of which were economic submissions. Of these, 27 (25.7%) were full economic evaluations, and 78 (74.3%) were other economic submissions. The overall acceptance rate was 11.4% (12/105). During the after period, 2077 manuscripts were submitted to the 2 journals, 87 (4.2%) of which were economic submissions. Eighteen (20.7%) were full economic evaluations, and 69 (79.3%) were other economic submissions. The overall acceptance rate was 6.9% (6/87). Although a number of manuscripts could not be traced to determine whether they were economic submissions, there appeared to be little difference between the 2 journals in the numbers or editorial fate of the manuscripts. There was no change in the quality of submitted manuscripts, but BMJ editors found the guidelines and checklists useful and sent fewer economic submissions for external peer review in the after phase.

Conclusions.— Publication of the guidelines helped the BMJ editors improve the efficiency of the editorial process but had no impact on the quality of economics evaluations submitted or published.

HEALTH ECONOMICS literature has increased in the last 10 years,1 but many of the published articles are of low quality.2-5 Poor economic evaluations are not only wasteful of scarce resources but also misleading. Although guidelines for economic evaluations have been promulgated by governments,6-8 the pharmaceutical industry, and groups of researchers,9 little is known about editorial practices relating to economic studies.10 Furthermore, few medical journals have written policies, and even fewer have a clear idea of what a "good" economic evaluation is.11 In January 1995, BMJ formed a working party with the aim of improving the quality of submitted and published economic articles by clarifying acceptable methods and their systematic application throughout the peer review process. The guidelines (published in August 1996) provided referees and editors with checklists discussing how manuscripts should be approached and algorithms of the editorial process with an explanatory text.12 We report the results of our study assessing whether publication of the BMJ guidelines made any difference to (1) the quality of submitted manuscripts, (2) the quality of published manuscripts, and (3) the editorial and peer review process.


All submissions with an economic content (those making explicit comments about resource allocation and/or costs of interventions) made during the periods July 1 to September 30, 1994, to BMJ and October 1 to December 31, 1995, to BMJ and The Lancet were included in the "before" phase of the study. Submissions to BMJ from July 1 to July 31, 1997, and to The Lancet from April 1 to July 1, 1997, were included in the "after" phase. Study periods were chosen on the basis of high manuscript volume. The Lancet was chosen as a control journal because its scope and size are similar to those of BMJ and because no concerted effort was made there to promulgate the guidelines.

We recorded the editorial fate of submissions (eg, rejected without external review, rejected after review, or published) and undertook an unblinded assessment of the quality of submitted and published economic evaluations using the referees' checklist included in the guidelines. This checklist contains 35 items important for reporting the results of economic evaluations.12 For each item the 4 possible outcomes were (1) yes, the authors dealt with the issue; (2) no, they did not; (3) not clear; and (4) the issue was not applicable to that particular submission. Therefore, the sum of the "no" and "not clear" columns gives an indication of the extent to which issues were not dealt with; a score of 35 would represent a paper failing on all items.

To assess changes in the editorial process, we sent questionnaires to the editorial staff of both journals. These contained questions about whether editorial staff were aware of the guidelines, had used the editors' or referees' checklists, and had found the guidelines useful in the editorial process.


Submitted manuscripts were classified as either economic evaluations (EEs), in which the costs and consequences of 2 or more alternatives were compared, or other economic submissions (OESs), including partially analytical, descriptive, or methodological papers. The full classification has been described elsewhere.13 In the "before" period, 1642 and 1340 manuscripts were submitted to BMJ and The Lancet, respectively (433 and 1644 in the "after" period). Most manuscripts were traced and assessed to determine whether they constituted economic submissions. However, a number of manuscripts could not be traced, for reasons including incorrect sequential submission numbering, editorial delay, and withdrawal.

Editorial Fate of Economic Submissions

Figure 1 outlines the number of submissions to both journals in the "before" and "after" periods and their editorial fate. Full EEs constituted 23% of all economic submissions. Although more were submitted to The Lancet, fewer were sent out for external peer review and none were published. The main change was in the treatment of OESs by BMJ. In the "before" period, 52 such submissions were received, 20 (38%) were sent for peer review, and 2 were accepted. In the "after" period, only 5 (13%) of 38 were sent for peer review, and 1 was accepted, suggesting that the guidelines may have improved the efficiency of peer review, with no discernible impact on the editorial fate of EEs at either journal.

Figure 1.—Economic submissions to BMJ and The Lancet and their fate. RNT indicates rejected without peer review; PR, all peer-reviewed manuscripts; and A, accepted.
Changes in the Editorial Process

The overall results of the questionnaire (Table 1) show that the guidelines were widely promulgated among BMJ editors and were perceived to be useful. Few of The Lancet's editors were aware of the guidelines. At BMJ the guidelines seemed mainly to help editors with the internal peer review process, as there was little evidence that editors had sent the referees' checklist to those refereeing papers.

Table 1.—Usefulness of the BMJ Guidelines to Editors of BMJ and The Lancet*
Quality of Submitted and Published Papers

The guidelines had no apparent impact on the quality of the 43 EEs submitted to the journals (Table 2). Because of the small numbers, no conclusion can yet be drawn on whether the guidelines contributed to any improvements observed between submission and final publication. The most frequent defects in the economic evaluations submitted were (1) failure to state the research question (36 studies, or 84%); (2) failure to justify the choice of the form of economic evaluation in relation to the question being asked (40 studies, or 93%); (3) failure to give details of price adjustments for inflation or currency conversion (34 studies, or 79%); (4) failure to justify the choice of discount rate (36 studies, or 84%); and (5) failure to state the approach to sensitivity analysis (42 studies, or 98%).

Table 2.—Number of Items on the Reviewers' Checklist Either Not Satisfied or Not Clear*

Although this may be the first attempt to evaluate the impact of economic methodological guidelines, the study had several important shortcomings. We collected few manuscripts relating to the "after" period from BMJ, as manuscripts submitted prior to June 1997 had been shredded and manuscripts submitted from August 1997 onward were still undergoing editorial assessment. Missing manuscripts and the exclusive focus on the management of economic submissions at only 2 journals may have introduced bias into the study results; hence the absence of statistical analysis of the data.

There was some evidence that the editorial process had improved at BMJ: editors found the guidelines helpful, and potentially wasteful peer review of OESs was reduced. In the "before" period, BMJ put 20 manuscripts (38%) in this category through to peer review but published only 2. In the "after" period, only 5 (13%) were sent for peer review and 1 paper was published, suggesting greater in-house confidence in dealing with such submissions. At The Lancet, where knowledge of the guidelines was limited, 35% of OESs were sent for peer review in the "before" period, rising to 45% in the "after" period, with 4 and 5 papers published, respectively.

There is no evidence that the guidelines improved the quality of submitted or published articles. Application of the referees' checklist confirmed that the quality of the economic evaluation literature is disappointing. There was no change in the quality scores "before" and "after," probably because, although BMJ editors were made aware of the guidelines, they were not asked to distribute the guidelines or referees' checklist when sending papers out for peer review. Only a minority of referees used the checklist as a basis for their review, and authors took little notice of the guidelines when preparing their manuscripts, possibly because of ignorance, disagreement, or the impression that BMJ was unlikely to enforce them. This suggests that a more sustained educational effort is required to change authors' and editors' practices. A more rigorous prospective study of the impact of the guidelines could be conducted, in which editors and/or referees would be randomly assigned to use the checklists.

1. Elixhauser A. Health care cost-benefit and cost-effectiveness analysis (CBA/CEA) from 1979 to 1990: a bibliography. Med Care. 1993;31(suppl):JS1-JS149.
2. Gerard K. Cost-utility in practice: a policy maker's guide to the state of the art. Health Policy. 1992;21:249-279.
3. Udvarhelyi S, Colditz GA, Rai A, Epstein AM. Cost-effectiveness and cost-benefit analysis in the medical literature: are methods being used correctly? Ann Intern Med. 1992;116:238-244.
4. Adams ME, McCall NT, Gray DT, Orza MJ, Chalmers TC. Economic analysis in randomized control trials. Med Care. 1992;30:231-238.
5. Jefferson TO, Demicheli V. Is vaccination against hepatitis B efficient? a review of world literature. Health Econ. 1994;3:25-37.
6. Commonwealth Department of Health, Housing, and Community Services. Guidelines for the Pharmaceutical Industry on Preparation of Submissions to the Pharmaceutical Benefits Advisory Committee. Canberra: Australian Government Publishing Service; 1992.
7. Detsky AS. Guidelines for economic analysis of pharmaceutical products: a draft for Ontario and Canada. Pharmacoeconomics. 1993;3:354-361.
8. Canadian Co-ordinating Office for Health Technology Assessment (CCOHTA). Guidelines for Economic Evaluation of Pharmaceuticals. Ottawa, Ontario: CCOHTA; 1994.
9. Gold MR, Siegel JE, Russell LB, Weinstein MC. Cost-effectiveness Analysis in Health and Medicine. New York, NY: Oxford University Press; 1996.
10. Kassirer JP, Angell M. The journal's policy on cost-effectiveness analyses. N Engl J Med. 1994;331:669-670.
11. Jefferson TO, Demicheli V. Are guidelines for peer-reviewing economic evaluations necessary? a survey of current editorial practice. Health Econ. 1995;4:383-388.
12. Drummond MF, Jefferson TO, for the BMJ Working Party on Guidelines for Authors and Peer-Reviewers of Economic Submissions to the British Medical Journal. Guidelines for authors and peer-reviewers of economic submissions to the British Medical Journal. BMJ. 1996;313:275-283.
13. Jefferson TO, Demicheli V, Entwistle V. Assessing quality of economic submissions to the BMJ. BMJ. 1995;311:393-394.