Table 1. Swiss Stakeholder Views on Reviewing the Existing Research
Table 2. International Funders’ Views on Reviewing the Existing Research
Original Investigation
Statistics and Research Methods
November 30, 2021

Barriers and Facilitating Factors for Conducting Systematic Evidence Assessments in Academic Clinical Trials

Author Affiliations
  • 1Department of Clinical Research, Basel Institute for Clinical Epidemiology and Biostatistics, University of Basel and University Hospital Basel, Basel, Switzerland
  • 2Institute of History and Ethics in Medicine, TUM School of Medicine, Technical University of Munich, Munich, Germany
  • 3Cochrane Austria, Department for Evidence-based Medicine and Evaluation, Danube University Krems, Krems, Austria
  • 4Meta-Research Innovation Center at Stanford, Stanford University, Stanford, California
  • 5Meta-Research Innovation Center Berlin, Berlin Institute of Health, Berlin, Germany
  • 6Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
JAMA Netw Open. 2021;4(11):e2136577. doi:10.1001/jamanetworkopen.2021.36577
Key Points

Question  What are the practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials?

Findings  In this qualitative study in which 48 Swiss clinical trial stakeholders and 9 international funders were interviewed, responses varied widely regarding how previous evidence should be summarized and assessed when planning a new clinical trial. A lack of obligation, time, competent support, and financial resources was identified as a barrier to an evidence-based approach.

Meaning  There may be a need for more explicit requirements from funders and ethics committees to clarify the level of comprehensiveness needed in summarizing existing evidence for different types of clinical trials.

Abstract

Importance  A systematic assessment of existing research should justify the conduct and inform the design of new clinical research but is often lacking. There is little research on the barriers to and factors facilitating systematic evidence assessments.

Objective  To examine the practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials.

Design, Setting, and Participants  In this qualitative study, individual semistructured qualitative interviews were conducted between February and August 2020 with 48 Swiss stakeholders from 4 groups (27 primary investigators, 9 funders and sponsors, 6 clinical trial support organizations, and 6 ethics committee members) and between January and March 2021 with 9 international funders of clinical trials from North America and Europe with a reputation for requiring systematic evidence synthesis in applications for academic clinical trials.

Main Outcomes and Measures  The main outcomes were practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials. Interviews were analyzed using conventional content analysis.

Results  Of the 57 participants, 40 (70.2%) were male. Participants universally acknowledged that a comprehensive understanding of the previous evidence is important but reported wide variation regarding how this should be achieved. Participants reported that the conduct of formal systematic reviews was currently not expected before most clinical trials, but most international funders reported expecting a systematic search for the existing evidence. Although time and resources were reported by all participants as barriers to conducting systematic reviews, the Swiss research ecosystem was reported to be less supportive of a systematic approach than international settings.

Conclusions and Relevance  In this qualitative study, Swiss stakeholders and international funders generally agreed that new clinical trials should be justified by a systematic evidence assessment but reported that barriers on individual, organizational, and political levels kept them from implementing such assessments. More explicit requirements from funders appear to be needed to clarify the required level of comprehensiveness in summarizing existing evidence for different types of clinical trials.

Introduction

New clinical research requires a systematic assessment of the existing evidence; unnecessary or poorly informed clinical research is costly and unethical, limits the available funding for relevant and well-designed research, and diminishes the public’s trust in science.1-5 A lack of a systematic assessment of prior research has led to thousands of patients being recruited into clinical trials and inadequately treated with no treatment or inferior control treatment well after the tested intervention was known to be effective.6-8 Nonsystematic reviews of the literature also often involve bias, with previous studies being selected to justify new research based on strategic considerations and preferences of the investigators, more often citing studies that have supportive, positive, and statistically significant findings vs those that have conflicting, negative, and nonsignificant findings.9-11 However, previous studies indicate that an evidence-based approach to clinical research remains insufficiently used.12-17

Previous research regarding the barriers and facilitating factors for using systematic reviews to justify and inform the design of new clinical trials is currently limited (search details are provided in eAppendix 1 in the Supplement). A survey conducted among delegates of the International Clinical Trials Methodology Conference in November 2015 found that time constraints were perceived as the biggest barrier to the use of systematic evidence synthesis when designing a new trial, followed by a belief that the trial was the first in the research area, a belief that previous trials were different from the current trial, financial constraints, and the fact that funders did not require systematic evidence synthesis.18 However, the response rate of the survey was only 17% (106 of 638 delegates), and 95% of registered delegates were from the UK and Ireland,18 severely limiting the validity and applicability of the survey.

In Switzerland, the need to support high-quality academic clinical trials has been increasingly recognized; a network of clinical trial support units was initiated in 2007,19 and the Swiss National Science Foundation has implemented a yearly program for investigator-initiated clinical trials since 2016.20 However, it is currently not known to what extent an evidence-based approach in academic clinical trials in Switzerland is required by funders or implemented in practice. Furthermore, previous research suggests that some international funders currently require a systematic review when applying for financial support for a new clinical trial.21-24 These funders are typically members of the Ensuring Value in Research Funders’ Forum,25 an organization aiming to help health research funders increase the value of their research. One of the principles endorsed by all members of the forum is that “Research should only be funded if set in the context of one or more existing systematic reviews of what is already known or an otherwise robust demonstration of a research gap.”25 However, there is a lack of research examining what these funders require in practice and their experience implementing an evidence-based approach.

Although the lack of an evidence-based approach in clinical research is well known and there is general agreement about its importance,12-17 it remains unclear how to best achieve widespread and sustainable implementation. A more thorough understanding of barriers and facilitating factors from the different perspectives of all relevant stakeholders is needed. This study therefore aimed to examine the practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials.

Methods

For this qualitative study, the study design and data collection did not require approval of an ethics committee according to Articles 1 and 2 of the Federal Act on Research Involving Human Beings in Switzerland. Verbal informed consent was obtained from participants at the start of the interview. The study followed the Consolidated Criteria for Reporting Qualitative Research (COREQ) reporting guideline.26 An extended methods section is provided in eAppendix 2 in the Supplement.

Research Team and Reflexivity

Interviews were primarily conducted by S.M., a male senior researcher in biomedical ethics. One interview was conducted by M.B., a male physician and senior scientist in clinical epidemiology. Both interviewers have long-standing experience with qualitative research in the context of clinical research and evidence-based medicine.27-34 The interviewers had already had contact with some of the Swiss stakeholders from previous research studies. Otherwise, no relationship existed between the interviewers and the other participants before the study, and participants received limited information about interviewers.

Study Design

Participants were primarily selected through purposive sampling35 to ensure sample diversity according to predetermined factors (eg, field of expertise). Additional participants were identified using snowball sampling.36 Participants were contacted by email, and interview dates were arranged with those willing to participate. The study consisted of 57 participants from 2 different samples. In the first sample, 48 Swiss stakeholders from 4 different groups were recruited: 27 primary investigators, 9 funders and sponsors, 6 clinical trial support organizations, and 6 ethics committee members. In the second sample, 9 international funders from North America and Europe with a reputation for implementing an evidence-based approach for clinical trials were recruited.24 Interviews were conducted between February and August 2020 with Swiss stakeholders and between January and March 2021 with international funders. One participant provided their response in writing via email; the remaining interviews were conducted via a telephone or video call. All interviews were conducted in English. Only the participant and the researcher were present during the interview. A researcher-developed semistructured interview guide was created for each group to guide the discussion (eAppendix 3 in the Supplement). The interview guide was piloted in the first 2 interviews. Because no problems were identified, no further piloting or adaptation of the interview guides was deemed necessary. Interviews were audio-recorded and had a mean duration of 29 minutes (range, 12-62 minutes) with Swiss stakeholders and 30 minutes (range, 22-44 minutes) with international funders. After 57 interviews, data saturation was discussed by the research team.37 It was concluded that no new substantive themes were being expressed by the participants. Transcriptions of the interviews were returned to all participants with an invitation to review them and send any corrections or clarifications; 8 responses were received with minor corrections to syntax.
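As a quick consistency check, the subgroup sizes reported above add up to the stated totals:

$$27 + 9 + 6 + 6 = 48 \text{ Swiss stakeholders}, \qquad 48 + 9 = 57 \text{ participants in total}.$$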

Data Analysis

Using the interview transcriptions in their original language, S.M. performed conventional content analysis with the assistance of the qualitative software MAXQDA, version 11 (VERBI Software). Analysis commenced while interviews were ongoing and used an iterative approach in which initial codes common across participants as well as those unique to individuals were identified using a process of open coding and developed as the analysis progressed. B.N.-S., L.G.H., and M.B. reviewed the initial analysis to clarify and refine codes, and conversations among the investigators continued until coding differences were resolved and consensus was achieved. Findings are presented as higher- and lower-level categories in a coding frame.
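The coding frame described above (higher-level categories containing lower-level categories, each collecting coded interview segments) can be pictured as a simple nested data structure. The following minimal Python sketch is purely illustrative: the analysis itself was performed in MAXQDA, and all category names, participant identifiers, and quotes in the sketch are hypothetical.

```python
from collections import defaultdict

# Hypothetical coding frame: higher-level category -> lower-level category -> coded segments.
# Illustrative only; the study's analysis was performed with MAXQDA, not with this script.
coding_frame = defaultdict(lambda: defaultdict(list))

def code_segment(higher, lower, participant_id, segment):
    """Assign an interview segment to a lower-level category nested under a higher-level one."""
    coding_frame[higher][lower].append(f"{participant_id}: {segment}")

# Invented example segments (not actual study quotes).
code_segment("Barriers", "Practical challenges", "PI-03",
             "We do not have funded time for a full systematic review.")
code_segment("Barriers", "Lack of enforcement", "EC-01",
             "We check the literature summary, but it is the investigator's responsibility.")
code_segment("Facilitating factors", "Funder requirements", "F-02",
             "If it were mandatory for funding, people would do it.")

# Summarize how many coded segments fall under each pair of categories.
for higher, lowers in coding_frame.items():
    for lower, segments in lowers.items():
        print(f"{higher} / {lower}: {len(segments)} segment(s)")
```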

Results
Current Practice

Of the 57 participants, 40 (70.2%) were male. Swiss participants universally acknowledged the importance of having a comprehensive understanding of the previous evidence when designing and justifying a new clinical trial. However, it was reported that most investigators in Switzerland were not currently conducting systematic reviews, and it was estimated that systematic reviews were conducted for only 10% to 30% of trials. Many participants disagreed that systematic reviews were always necessary before a new clinical trial and reported that the need for a systematic review depended on the type of clinical trial and available evidence. Only a few participants said a systematic review should always be done before a new clinical trial if no systematic review had recently been published on the topic by other researchers (Table 1).

In contrast, international funders reported a clear expectation that academic clinical trial proposals should include some form of systematic evidence synthesis (Table 2). However, participants reported large variations in terms of the type of evidence synthesis required and when it should be required.

Types of Evidence Assessment Required

The level of comprehensiveness that participants expected in summaries of existing evidence typically depended on the type of clinical trial being funded. A funder of large, late-stage multicenter clinical trials reported that they expected a formal systematic review to be provided; this was seen as important owing to the focus on public funding and accountability in their country, with systematic reviews being part of opening the scientific process to public scrutiny. However, most funders of smaller, earlier-stage clinical trials reported that they did not mandate formal systematic reviews and only required a systematic search of the literature (including the search strategy and databases used). Nevertheless, 2 international funders who were members of the Ensuring Value in Research Funders’ Forum reported that they did not have explicit requirements regarding evidence assessment and left it to investigators to decide what to provide.

When Evidence Assessment Is Required

Most funders reported having a 2-step funding process. Although 1 funder required systematic evidence assessments to be provided at the preapplication stage, most funders required them at the full application stage. Participants from the latter group reported that the key reasons for requesting the evidence assessment at the second step were the limited space available in preapplications and the desire to reduce the workload for the funder and for applicants who would not be funded.

Barriers to an Evidence-Based Approach

Swiss participants identified 3 key barriers that may explain why systematic reviews are used infrequently.

  • The first barrier was investigator perceptions. Participants reported that various views of principal investigators led them to not conduct systematic reviews before a new trial, including the view that they were experts in the field and already knew the relevant research; the view that their study was very innovative, and therefore, a systematic review did not make sense because of the limited evidence available; and the belief that an evidence-based approach could be achieved without a formal systematic review.

  • The second barrier was practical challenges. Some practical challenges were also identified by participants as reasons why systematic reviews were not conducted, including systematic reviews being very time consuming; a lack of specific funds, which led to systematic reviews often being done without funding; insufficient personnel with the right expertise to conduct systematic reviews; and a lack of awareness and knowledge of systematic reviews among investigators and funding panels.

  • The third barrier was lack of enforcement. Participants also reported that none of the funders of academic clinical trials in Switzerland were currently requiring a systematic review, that the method for reviewing the existing research was left to principal investigators, and that funders relied on the expertise of reviewers to evaluate whether the information presented in applications was sufficient. Participants also reported that ethics committees did not require systematic reviews and that although ethics committees checked whether the literature review conducted was generally sufficient and might ask for more information if the review was clearly inadequate, they ultimately saw this as the responsibility of the principal investigator and outside the task of ethics committees.

International funders did not report perceiving substantial barriers. Investigators were seen to have sufficient awareness and knowledge and generally accepted the funder’s expectations regarding evidence assessment. The practical challenges of time and resources were identified as the main barriers to formal systematic reviews. Because formal systematic reviews were perceived as time consuming and typically unfunded, these participants reported that mandating them could create problems and undermine innovative research. However, a funder of larger clinical trials noted that if an investigator was applying for a substantial sum of money to run a large multicenter clinical trial, they would expect the investigator to have the resources to do this sort of preparatory work.

Factors Facilitating an Evidence-Based Approach

Many Swiss participants reported that the best way to facilitate systematic evidence assessments was for funders to make them mandatory. Although participants said they wanted such a requirement to be feasible and flexible, many reported that funders were the most suitable stakeholder to enforce evidence assessments and said that knowing what the current evidence is for a research question should be a prerequisite for academic clinical trials, particularly if large amounts of public funds are involved. However, many participants also acknowledged that any such requirement would need to be combined with better support from academic institutions and associated clinical trial support organizations.

In contrast, multiple international funders highlighted how evidence assessment communities had become firmly established in their countries, with sufficient capacity now existing for training, expertise, and collaboration, which had been important in promoting a systematic approach. Moving forward, some funders said that it would be helpful to provide investigators with more explicit requirements to clarify the level of comprehensiveness expected. However, funders identified a need to improve their own knowledge of the different types of evidence assessment (eg, formal systematic reviews vs rapid reviews) and for more research on the advantages and disadvantages of each approach.

Discussion

To our knowledge, this is the first qualitative study to examine the barriers to and factors facilitating an evidence-based research approach with several stakeholder groups in the context of academic clinical trials. This study found general agreement among Swiss stakeholders and international funders that new clinical trials should be justified by a systematic evidence assessment, but there seemed to be substantial variation among stakeholders in the expected comprehensiveness and transparency of evidence assessments. Formal systematic reviews are currently not expected and not conducted before most clinical trials, but most international funders expect a systematic search for the existing evidence. Although time and resources were reported by all participants as barriers to conducting systematic reviews, the Swiss research ecosystem was reported to be less supportive of a systematic approach than international settings. Moving toward a more evidence-based research approach in academic clinical trials requires changes on multiple levels.

On an individual level, we identified a lack of awareness and knowledge of the importance of systematic reviews as well as a lack of skills to conduct them. Providing training, integrating courses on evidence-based research into life sciences curricula to build capacity, and offering more support from academic institutions (eg, information specialists in libraries who assist with systematic literature searches) could help to overcome these barriers.38,39 In addition, it seems necessary to raise awareness within the scientific community that expert opinion and narrative reviews are not enough to implement evidence-based research: they have a high risk of being distorted by confirmation bias—the tendency to search for and interpret information in a way that supports one’s own beliefs—and therefore cannot foster transparent evidence-based research.40

At an organizational level, resource constraints (eg, time and money) often hinder the conduct of comprehensive systematic reviews, which can take 1 to 2 years to complete.41 Rapid reviews, “a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting specific methods to produce evidence for stakeholders in a resource-efficient manner,”42 have emerged as pragmatic alternatives to systematic reviews and could enable primary researchers to sustainably adopt an evidence-based approach.43,44 A systematic review is, if required at all, a prerequisite at the application stage, meaning that updating or conducting one is often not funded. Thus, researchers may opt to apply for a clinical trial grant without putting unpaid resources into producing a systematic review. Funders could consider financially supporting the production of systematic reviews or providing funds at the grant application stage. Although this might cost money initially, it could save money and resources in the long run by avoiding research waste.

On a political level, the lack of funders and ethics committees requiring systematic reviews was perceived by the study participants as a barrier. Making systematic evidence assessments mandatory for funding and ethics approval emerged as the most promising factor to facilitate implementation of evidence-based research. According to the UK National Institute for Health Research, specifically requiring systematic reviews to justify the need for a new large multicenter trial can increase their use to nearly 100%.17 For smaller, earlier-stage clinical trials, funders could also explicitly clarify their requirements with respect to a systematic evidence assessment (eg, requesting at least a documented search strategy with the databases searched). Ethics committees have also been called on to more vigorously advocate for an evidence-based approach.45

This study fills a research gap because, to our knowledge, other studies generally investigated the “natural history of conducting and reporting clinical trials” in interviews with trialists, only touching on “insufficient reference to previous research” when planning a new trial,46 or focused specifically on surveying methodologists about their opinions on a conditional trial design framework using network meta-analysis for planning new trials.47 The current study contributes to the findings of the preliminary survey of methodologists by Clayton et al18 by including the perceptions of various stakeholders outside the UK and Ireland and by conducting in-depth interviews, which resulted in a more thorough understanding of barriers to and factors facilitating an evidence-based approach when designing a new trial.

Limitations

This study has limitations. First, this qualitative study did not use a random sample of stakeholders. However, we included a range of experts who had direct experience with systematic evidence assessment in academic clinical trials, which makes it likely that this study captured key aspects of a multisided issue and provided applicable results. Second, a bias might have existed toward the reporting of socially desirable attitudes.48 Given that the results were critical of current practice, we believe such a bias is limited. Third, with the exception of the international funders, all participants were from Switzerland, compromising the generalizability of the findings to some degree. However, the findings are generally in line with existing research,18 and probing the current practices of funders from different countries that have a reputation for endorsing an evidence-based approach and contrasting their views with those of Swiss stakeholders yielded rich findings about knowledge gaps and insufficient clarity of funder requirements regarding evidence assessments. There appear to be a number of common issues across countries, which likely makes our findings of interest beyond Switzerland.

Conclusions

In this qualitative study, there was general agreement among Swiss stakeholders and international funders that new clinical trials should be justified by a systematic evidence assessment. However, investigators reported that barriers on individual, organizational, and political levels still regularly kept them from implementing these assessments. In their role as gatekeepers, funding agencies and ethics committees are in a position to enforce an evidence-based research approach by making it mandatory for new clinical trials.17,45 In addition, universities should train students and researchers in evidence-based methods and raise awareness of the importance of a systematic and transparent approach to justify new trials.

Back to top
Article Information

Accepted for Publication: October 4, 2021.

Published: November 30, 2021. doi:10.1001/jamanetworkopen.2021.36577

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2021 McLennan S et al. JAMA Network Open.

Corresponding Author: Matthias Briel, MD, PhD, Department of Clinical Research, Basel Institute for Clinical Epidemiology and Biostatistics, University of Basel and University Hospital Basel, Spitalstrasse 12, 4031 Basel, Switzerland (matthias.briel@usb.ch).

Author Contributions: Drs McLennan and Briel had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: McLennan, Briel.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: McLennan, Nussbaumer-Streit, Briel.

Critical revision of the manuscript for important intellectual content: All authors.

Obtained funding: Hemkens, Briel.

Administrative, technical, or material support: McLennan, Nussbaumer-Streit, Briel.

Supervision: Briel.

Conflict of Interest Disclosures: Dr Hemkens reported receiving a grant from the Swiss National Science Foundation during the conduct of the study. Dr Briel reported receiving grants from the Swiss National Science Foundation during the conduct of the study. No other disclosures were reported.

Funding/Support: This study was supported by project grant IZCOZ0_198082/1 from the Swiss National Science Foundation.

Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1. Sutton AJ, Cooper NJ, Jones DR. Evidence synthesis as the key to more coherent and efficient research. BMC Med Res Methodol. 2009;9:29. doi:10.1186/1471-2288-9-29
2. Clarke M. Doing new research? don’t forget the old. PLoS Med. 2004;1(2):e35. doi:10.1371/journal.pmed.0010035
3. Clarke M, Hopewell S, Chalmers I. Clinical trials should begin and end with systematic reviews of relevant evidence: 12 years and waiting. Lancet. 2010;376(9734):20-21. doi:10.1016/S0140-6736(10)61045-8
4. Chalmers I, Bracken MB, Djulbegovic B, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156-165. doi:10.1016/S0140-6736(13)62229-1
5. Robinson KA, Brunnhuber K, Ciliska D, Juhl CB, Christensen R, Lund H; Evidence-Based Research Network. Evidence-based research series—paper 1: what evidence-based research is and why is it important? J Clin Epidemiol. 2021;129:151-157. doi:10.1016/j.jclinepi.2020.07.020
6. Clarke M, Brice A, Chalmers I. Accumulating research: a systematic account of how cumulative meta-analyses would have provided knowledge, improved health, reduced harm and saved resources. PLoS One. 2014;9(7):e102670. doi:10.1371/journal.pone.0102670
7. Habre C, Tramèr MR, Pöpping DM, Elia N. Ability of a meta-analysis to prevent redundant research: systematic review of studies on pain from propofol injection. BMJ. 2014;348:g5219. doi:10.1136/bmj.g5219
8. Lau J, Antman EM, Jimenez-Silva J, Kupelnick B, Mosteller F, Chalmers TC. Cumulative meta-analysis of therapeutic trials for myocardial infarction. N Engl J Med. 1992;327(4):248-254. doi:10.1056/NEJM199207233270406
9. Fiorentino F, Vasilakis C, Treasure T. Clinical reports of pulmonary metastasectomy for colorectal cancer: a citation network analysis. Br J Cancer. 2011;104(7):1085-1097. doi:10.1038/sj.bjc.6606060
10. Greenberg SA. How citation distortions create unfounded authority: analysis of a citation network. BMJ. 2009;339:b2680. doi:10.1136/bmj.b2680
11. Gøtzsche PC. Reference bias in reports of drug trials. BMJ (Clin Res Ed). 1987;295(6599):654-656. doi:10.1136/bmj.295.6599.654
12. Robinson KA, Goodman SN. A systematic examination of the citation of prior research in reports of randomized, controlled trials. Ann Intern Med. 2011;154(1):50-55. doi:10.7326/0003-4819-154-1-201101040-00007
13. Clarke M, Hopewell S. Many reports of randomised trials still don't begin or end with a systematic review of the relevant evidence. J Bahrain Med Soc. 2013;24:145-148.
14. Engelking A, Cavar M, Puljak L. The use of systematic reviews to justify anaesthesiology trials: a meta-epidemiological study. Eur J Pain. 2018;22(10):1844-1849. doi:10.1002/ejp.1280
15. Goudie AC, Sutton AJ, Jones DR, Donald A. Empirical assessment suggests that existing evidence could be used more fully in designing randomized controlled trials. J Clin Epidemiol. 2010;63(9):983-991. doi:10.1016/j.jclinepi.2010.01.022
16. Chan AW, Hróbjartsson A, Jørgensen KJ, Gøtzsche PC, Altman DG. Discrepancies in sample size calculations and data analyses reported in randomised trials: comparison of publications with protocols. BMJ. 2008;337:a2299. doi:10.1136/bmj.a2299
17. Bhurke S, Cook A, Tallant A, Young A, Williams E, Raftery J. Using systematic reviews to inform NIHR HTA trial planning and design: a retrospective cohort. BMC Med Res Methodol. 2015;15:108. doi:10.1186/s12874-015-0102-2
18. Clayton GL, Smith IL, Higgins JPT, et al. The INVEST project: investigating the use of evidence synthesis in the design and analysis of clinical trials. Trials. 2017;18(1):219. doi:10.1186/s13063-017-1955-y
19. von Niederhäusern B, Magnin A, Pauli-Magnus C. The impact of clinical trial units on the value of clinical research in Switzerland. Swiss Med Wkly. 2018;148:w14615.
20. Swiss National Science Foundation. Investigator Initiated Clinical Trials (IICTs): call for proposals 2020. Accessed July 5, 2021. http://www.snf.ch/SiteCollectionDocuments/IICT_Call%202020_EN.pdf
21. Jones AP, Conroy E, Williamson PR, Clarke M, Gamble C. The use of systematic reviews in the planning, design and conduct of randomised trials: a retrospective cohort of NIHR HTA funded trials. BMC Med Res Methodol. 2013;13:50. doi:10.1186/1471-2288-13-50
22. Pandis N, Fleming PS, Koletsi D, Hopewell S. The citation of relevant systematic reviews and randomised trials in published reports of trial protocols. Trials. 2016;17(1):581. doi:10.1186/s13063-016-1713-6
23. Joseph PD, Caldwell PH, Barnes EH, et al. Completeness of protocols for clinical trials in children submitted to ethics committees. J Paediatr Child Health. 2019;55(3):291-298. doi:10.1111/jpc.14189
24. Nasser M, Clarke M, Chalmers I, et al. What are funders doing to minimise waste in research? Lancet. 2017;389(10073):1006-1007. doi:10.1016/S0140-6736(17)30657-8
25. Ensuring Value in Research. Accessed October 14, 2021. https://evir.org/
26. Tong A, Sainsbury P, Craig J. Consolidated Criteria for Reporting Qualitative Research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349-357. doi:10.1093/intqhc/mzm042
27. McLennan S, Griessbach A, Briel M; Making Randomized Trials Affordable (MARTA) Group. Practices and attitudes of Swiss stakeholders regarding investigator-initiated clinical trial funding acquisition and cost management. JAMA Netw Open. 2021;4(6):e2111847. doi:10.1001/jamanetworkopen.2021.11847
28. Gloy V, McLennan S, Rinderknecht M, et al. Uncertainties about the need for ethics approval in Switzerland: a mixed-methods study. Swiss Med Wkly. 2020;150:w20318. doi:10.4414/smw.2020.20318
29. Briel M, Speich B, von Elm E, Gloy V. Comparison of randomized controlled trials discontinued or revised for poor recruitment and completed trials with the same research question: a matched qualitative study. Trials. 2019;20(1):800. doi:10.1186/s13063-019-3957-4
30. Briel M, Elger B, von Elm E, Satalkar P. Insufficient recruitment and premature discontinuation of clinical trials in Switzerland: qualitative study with trialists and other stakeholders. Swiss Med Wkly. 2017;147:w14556.
31. McLennan S. Rejected online feedback from a Swiss physician rating website between 2008 and 2017: analysis of 2352 ratings. J Med Internet Res. 2020;22(8):e18374. doi:10.2196/18374
32. McLennan S. The content and nature of narrative comments on Swiss physician rating websites: analysis of 849 comments. J Med Internet Res. 2019;21(9):e14336. doi:10.2196/14336
33. McLennan S. The ethical oversight of learning health care activities in Switzerland: a qualitative study. Int J Qual Health Care. 2019;31(8):G81-G86. doi:10.1093/intqhc/mzz045
34. McLennan S, Schwappach D, Harder Y, Staender S, Elger B. Patient safety issues in office-based surgery and anaesthesia in Switzerland: a qualitative study. Z Evid Fortbild Qual Gesundhwes. 2017;125:23-29. doi:10.1016/j.zefq.2017.06.002
35. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm Policy Ment Health. 2015;42(5):533-544. doi:10.1007/s10488-013-0528-y
36. Marshall MN. Sampling for qualitative research. Fam Pract. 1996;13(6):522-525. doi:10.1093/fampra/13.6.522
37. Fusch PI, Ness LR. Are we there yet? data saturation in qualitative research. Qual Rep. 2015;20(9):1408-1416. doi:10.46743/2160-3715/2015.2281
38. Lund H, Juhl CB, Nørgaard B, et al; Evidence-Based Research Network. Evidence-based research series—paper 2: using an evidence-based research approach before a new study is conducted to ensure value. J Clin Epidemiol. 2021;129:158-166. doi:10.1016/j.jclinepi.2020.07.019
39. Lund H, Juhl CB, Nørgaard B, et al; Evidence-Based Research Network. Evidence-based research series—paper 3: using an evidence-based research approach to place your results into context after the study is performed to ensure usefulness of the conclusion. J Clin Epidemiol. 2021;129:167-171. doi:10.1016/j.jclinepi.2020.07.021
40. Spencer EA, Heneghan C. Confirmation bias. In: Catalogue of Bias. Catalogue of Bias Collaboration; 2018. Accessed July 5, 2021. https://catalogofbias.org/biases/confirmation-bias/
41. Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implement Sci. 2010;5:56. doi:10.1186/1748-5908-5-56
42. Garritty C, Gartlehner G, Kamel C, et al. Cochrane Rapid Reviews—interim guidance from the Cochrane Rapid Reviews Methods Group. March 23, 2020. Accessed April 29, 2021. https://methods.cochrane.org/rapidreviews/sites/methods.cochrane.org.rapidreviews/files/public/uploads/cochrane_rr_-_guidance-23mar2020-final.pdf
43. Dobbins M; National Collaborating Centre of Methods and Tools. Rapid review guidebook. 2017. Accessed April 29, 2021. https://www.nccmt.ca/uploads/media/media/0001/02/800fe34eaedbad09edf80ad5081b9291acf1c0c2.pdf
44. Garritty CM, Norris SL, Moher D. Developing WHO rapid advice guidelines in the setting of a public health emergency. J Clin Epidemiol. 2017;82:47-60. doi:10.1016/j.jclinepi.2016.08.010
45. Kim D, Hasford J. Redundant trials can be prevented, if the EU clinical trial regulation is applied duly. BMC Med Ethics. 2020;21(1):107. doi:10.1186/s12910-020-00536-9
46. Smyth RM, Jacoby A, Altman DG, Gamble C, Williamson PR. The natural history of conducting and reporting clinical trials: interviews with trialists. Trials. 2015;16:16. doi:10.1186/s13063-014-0536-6
47. Nikolakopoulou A, Trelle S, Sutton AJ, Egger M, Salanti G. Synthesizing existing evidence to design future trials: survey of methodologists from European institutions. Trials. 2019;20(1):334. doi:10.1186/s13063-019-3449-6
48. Bergen N, Labonté R. “Everything is perfect, and we have no problems”: detecting and limiting social desirability bias in qualitative research. Qual Health Res. 2020;30(5):783-792. doi:10.1177/1049732319889354