Figure 1. Flow diagram of physician and trial coordinator activities in the clinical trial alert (CTA) process. EHR indicates electronic health record.

Figure 2. Monthly physician-generated referrals and enrollments before and after clinical trial alert (CTA) activation.

Table 1. Number of Physicians Who Referred Patients and Generated Enrollments

Table 2. Physician-Generated Referral and Enrollment Rates, Overall and by Practice Setting
Original Investigation
October 24, 2005

Effect of a Clinical Trial Alert System on Physician Participation in Trial Recruitment

Author Affiliations

Author Affiliations: Division of General Internal Medicine, University of Cincinnati College of Medicine (Dr Embi), and Institute for the Study of Health, University of Cincinnati Medical Center (Drs Embi and Hornung), Cincinnati, Ohio; and Department of Rheumatic and Immunologic Diseases (Dr Embi), Information Technology Division (Drs Jain and Harris and Mr Clark), Department of General Internal Medicine (Drs Jain and Harris), and Department of Endocrinology, Diabetes and Metabolism (Ms Bizjack), Cleveland Clinic Foundation, Cleveland, Ohio.

Arch Intern Med. 2005;165(19):2272-2277. doi:10.1001/archinte.165.19.2272
Abstract

Background  Failure to recruit a sufficient number of eligible subjects in a timely manner represents a major impediment to the success of clinical trials. Physician participation is vital to trial recruitment but is often limited.

Methods  After 12 months of traditional recruitment to a clinical trial, we activated our electronic health record (EHR)–based clinical trial alert (CTA) system in selected outpatient clinics of a large, US academic health care system. When a patient’s EHR data met selected trial criteria during the subsequent 4-month intervention period, the CTA prompted physician consideration of the patient’s eligibility and facilitated secure messaging to the trial’s coordinator. Subjects were the 114 physicians practicing at selected EHR-equipped clinics throughout our study. We compared differences in the number of physicians participating in recruitment and their recruitment rates before and after CTA activation.

Results  The CTA intervention was associated with significant increases in the number of physicians generating referrals (5 before and 42 after; P<.001) and enrollments (5 before and 11 after; P = .03), a 10-fold increase in those physicians’ referral rate (5.7/mo before and 59.5/mo after; rate ratio, 10.44; 95% confidence interval, 7.98-13.68; P<.001), and a doubling of their enrollment rate (2.9/mo before and 6.0/mo after; rate ratio, 2.06; 95% confidence interval, 1.22-3.46; P = .007).

Conclusions  Use of an EHR-based CTA led to significant increases in physicians’ participation in and recruitment rates to an ongoing clinical trial. Given the trend toward EHR implementation in health care centers engaged in clinical research, this approach may represent a much-needed solution to the common problem of inadequate trial recruitment.

Clinical trials are essential to the advancement of medical science and are a priority for academic health centers and research funding agencies.1,2 Their success depends on the recruitment of a sufficient number of eligible subjects in a timely manner. Unfortunately, difficulties achieving recruitment goals are common, and failure to meet such goals can impede the development and evaluation of new therapies and can increase costs to the health care system.3-5

Physician participation is critical to successful trial recruitment.3,6 In addition to helping identify potentially eligible subjects, recruitment by a treating physician increases the likelihood that a given patient will participate in a trial.7,8 However, due in part to the demands of clinical practice, a limited number of physicians recruit patients for clinical trials, and most patients are never offered the opportunity to participate.7,8 Even in fields like oncology, where clinical trial enrollment for all eligible patients is considered the goal, as few as 2% of patients enroll in trials.9 In addition to impeding trial completion, limited recruitment can introduce bias to a trial and prevent some patients from receiving potentially beneficial therapy. Physicians cite limited awareness of the trial, time constraints, and difficulty following enrollment procedures among their reasons for not recruiting patients.10-12

Numerous technological approaches have been developed in attempts to enhance clinical trial recruitment.13-20 Some have shown promise by using computerized clinical databases to automate the identification of potentially eligible patients.21,22 Electronic health record (EHR)–based approaches have also been described, although mostly in specialized settings, and few have been subjected to controlled study or demonstrated significant benefit.23-26 It remains to be determined whether the resources of a comprehensive EHR can be leveraged for the benefit of clinical trial recruitment as effectively as they have been for patient safety and health care quality.27

We developed an EHR-based clinical trial alert (CTA) system to overcome many of the known obstacles to trial recruitment by physicians while complying with current privacy regulations.28 The aim of this study was to determine if CTA use could enhance physicians’ participation in the recruitment of subjects and increase physician-generated recruitment rates to an ongoing clinical trial.

Methods
Setting

We evaluated our CTA intervention at The Cleveland Clinic, Cleveland, Ohio, a large, US academic health care system with a fully implemented, commercial ambulatory EHR (EpicCare; Epic Systems Corp, Madison, Wis). This EHR served as the sole point of documentation and computerized provider order entry at almost all of the institution’s ambulatory clinical practices, and its use at our selected clinical sites preceded this study by at least 6 months.

Design

We compared physicians’ trial recruitment activities during a 12-month phase of baseline recruitment with those of a 4-month intervention phase, during which CTA use was added to baseline recruitment efforts. The preintervention phase began on February 1, 2003, coincident with the onset of recruitment to the associated clinical trial. As part of the clinical trial’s recruitment strategy, physicians across the institution, including our study participants, were encouraged via traditional means (eg, the posting of flyers, memorandum distribution, and discussion at departmental meetings) to recruit subjects for the clinical trial. Beginning in February 2004, the CTA was activated for 4 months.

To test our intervention, we sought a clinical trial with a planned recruitment period that spanned our study and for which potential subjects would likely be encountered at main campus and community health center clinical sites. We identified an institutional review board–approved, multicenter trial of patients with type 2 diabetes mellitus that fit our criteria. We received approval from the institutional review board for our study and obtained the cooperation of the clinical trial’s coordinator and site principal investigator.

Participants

We targeted for this intervention the 10 endocrinologists and 104 general internists on staff at the selected clinical sites during both phases of our study and who had the opportunity to participate in recruitment activities throughout. Selected clinical sites included the study institution’s main campus endocrinology and general internal medicine clinics and the general internal medicine clinics at the institution’s 12 EHR-equipped community health centers located throughout the city and its suburbs.

Intervention

To develop the CTA, we combined features of the EHR’s clinical decision support system (CDSS) and its communications capabilities. A flow diagram of the CTA process is shown in Figure 1.

Using the CDSS programming interface, the CTA was set to trigger during physician-patient encounters at the selected clinical sites when data in a patient’s EHR met selected trial criteria (ie, age >40 years; any International Classification of Diseases, Ninth Revision, diagnosis of type 2 diabetes mellitus; and glycosylated hemoglobin level of >7.4%). We chose these criteria after consultation with the trial’s principal investigator and an analysis of which candidate criteria were present in the EHR. When triggered, the initial CTA screen alerted the physician about the ongoing trial and the current patient’s potential eligibility. If the physician opted to continue rather than dismiss the alert, a customized CTA order form appeared on-screen. This form prompted consideration of additional key eligibility criteria not consistently retrievable from the EHR (eg, history of cardiovascular disease) and discussion of the eligibility and level of interest with the current patient. The physician could then generate an order in the EHR by choosing 1 of the following 3 options on the CTA order form: (1) The patient meets criteria and is interested (referral order). (2) The patient does not meet study criteria at this time (not eligible order). (3) The patient meets criteria but is not interested at this time (not interested order).

Selection of a referral order automatically transmitted a secure message within the EHR to the trial coordinator. This message included a link to the patient’s electronic medical chart and confirmed the patient’s permission allowing review of that chart to determine eligibility. After chart review, the trial coordinator telephoned all referred patients and invited those potentially eligible for further screening. A referral order also appended general trial-related information to the patient instructions printed at the close of the clinic visit. These after-visit instructions informed patients that they should expect to be contacted by the trial’s coordinator within 2 weeks regarding their eligibility, and that they were not obligated to proceed should they change their minds. The presence of a referral order prevented a CTA from triggering at a future visit for that patient. Before activation in the operational EHR system, proper functioning of the CTA was confirmed in a test environment.
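
To make the alert logic concrete, the sketch below restates the CTA triggering and order-handling rules in Python. It is illustrative only: the actual CTA was built with the EpicCare CDSS rule engine and its secure messaging, and the Patient fields, the ICD-9 code subset, and the notify_coordinator helper are hypothetical stand-ins rather than the production configuration.

# Illustrative sketch only; not the EpicCare CDSS configuration used in the study.
from dataclasses import dataclass

# Example ICD-9 codes for type 2 diabetes mellitus (an illustrative subset, not the full rule).
TRIAL_DX_CODES = {"250.00", "250.02"}

@dataclass
class Patient:
    age: int
    icd9_codes: set
    last_hba1c: float                 # most recent glycosylated hemoglobin, %
    has_referral_order: bool = False  # a prior referral order suppresses future CTAs

def notify_coordinator(patient: Patient) -> None:
    # Hypothetical stand-in for the EHR's secure message to the trial coordinator,
    # which included a chart link and confirmation of permission to review the chart.
    pass

def cta_should_trigger(patient: Patient) -> bool:
    # Mirrors the triggering criteria described above.
    return (not patient.has_referral_order
            and patient.age > 40
            and bool(patient.icd9_codes & TRIAL_DX_CODES)
            and patient.last_hba1c > 7.4)

def handle_order_form(patient: Patient, choice: str) -> str:
    # The 3 options available to the physician on the CTA order form.
    if choice == "referral":
        patient.has_referral_order = True   # prevents the CTA from triggering at future visits
        notify_coordinator(patient)
        return "referral order placed; trial information appended to after-visit instructions"
    if choice == "not_eligible":
        return "patient does not meet study criteria at this time"
    if choice == "not_interested":
        return "patient meets criteria but is not interested at this time"
    return "alert dismissed without further action"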

Data collection

We monitored the number of physicians contributing to trial recruitment as well as physician-generated referral and enrollment rates during both phases of our study. The clinical trial’s coordinator documented the date and source of any referrals received and, if physician generated, from which clinics the referrals originated. Enrollments were similarly tracked, but because of the lag between a referral and its resultant enrollment, we attributed an enrollment to the study phase when its associated referral occurred. We assessed the enrollment status of all referred patients 2 months after the conclusion of the 4-month intervention phase. The CTA events and physicians’ responses to CTAs were gathered by querying the EHR’s clinical data warehouse using a previously validated method.29

Individual physician response data were kept confidential and were aggregated by clinical site for further analysis. Patients’ protected health information was not accessed or removed from the EHR solely for the purposes of our study. Physicians and the clinical trial’s coordinator reviewed the protected health information of patients only in the context of performing their patient care or eligibility determination duties, respectively. Given practical considerations and the limited potential for harm, we were exempted from obtaining signed informed consent from our physician subjects. Instead, we sent a letter informing them of the CTA study and the plan to keep individual data confidential. Physicians could refuse to participate whenever presented with a CTA by opting not to proceed.

Statistical analysis

Descriptive analyses were generated by practice type for the numbers of physicians participating in recruitment and their monthly recruitment rates. To test the significance of differences between the numbers of physicians who participated in recruitment efforts during each study phase, we applied the McNemar exact test for matched data. For referral and enrollment rates, we assumed a Poisson distribution. Differences between referral and enrollment rates before and after the intervention were tested using a likelihood ratio test based on the Poisson distribution with an offset for the unequal time periods. Confidence intervals (CIs) were calculated from the resultant likelihood ratios. All statistical calculations were performed using SAS (version 8; SAS Institute Inc, Cary, NC).
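
As an illustration of the rate comparison, the following is a minimal sketch of a Poisson likelihood ratio test with an offset for the unequal observation periods, written in Python rather than the SAS code actually used; the Wald-type confidence interval shown is a common approximation and differs from the likelihood-based interval reported in the Results.

import math
from scipy.stats import chi2, norm

def poisson_loglik(y, mu):
    # Log-likelihood of count y under a Poisson distribution with mean mu.
    return y * math.log(mu) - mu - math.lgamma(y + 1)

def compare_rates(y1, t1, y2, t2, alpha=0.05):
    # Compare event rates y1/t1 (before) and y2/t2 (after), with t in months of observation.
    rate1, rate2 = y1 / t1, y2 / t2
    # Alternative model: separate rates; null model: a single common rate (offset = log t).
    ll_alt = poisson_loglik(y1, rate1 * t1) + poisson_loglik(y2, rate2 * t2)
    pooled = (y1 + y2) / (t1 + t2)
    ll_null = poisson_loglik(y1, pooled * t1) + poisson_loglik(y2, pooled * t2)
    lr_stat = 2 * (ll_alt - ll_null)
    p_value = chi2.sf(lr_stat, df=1)
    # Wald approximation for the rate-ratio confidence interval (an assumption of this sketch).
    rate_ratio = rate2 / rate1
    se_log_rr = math.sqrt(1 / y1 + 1 / y2)
    z = norm.ppf(1 - alpha / 2)
    ci = (rate_ratio * math.exp(-z * se_log_rr), rate_ratio * math.exp(z * se_log_rr))
    return rate_ratio, ci, p_value

# Example using roughly the referral counts implied by the reported rates
# (about 68 referrals over the 12 preintervention months and 238 over the
# 4 intervention months); this yields a rate ratio near the reported 10.44.
print(compare_rates(68, 12, 238, 4))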

Results
Physician participation

During the 4-month intervention phase, all 114 physicians received at least 1 CTA each, and 48 (42%) participated by processing at least 1 CTA order form. Of those 48 who participated, 42 (88%) referred at least 1 patient to the trial coordinator, and 11 (23%) generated at least 1 enrollment. The number of physicians referring patients after CTA activation increased more than 8-fold from 5 before to 42 after (P<.001), and the number generating enrollments more than doubled from 5 before to 11 after (P = .03) (Table 1). These increases included participation by 36 internists and 1 endocrinologist who had not contributed to recruitment before the intervention.

Physician-generated referral and enrollment rates

After CTA activation, the physician-generated referral rate increased more than 10-fold from 5.7/mo before to 59.5/mo after (P<.001), and the enrollment rate more than doubled from 2.9/mo before to 6.0/mo after (P = .007) (Table 2). Referral and enrollment rates fluctuated slightly from month to month before the intervention but increased markedly after CTA activation (Figure 2). Although general internists had not contributed to recruitment before, they generated 170 (71%) of the referrals and 7 (29%) of the enrollments after CTA activation. Use of the CTA was also associated with a significant referral rate increase among endocrinologists, but their 47% enrollment rate increase did not reach statistical significance (Table 2).

CTA events and outcomes

During the 4-month intervention phase, 4780 CTAs were triggered during clinical encounters with 3158 individual patients. Physicians dismissed 4295 (90%) of these CTAs without further action. Of the 485 CTAs acted on, physicians referred 238 (49%) patients to the trial coordinator and indicated that the rest were either ineligible (n = 153 [32%]) or eligible but not interested (n = 94 [19%]). These proportions did not change significantly throughout the intervention phase.

Of the 238 referrals, the trial coordinator found that 121 (51%) remained potentially eligible after a thorough EHR chart review. Those patients were invited for further evaluation, and 60 (50%) presented for a screening visit. At final analysis, 24 (40%) of those who underwent screening had enrolled, whereas the rest were ineligible or uninterested (18 patients [30%]) or were “on hold” awaiting further data to determine if they might eventually become eligible (18 patients [30%]).

Comment

We found that the use of an EHR-based CTA system was associated with significant increases in the number of physicians participating in trial recruitment and in physician-generated referral and enrollment rates to the associated clinical trial. Despite traditional efforts to promote recruitment institution-wide, a few endocrinologists practicing at the primary trial site did all of the recruiting before the intervention, a common occurrence.30,31 Although physicians at this site continued to generate most of the enrollments after CTA activation, the intervention increased their referral rate 3-fold and their enrollment rate by nearly 50%, suggesting that it helped to recruit subjects who were being missed even at that site. In addition, general internists’ participation increased dramatically after CTA activation. Their contributions during our intervention phase accounted for more than two thirds of the referrals and almost one third of the enrollments and were essential to the statistically significant increase in the enrollment rate observed.

This enhanced recruitment among general internists after CTA activation is particularly notable given the recognized value of recruiting patients from primary care and community-based sites but the historic difficulty of doing so.32 Beyond enhancing recruitment rates, CTA use in such practice settings could help to limit referral bias, bring the benefits of clinical trials to a wider patient population, help community physicians enjoy the benefits of contributing to trial recruitment, and potentially help overcome some of the disparities present in clinical trials.5,31,33

Although this study was not designed to assess reasons for the CTA’s impact, some of our design choices were based on factors previously recognized to influence physician participation and may have contributed to its success.10,11,34,35 Traditional recruitment requires that physicians remember a clinical trial is active, recall the trial’s details to determine patient eligibility, and take time to perform other recruitment activities. The CTA likely helped our physicians overcome some of those obstacles to participation by alerting them about their patients’ potential eligibility for the trial at the point of care, facilitating communication and documentation tasks required for patient referral, and shifting much of the work of eligibility determination away from the physician.

In addition to physician factors, patient factors play an important role in successful trial recruitment.36 Previous studies have demonstrated that being recruited by one’s treating physician, feeling actively involved in the decision-making process, and taking time to consider whether to participate in a trial all significantly increase the likelihood that a patient will enroll.7,37 In designing our CTA, we also attempted to capitalize on these recognized success factors by prompting treating physicians to recruit their patients, providing basic trial information to the patient in the after-visit instructions, and giving patients time to consider their decision to participate before being contacted by the trial coordinator.

Driven in part by calls from governmental and advisory organizations,38-40 academic health centers are trending toward implementation of comprehensive EHRs with functionality similar to that of the system studied here.41 It may, therefore, be possible to replicate the CTA approach in such settings. Because research funding agencies have sometimes considered past clinical trial success or failure a predictor of future performance when making funding decisions,3 the ability to enhance trial completion with CTA technology may have implications for academic health centers. Should the CTA approach prove effective at augmenting current recruitment strategies in other such settings, academic health centers may view its potential benefit to their research missions as yet another reason to implement EHRs.

Although our use of the EHR’s CDSS to build the CTA may contribute to the generalizability of these findings, CDSS technology was not designed for this purpose and presented some limitations. For instance, our EHR’s CDSS did not allow activation of the CTA at the individual physician level, thereby preventing exclusion of uninterested physicians from the intervention. It also limited the kinds of EHR data that we were able to use and the ways in which we could combine EHR data to trigger the CTA, thereby allowing the CTA to trigger for more ineligible patients than we would have liked. These technological limitations likely contributed to the high proportion of physician nonresponders (58%) and to the high rate of dismissed CTAs (90%). Further, by allowing dismissal of CTAs without a response, the technology did not permit us to determine whether such dismissals were due to physicians’ lack of interest, knowledge of the patient’s ineligibility, or some other reason.

After CTA activation, we observed a substantially higher ratio of physicians’ referrals to enrollments than previously. Although some of this likely related to the technical limitations noted above, a higher ratio was expected given our design choices. By choosing rather sensitive CTA triggering criteria, we accepted a potentially high false-positive referral rate in order to minimize false-negative referrals, that is, missed eligible patients. Although this choice almost certainly contributed to some inefficiency, the statistically significant increase in the enrollment rate suggests that it was an effective approach. Indeed, prior research has demonstrated that practice sites that refer patients for trial consideration based on broad entry criteria and leave eligibility determination to the researcher achieve higher recruitment success than practices that try to determine eligibility precisely before referral.42 Moreover, the clinical trial coordinator’s ability to review CTA-referred patients’ data via the EHR allowed for prescreening and resulted in fewer in-person screenings than might otherwise have been necessary. Whether a more specifically targeted CTA would have been as effective while also being more efficient will be examined in future studies.

This study has several limitations, some of which are noted above. We know of no other changes to ongoing trial recruitment efforts that might have affected physician participation during the intervention phase, and recruitment trends were stable or even declining before our intervention (Figure 2). Nevertheless, given our before-after design, it is still possible that unrecognized differences between the study phases accounted for some of the improvements observed. Furthermore, because the CTA was tested on a single clinical trial, at a single institution, using a single EHR, the benefits observed may relate to features specific to that trial, EHR, or study setting. It is possible that physicians altered their behavior because they were aware that their CTA responses were being monitored (ie, a Hawthorne effect). It is also possible that the increased recruitment rates observed during the 4-month intervention phase would wane with prolonged use.

Future plans include refinement of the CTA system based on the lessons learned from this study and ongoing evaluations of subjects’ perceptions. We intend to test the CTA’s influence on multiple and varied clinical trials and assess the generalizability of our findings through testing in other EHRs at different health centers.

Conclusions

Use of a CTA during EHR-based outpatient practice was associated with significant increases in physician participation and in physician-generated recruitment rates to an ongoing clinical trial. Given the trend toward adoption of similarly capable EHRs in health centers engaged in clinical research, the CTA approach may be applicable to such settings and could represent a solution to the common problem of slow and insufficient trial recruitment by a limited number of physicians. Further studies are planned to optimize the CTA approach and assess its utility in other settings.

Article Information

Correspondence: Peter J. Embi, MD, MS, University of Cincinnati College of Medicine, 231 Albert Sabin Way, Room 6603, PO Box 670535, Cincinnati, OH 45267-0535 (peter.embi@uc.edu).

Accepted for Publication: June 23, 2005.

Financial Disclosure: None.

Funding/Support: This study was supported in part by career development award K22-LM008534 from the National Library of Medicine, National Institutes of Health, Bethesda, Md (Dr Embi).

Role of the Sponsor: The funding agency had no role in any aspect of the design, execution, or publication of this study.

Previous Presentation: This study was presented at the 28th Annual Meeting of the Society of General Internal Medicine; May 14, 2005; New Orleans, La.

Additional Information: Dr Embi had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Acknowledgment: We thank the physicians who participated as subjects in this study. We also thank Byron Hoogwerf, MD; Gary S. Hoffman, MD, MS; Ronnie D. Horner, PhD; Michael G. Ison, MD, MS; James E. Lockshaw, MBA; and Mark H. Eckman, MD, MS, for their contributions to the project and this report.

This article was corrected on 12/2/2005.

References
1. Nathan DG, Wilson JD. Clinical research and the NIH: a report card. N Engl J Med. 2003;349:1860-1865.
2. Campbell EG, Weissman JS, Moy E, Blumenthal D. Status of clinical research in academic health centers: views from the research leadership. JAMA. 2001;286:800-806.
3. Hunninghake DB, Darby CA, Probstfield JL. Recruitment experience in clinical trials: literature summary and annotated bibliography. Control Clin Trials. 1987;8(4 suppl):6S-30S.
4. Marks L, Power E. Using technology to address recruitment issues in the clinical trial process. Trends Biotechnol. 2002;20:105-109.
5. Sung NS, Crowley WF Jr, Genel M, et al. Central challenges facing the national clinical research enterprise. JAMA. 2003;289:1278-1287.
6. Wright JR, Crooks D, Ellis PM, Mings D, Whelan TJ. Factors that influence the recruitment of patients to phase III studies in oncology: the perspective of the clinical research associate. Cancer. 2002;95:1584-1591.
7. Siminoff LA, Zhang A, Colabianchi N, Sturm CM, Shen Q. Factors that predict the referral of breast cancer patients onto clinical trials by their surgeons and medical oncologists. J Clin Oncol. 2000;18:1203-1211.
8. The many reasons why people do (and would) participate in clinical trials. Available at: http://www.harrisinteractive.com. Accessed January 31, 2005.
9. Lara PN Jr, Higdon R, Lim N, et al. Prospective evaluation of cancer clinical trial accrual patterns: identifying potential barriers to enrollment. J Clin Oncol. 2001;19:1728-1733.
10. Taylor KM, Margolese RG, Soskolne CL. Physicians’ reasons for not entering eligible patients in a randomized clinical trial of surgery for breast cancer. N Engl J Med. 1984;310:1363-1367.
11. Mansour EG. Barriers to clinical trials, III: knowledge and attitudes of health care providers. Cancer. 1994;74(9 suppl):2672-2675.
12. Fisher WB, Cohen SJ, Hammond MK, Turner S, Loehrer PJ. Clinical trials in cancer therapy: efforts to improve patient enrollment by community oncologists. Med Pediatr Oncol. 1991;19:165-168.
13. Breitfeld PP, Weisburd M, Overhage JM, Sledge G Jr, Tierney WM. Pilot study of a point-of-use decision support tool for cancer clinical trials eligibility. J Am Med Inform Assoc. 1999;6:466-477.
14. Seroussi B, Bouaud J. Using OncoDoc as a computer-based eligibility screening system to improve accrual onto breast cancer clinical trials. Artif Intell Med. 2003;29:153-167.
15. Ash N, Ogunyemi O, Zeng Q, Ohno-Machado L. Finding appropriate clinical trials: evaluating encoded eligibility criteria with incomplete data. Proc AMIA Symp. 2001:27-31.
16. Papaconstantinou C, Theocharous G, Mahadevan S. An expert system for assigning patients into clinical trials based on Bayesian networks. J Med Syst. 1998;22:189-202.
17. Thompson DS, Oberteuffer R, Dorman T. Sepsis alert and diagnostic system: integrating clinical systems to enhance study coordinator efficiency. Comput Inform Nurs. 2003;21:22-28.
18. Ohno-Machado L, Wang SJ, Mar P, Boxwala AA. Decision support for clinical trial eligibility determination in breast cancer. Proc AMIA Symp. 1999:340-344.
19. Fink E, Kokku PK, Nikiforou S, Hall LO, Goldgof DB, Krischer JP. Selection of patients for clinical trials: an interactive Web-based system. Artif Intell Med. 2004;31:241-254.
20. Gennari JH, Sklar D, Silva J. Cross-tool communication: from protocol authoring to eligibility determination. Proc AMIA Symp. 2001:199-203.
21. Butte AJ, Weinstein DA, Kohane IS. Enrolling patients into clinical trials faster using RealTime Recruiting. Proc AMIA Symp. 2000:111-115.
22. Weiner DL, Butte AJ, Hibberd PL, Fleisher GR. Computerized recruiting for clinical trials in real time. Ann Emerg Med. 2003;41:242-246.
23. Afrin LB, Oates JC, Boyd CK, Daniels MS. Leveraging of open EMR architecture for clinical trial accrual. AMIA Annu Symp Proc. 2003:16-20.
24. Carlson RW, Tu SW, Lane NM, et al. Computer-based screening of patients with HIV/AIDS for clinical-trial eligibility. Online J Curr Clin Trials. March 28, 1995; Document 179.
25. Moore TD, Hotz K, Christensen R, et al. Integration of clinical trial decision rules in an electronic medical record (EMR) enhances patient accrual and facilitates data management, quality control and analysis. Paper presented at: American Society of Clinical Oncology; May 31, 2003; Chicago, Ill.
26. Musen MA, Carlson RW, Fagan LM, Deresinski SC, Shortliffe EH. T-HELPER: automated support for community-based clinical research. Proc Annu Symp Comput Appl Med Care. 1992:719-723.
27. Hersh W. Health care information technology: progress and barriers. JAMA. 2004;292:2273-2274.
28. Centers for Medicare & Medicaid Services. US Health Insurance Portability and Accountability Act of 1996. Available at: http://www.cms.hhs.gov/hipaa/. Accessed January 31, 2005.
29. Jain A, Atreja A, Harris CM, Lehmann M, Burns J, Young J. Responding to the rofecoxib withdrawal crisis: a new model for notifying patients at risk and their health care providers. Ann Intern Med. 2005;142:182-186.
30. Mannel RS, Walker JL, Gould N, et al. Impact of individual physicians on enrollment of patients into clinical trials. Am J Clin Oncol. 2003;26:171-173.
31. Cohen GI. Clinical research by community oncologists. CA Cancer J Clin. 2003;53:73-81.
32. Crosson K, Eisner E, Brown C, Ter Maat J. Primary care physicians’ attitudes, knowledge, and practices related to cancer clinical trials. J Cancer Educ. 2001;16:188-192.
33. King TE Jr. Racial disparities in clinical trials. N Engl J Med. 2002;346:1400-1402.
34. Winn RJ. Obstacles to the accrual of patients to clinical trials in the community setting. Semin Oncol. 1994;21(suppl 7):112-117.
35. Fallowfield L, Ratcliffe D, Souhami R. Clinicians’ attitudes to clinical trials of cancer therapy. Eur J Cancer. 1997;33:2221-2229.
36. Cox K, McGarry J. Why patients don’t take part in cancer clinical trials: an overview of the literature. Eur J Cancer Care (Engl). 2003;12:114-122.
37. Kinney AY, Richards C, Vernon SW, Vogel VG. The effect of physician recommendation on enrollment in the Breast Cancer Chemoprevention Trial. Prev Med. 1998;27:713-719.
38. Kohn LT, ed. Academic Health Centers: Leading Change in the 21st Century. Washington, DC: National Academy Press; 2003.
39. Kohn L, Corrigan JM, Donaldson M, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
40. Brailer DJ. The decade of health information technology: delivering consumer-centric and information-rich health care. Framework for Strategic Action. Available at: http://www.hhs.gov/healthit/frameworkchapters.html. Accessed January 31, 2005.
41. Ash JS, Bates DW. Factors and forces affecting EHR system adoption: report of a 2004 ACMI discussion. J Am Med Inform Assoc. 2005;12:8-12.
42. Bell-Syer SE, Moffett JA. Recruiting patients to randomized trials in primary care: principles and case study. Fam Pract. 2000;17:187-191.