Isaac T, Weissman JS, Davis RB, Massagli M, Cyrulik A, Sands DZ, Weingart SN. Overrides of Medication Alerts in Ambulatory Care. Arch Intern Med. 2009;169(3):305–311. doi:10.1001/archinternmed.2008.551
Electronic prescribing systems with decision support may improve patient safety in ambulatory care by offering drug allergy and drug interaction alerts. However, preliminary studies show that clinicians override most of these alerts.
We performed a retrospective analysis of 233 537 medication safety alerts generated by 2872 clinicians in Massachusetts, New Jersey, and Pennsylvania who used a common electronic prescribing system from January 1, 2006, through September 30, 2006. We used multivariate techniques to examine factors associated with alert acceptance.
A total of 6.6% of electronic prescription attempts generated alerts. Clinicians accepted 9.2% of drug interaction alerts and 23.0% of allergy alerts. High-severity interactions accounted for most alerts (61.6%); clinicians accepted high-severity alerts slightly more often than moderate- or low-severity interaction alerts (10.4%, 7.3%, and 7.1%, respectively; P < .001). Clinicians accepted 2.2% to 43.1% of high-severity interaction alerts, depending on the classes of interacting medications. In multivariable analyses, we found no difference in alert acceptance among clinicians of different specialties (P = .16). Clinicians were less likely to accept a drug interaction alert if the patient had previously received the alerted medication (odds ratio, 0.03; 95% confidence interval, 0.03-0.03).
Clinicians override most medication alerts, suggesting that current medication safety alerts may be inadequate to protect patient safety.
Adverse drug events (ADEs), defined as injuries due to medications, are common in primary care.1-4 In a study5 of 4 Boston adult outpatient practices, one-quarter of patients who received a prescription experienced an ADE during the 3 months of follow-up, and more than one-third of these injuries were preventable. Organizations such as the Institute of Medicine and the National Patient Safety Foundation have identified electronic prescribing as an important tool for improving medication safety, and outpatient clinicians are increasingly adopting its use.6-8 Although rudimentary electronic prescribing can prevent errors through more legible and structured orders, experts believe that integrated clinical decision support, which can alert clinicians about inappropriate medications or harmful medication combinations, is needed to reduce the number of ADEs in ambulatory care.9-13
To date, studies10-12 that have examined outpatient electronic prescribing and clinical decision support alerts have been limited to a few clinicians at teaching hospitals. In 1 study12 of 5 academic primary care practices, clinicians accepted less than 10% of medication alerts, and physician reviewers concluded that more than one-third of all alerts lacked an adequate scientific basis or were not clinically useful. If the threshold for sending alerts to clinicians is set too low, prescribers may develop “alert fatigue,” which can lead them to override important warnings.14
As clinicians increasingly adopt electronic prescribing, the development and dissemination of decision support systems will depend in part on whether clinicians find medication safety alerts valuable. To our knowledge, no studies have examined the behavior of clinicians regarding medication alerts in a large number of community practices or across clinicians of different specialties. This information may provide perspective on the perceived usefulness of electronic alerts in outpatient care and insights into how to design better alert systems, which, in turn, may reduce the number and severity of ADEs. Therefore, we studied the electronic prescribing behavior of clinicians in 3 states to address the following questions: How often do clinicians encounter medication safety alerts? How often do clinicians accept alerts, aborting a potentially dangerous prescription? Do accepted medication alerts vary depending on alert type (allergy vs drug interaction), alert severity, or the classes of interacting medications? Do certain characteristics of patients and clinicians affect the decision to accept an alert? Finally, what can we learn from clinician behavior regarding medication safety alerts that could be used to improve alert quality?
We studied electronic prescriptions and associated medication safety alerts generated by all clinicians who used a specific electronic prescribing system in Massachusetts, New Jersey, and Pennsylvania from January 1, 2006, through September 30, 2006. We included clinicians who wrote at least 50 prescriptions during the study period. These 2872 clinicians were distributed across 862 practices; 69% of eligible clinicians were from Massachusetts, 20% from New Jersey, and 11% from Pennsylvania. The study was approved in advance by the Dana-Farber Cancer Institute institutional review board.
All study participants used PocketScript, an electronic prescribing system (Zix Corporation, Dallas, Texas), which enables clinicians to transmit prescriptions electronically to a pharmacy via a desktop computer or handheld device. The system creates a profile of a patient's active medications based on previous electronic prescriptions written by any member within the practice. The system also references dispensed drug history from the patient's pharmacy benefit management records, which has the added benefit of including information from handwritten prescriptions and prescriptions from other clinicians within the state. A patient's medication allergies must be entered into the system by the clinician or a member of the clinician's office staff. When an electronic prescription is written, the system alerts the clinician if the patient is allergic to the medication or if there is a drug-drug interaction (DDI) with one of the patient's active medications. The system uses a set of alerts maintained by Cerner Multum (Denver, Colorado), a health information technology company. Pharmacists at Cerner Multum use a consensus process to grade the severity of each potential DDI on a 3-tier scale (1 indicating high; 2, moderate; and 3, low) based on information from pharmaceutical companies and evidence in the scientific literature. When a DDI alert is generated, a warning banner with the severity category is displayed to the clinician. The prescriber may view a detailed description of the interaction, although this information does not appear on the initial screen. A prescriber may respond to an allergy or a DDI alert by (1) overriding the alert and continuing to prescribe the medication, (2) canceling the prescription altogether, or (3) rewriting (ie, changing) the prescription for a different medication.
We abstracted the following information from the electronic prescription database: prescription date and time, prescribed generic drug name and class, alert type (allergy or interaction), interacting generic drug name and class, alert severity level, action taken in response to an alert (override, cancel, or change), binary variable indicating whether the same generic drug had been prescribed to the patient previously, and device used to enter the prescription (handheld vs desktop computer). We abstracted deidentified clinician and practice identification numbers, clinician type (physician vs nonphysician), clinician specialty, number of electronic prescriptions written by each clinician during the study period, and length of time the clinician had been using an electronic prescribing system to write prescriptions. We also abstracted patient age, sex, and number of active medications at the beginning of the study period.
If a patient takes multiple medications, several interaction warnings can arise from a single prescription attempt. Duplicate alerts can also appear for the same prescription if a clinician views an alert, cancels the prescription, and then reattempts to order the same medication.12 To eliminate duplicate alerts and to focus on the clinician's final decision of whether or not to prescribe after generating an alert, we selected the last observation of alerted prescription data sorted by physician, patient, prescribed generic drug name, date, and time.12 This eliminated 235 525 duplicate alerts, leaving 3 570 378 prescription attempts and 233 537 alerts in our final data set.
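The deduplication step described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual code (the analyses were performed in SAS, and the field names below are hypothetical): records are sorted by clinician, patient, drug, and time, and only the last observation per (clinician, patient, drug) key is kept as the clinician's final decision.

```python
from datetime import datetime

# Hypothetical alert log records; the actual database schema is not published.
alerts = [
    {"clinician": "c1", "patient": "p1", "drug": "warfarin",
     "time": datetime(2006, 3, 1, 9, 0), "action": "cancel"},
    {"clinician": "c1", "patient": "p1", "drug": "warfarin",
     "time": datetime(2006, 3, 1, 9, 2), "action": "override"},
    {"clinician": "c2", "patient": "p2", "drug": "digoxin",
     "time": datetime(2006, 4, 5, 14, 0), "action": "change"},
]

# Sort by clinician, patient, drug, then time, and keep only the LAST
# observation for each (clinician, patient, drug) key -- the clinician's
# final decision for that alerted prescription attempt.
alerts.sort(key=lambda r: (r["clinician"], r["patient"], r["drug"], r["time"]))
final = {}
for rec in alerts:
    final[(rec["clinician"], rec["patient"], rec["drug"])] = rec
deduplicated = list(final.values())
```

Here the clinician's initial cancellation of the warfarin prescription is treated as a duplicate of the later, final override.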
We performed analyses using the prescription writing attempt and the alert as units of analysis. We calculated alert rates for each clinician type and specialty by dividing the number of alerted prescription attempts by total prescription attempts. We examined a clinician's response to an alerted order as a dichotomous variable, accept vs override. Accepted orders included canceled prescription orders or a change of the alerted medication to an alternate drug. We calculated accept rates for each alert type, DDI severity, clinician type, and clinician specialty by dividing the number of accepted alerts by the total number of alerted prescriptions. We used the Fisher exact test to examine differences in alert rates and accept rates in each category. We also examined accept rates for interacting medication classes, limiting our analysis to combinations of drug classes that generated at least 50 alerts to achieve more stable estimates. Some drugs did not have a drug class specified, and in these instances we used the prescribed generic name in lieu of the drug class.
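The dichotomous outcome and rate calculations can be illustrated with a short sketch (counts here are made up for illustration; they are not the study's data). Canceling or changing an alerted prescription both count as "accept," and the accept rate for a category is accepted alerts divided by total alerted prescriptions in that category.

```python
from collections import Counter

# Hypothetical alert outcomes as (alert_type, action) pairs.
alerts = (
    [("interaction", "override")] * 90
    + [("interaction", "cancel")] * 6
    + [("interaction", "change")] * 4
    + [("allergy", "override")] * 7
    + [("allergy", "change")] * 3
)

# "Accept" collapses cancel and change into one dichotomous outcome
# (accept vs override), as in the analysis described above.
accepted, total = Counter(), Counter()
for alert_type, action in alerts:
    total[alert_type] += 1
    if action in ("cancel", "change"):
        accepted[alert_type] += 1

accept_rate = {t: accepted[t] / total[t] for t in total}
```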
To examine characteristics of accepted alerts, we excluded allergy alerts (<2% of all alerts) because of variability in how practices entered patient allergy information into the electronic prescribing system. For adjusted analysis of DDI alerts, we excluded patients seen by a pediatrician and patients younger than 18 years, since the safety concerns of medications for children differ from those for adults. We fit logistic regression models with generalized estimating equations using the response to the alerted prescription (accept vs override) as the binary outcome. We accounted for clustering of clinicians nested within practices and used an exchangeable correlation structure.
The multivariate model included patient age, sex, and number of active medications as independent predictors. We examined for differences in accept rates among patient age groups using a Wald test, and we examined for a linear trend across number of patient medication categories. We also examined the relationship between the decision to accept a DDI alert and clinician type, specialty, and experience with electronic prescribing. We examined number of prescriptions written during the study period and duration of time using electronic prescribing as indicators of electronic prescribing experience. Using Wald tests, we examined for differences in accept rates across clinician types, specialties, and categories of time since becoming an active prescriber and examined for a linear trend across categories of number of prescriptions written. Finally, we examined the relationship between the decision to accept an alert and prescription characteristics, including the type of device used to enter the prescription, severity level of the interaction, and whether the same generic drug had been prescribed to the patient previously. Since classes of prescribed medications are highly related to provider specialty and may therefore act as confounders, we included approximately 200 variables to adjust for the fixed effects of the most common class-class interactions. These variables accounted for more than 70% of all alerted orders. All analyses were conducted using a commercially available software program (SAS, version 9.1; SAS Institute, Cary, North Carolina).
Physicians accounted for 82.0% of the 2872 prescribing clinicians and made 86.7% of all electronic prescription attempts. Most clinicians were in primary care fields, with 21.9% in family medicine, 19.5% in internal medicine, 17.4% in pediatrics, and 3.3% in obstetrics-gynecology. Clinicians wrote 78.9% of electronic prescriptions using a desktop computer and 21.1% using a handheld device. Each clinician wrote a median of 754 electronic prescriptions (interquartile range, ±1423), with a range of 50 to 10 687 prescriptions. Clinicians had a median of 352 (interquartile range, ±426) days of experience with electronic prescribing, with a range of 0 to 1987 days.
During the study period, 6.6% of all attempted electronic prescriptions generated alerts. The DDI alerts and allergy alerts accounted for 98.3% and 1.7% of alerts, respectively (Table 1). Clinicians accepted DDI alerts less often than allergy alerts (9.2% vs 23.0%; Table 1; P < .001), although accept rates for both alert types were low. Most DDI alerts were high-severity (62.6%) or moderate-severity (29.6%) interactions. Clinicians were slightly more likely to heed high-severity interactions than moderate- or low-severity interactions (10.4%, 7.3%, and 7.1%, respectively; P < .001).
Among high-severity interactions, accept rates for classes of interacting drug combinations ranged from 2.2% to 43.1%. Table 2 lists the high-severity interactions that clinicians accepted most and least frequently. Among high-severity alerts, clinicians frequently overrode alerts for medications used in combination to treat certain diseases, such as corticosteroids and antirheumatic agents for rheumatoid arthritis. Clinicians commonly accepted alerts that involved combinations of antibiotics and antiarrhythmic medications or antibiotics and warfarin sodium.
Table 3 gives the DDI alert rates and accept rates of different clinician types and specialties. In unadjusted analyses, physicians generated DDI alerts less often than nonphysicians (6.2% vs 8.2% of prescription attempts; P < .001), and physicians accepted a lower percentage of alerted prescriptions (9.0% vs 10.5%; P < .001). The alert rates varied by clinician specialty, from 1.2% for pediatricians to 27.5% for psychiatrists (P < .001). Accept rates varied by specialty as well, from 5.3% for psychiatrists to 17.7% for surgical specialists (P < .001).
Table 4 gives the results of the multivariable model, which examined the factors associated with the clinician's decision to accept 20 180 of 219 517 DDI alerts. We found no difference in DDI acceptance for patients of different ages or sexes (P = .09 and .69, respectively). We did not find a linear trend between alert acceptance and number of active medications (P = .15), although clinicians were approximately half as likely to accept alerts for patients taking 2 or more medications as for patients taking a single medication. Clinicians had higher odds of accepting high-severity alerts compared with low-severity alerts (odds ratio, 1.43; 95% confidence interval, 1.10-1.87). Clinicians had much lower odds of accepting DDI alerts if the patient had previously received the alerted medication (odds ratio, 0.03; 95% confidence interval, 0.03-0.03). In contrast to the unadjusted analyses, we did not find any differences in acceptance of alerts between clinician types (physician vs nonphysician; P = .10) or among clinician specialties (P = .16). Clinicians who wrote more electronic prescriptions were less likely to accept DDI alerts than clinicians who wrote fewer prescriptions (P for trend <.001). However, clinicians who used the electronic prescribing system for a longer period were more likely to accept DDI alerts (P for trend <.001).
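As a point of reference for the odds ratios reported above, the sketch below computes an unadjusted odds ratio with a Wald 95% confidence interval from a 2×2 table. This is not the study's model (the paper used multivariable logistic regression with generalized estimating equations, and the counts below are invented for illustration); it only shows the arithmetic behind an odds ratio and its log-scale confidence interval.

```python
import math

# Hypothetical 2x2 table of alert responses (illustrative counts only):
# rows = whether the patient had previously received the alerted drug,
# columns = whether the alert was accepted or overridden.
a, b = 30, 970     # previously received: accepted, overridden
c, d = 500, 1500   # not previously received: accepted, overridden

# Unadjusted odds ratio; an OR well below 1 means prior exposure to the
# drug is associated with lower odds of accepting the alert.
odds_ratio = (a * d) / (b * c)

# Wald 95% CI, computed on the log-odds scale and exponentiated back.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```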
In this study of more than 3 million electronic prescriptions, we found that clinicians overrode most high-severity DDI alerts and allergy alerts. Clinicians distinguished among different alerts, accepting some DDI combinations more frequently than others, such as interactions between antiarrhythmic agents and antibiotics. Among high-severity DDI alerts, a 20-fold difference was found between alerts with the highest accept rate and lowest accept rate. In multivariable analyses, we found no difference in response to alerts among clinicians of different specialties.
This study offers several advantages over previous analyses of medication safety alerts in ambulatory care. We examined the behavior of a large group of community-based practitioners in response to medication alerts, and the sample size allowed us to examine responses to alerts for classes of interacting drug combinations and patient and clinician characteristics associated with alert acceptance. We found that clinicians discriminated among alerts based on their severity, the classes of interacting medications, and the patient's previous experience with the medication. Given the high override rate of all alerts, current electronic medication alerts appear to be of limited utility in outpatient practice. For active clinicians, most alerts may be more of a nuisance than an asset.
Our results are consistent with those of previous small studies12,15 of practice sites affiliated with academic medical centers, which have shown that DDI and allergy alerts are commonly overridden. Several observers argue that to reduce the burden of outpatient ADEs, high-quality decision support must be an essential element of electronic prescribing systems.1,9,11,16,17 However, few studies have examined ways to improve alert systems. In a survey of Veterans Affairs physicians, clinicians recommended that DDI alerts provide suggestions for alternative medications and that previously overridden medication alerts should not reappear for the same patient.18 Clinicians who use commercial electronic prescribing systems have suggested that extant decision support systems are inadequate.17
In a 2005 national survey, only 14% of outpatient physicians reported using electronic prescribing.8,12 However, several government and private initiatives are under way to promote adoption. The Medicare Modernization Act of 2003 created standards for electronic prescribing system data and subsidized third parties to offset implementation costs.19 The National ePrescribing Patient Initiative, a coalition of private technology companies and insurers, offers free software to participating practices.12
Despite these efforts, the only clear evidence that computerized prescribing reduces ADEs is limited to the inpatient setting; only a few studies20-24 have examined its effectiveness in ambulatory care. Gandhi and colleagues11 found that basic electronic prescribing systems with structured orders and pick lists, but no advanced decision support, reduced the number of prescription writing errors but did not reduce the number of ADEs in 4 adult outpatient practices. Unless designers integrate high-quality decision support into electronic prescribing systems, substantial improvements in outpatient medication safety may not occur.
Our findings provide some insights into ways that designers may improve medication safety alerts. Although most DDI alerts were classified as high severity, clinicians accepted these alerts only slightly more frequently than lower severity alerts. Designers should consider reducing the proportion of alerts classified as high severity by reexamining and potentially reclassifying the alerts that are frequently overridden. By prioritizing a subset of high-severity alerts and only allowing these alerts to interrupt workflow, Shah and colleagues10 improved acceptance of these alerts to 67%. Designers should also provide an option for clinicians to suppress alerts for medications that patients had previously received, since most of these alerts are overridden. Finally, designers should consider customizing alerts depending on the clinician's specialty. For example, since psychiatrists encounter alerts in one-quarter of prescription attempts, they may be at higher risk for overriding important warnings. The usefulness of alerts may vary by clinician specialty, and reducing the number of frivolous alerts for each specialty may help improve alert adherence and enhance care.
Examination of the relationship between clinician experience and alert acceptance yielded mixed results. High-volume prescribers accepted medication alerts less often than lower-volume prescribers, raising concern that high-volume prescribers may become habituated to alerts and neglect potentially useful warnings. In contrast, clinicians who used electronic prescribing systems longer accepted more alerts than novice users. These findings are amenable to several possible interpretations. Experienced electronic prescribing clinicians may know more about alternative medications that can be prescribed after an alert is generated. Alternatively, experienced clinicians may learn to trust certain alerts over time. Further study is necessary to examine these possible relationships.
Our study has several limitations. First, we examined prescriptions written using 1 electronic prescribing system. However, the alerts we examined are typical of those encountered by clinicians, and the drug interaction database from Cerner Multum is used by many commercial electronic prescription systems. Second, our inference about an alert's utility was based on the clinician's decision to accept or override an alert. However, overridden alerts may be useful to clinicians if alerts prompt clinicians to change their behavior (eg, by observing a patient more closely or by ordering additional studies). Third, we assumed that a clinician's decision to override an alert was based on sound clinical judgment, although we were unable to validate this by reviewing the medical record. Previous studies25 have shown some examples of unjustified or dangerous prescribing behavior by clinicians. Finally, we could not examine several patient and clinician characteristics, such as patient comorbidities and clinician familiarity with the use of electronic medical records.
In conclusion, clinicians' responses to medication safety alerts have much to teach us. Clinicians override most alerts, suggesting that existing alerts may provide little value to practitioners. Unless designers take steps to improve prescription alert systems, the potential benefits of electronic prescribing may not be realized.
Correspondence: Thomas Isaac, MD, MBA, MPH, Division of General Medicine and Primary Care, Beth Israel Deaconess Medical Center, 330 Brookline Ave, Boston, MA 02215 (email@example.com).
Accepted for Publication: September 3, 2008.
Author Contributions: Study concept and design: Isaac, Weissman, Sands, and Weingart. Acquisition of data: Weissman, Cyrulik, and Weingart. Analysis and interpretation of data: Isaac, Weissman, Davis, Massagli, and Weingart. Drafting of the manuscript: Isaac and Weissman. Critical revision of the manuscript for important intellectual content: Isaac, Weissman, Davis, Massagli, Cyrulik, Sands, and Weingart. Statistical analysis: Isaac, Davis, and Massagli. Obtained funding: Weissman, Cyrulik, Sands, and Weingart. Administrative, technical, and material support: Weingart. Study supervision: Isaac, Weissman, and Weingart.
Financial Disclosure: None reported.
Funding/Support: This study was supported by a grant from the Physicians' Foundation for Health Systems Excellence.