Figure 1. Analysis of Electronic Prescription Notes Content
Flowchart illustrating the analysis and classification of electronic prescription notes content.

Figure 2. Recommended Accommodations for Appropriate Electronic Prescription Notes Content
Recommended future solutions for accommodating the communication of appropriate e-prescription clinical notes content.
aExcludes notes content classified as “other.”

Table 1. Inappropriate e-Prescription Notes Content Classification
Table 2. Appropriate e-Prescription Notes Content Classification
Table 3. Common Themes in Appropriate Other Notes Content
Original Investigation
April 2016

Analysis of Prescribers’ Notes in Electronic Prescriptions in Ambulatory Practice

Author Affiliations
  • 1Surescripts LLC, Arlington, Virginia
  • 2Department of Pharmacy Practice, Midwestern University, Glendale, Arizona
  • 3Houston Veterans Affairs Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas
  • 4Section of Health Services Research, Department of Medicine, Baylor College of Medicine, Houston, Texas
JAMA Intern Med. 2016;176(4):463-470. doi:10.1001/jamainternmed.2015.7786
Abstract

Importance  The optional free-text Notes field in ambulatory electronic prescriptions (e-prescriptions) allows prescribers to communicate additional prescription-related information to dispensing pharmacists. However, populating this field with irrelevant or inappropriate information can create confusion, workflow disruptions, and potential patient harm.

Objectives  To analyze the content of free-text prescriber notes in new ambulatory e-prescriptions and to develop recommendations to improve e-prescribing practices.

Design, Setting, and Participants  We performed a qualitative analysis of e-prescriptions containing free-text prescriber notes for conformance to the intended purpose of the free-text field as established in the national e-prescribing standard. The study sample contained 26 341 new e-prescriptions randomly selected from 3 024 737 e-prescriptions containing notes transmitted to community pharmacies across the United States during a 1-week period (November 10-16, 2013). The study e-prescriptions were issued by 22 549 community-based prescribers using 492 different electronic health record (EHR) or e-prescribing software application systems. Data analysis was conducted from February 23, 2014, to November 4, 2015.

Main Outcomes and Measures  Reviewers classified free-text prescriber notes as appropriate, inappropriate (content for which a standard, structured data-entry field is available in the widely implemented national e-prescribing standard), or unnecessary (irrelevant to dispensing pharmacists). We developed and applied a classification scheme to further characterize and quantify types of appropriate and inappropriate content.

Results  Of the 26 341 free-text notes, 17 421 (66.1%) contained inappropriate content, 7522 (28.6%) contained appropriate content, and 1398 (5.3%) contained information considered to be unnecessary. Further characterization of inappropriate content resulted in 20 192 classification codes, of which 3841 codes (19.0%) were assigned because of patient directions that conflicted with directions included in the designated standard field intended for this purpose. Characterization of appropriate content resulted in 7785 classification codes, of which 3685 (47.3%) contained information that could be communicated using structured fields already approved in a yet-to-be implemented version of the e-prescribing standard. An additional 745 (9.6%) were prescription cancellation requests for which a separate e-prescribing message currently exists but is not widely supported by software vendors or used by prescribers.

Conclusions and Relevance  The free-text Notes field in e-prescriptions is frequently used inappropriately, suggesting the need for better prerelease usability testing, consistent end user training and feedback, and rigorous postmarketing evaluation and surveillance of EHR or e-prescribing software applications. Accelerated implementation of new e-prescribing standards and rapid adoption of existing ones could also reduce prescribers’ reliance on free-text use in ambulatory e-prescriptions.

Introduction

As a key component in the health information technology infrastructure, electronic prescribing (e-prescribing) has the potential to improve the safety, quality, and cost-effectiveness of patient care.1-6 However, e-prescriptions sometimes contain information that is internally inconsistent, ambiguous, or incomplete and that can impede accurate and efficient processing and dispensing at receiving pharmacies.7-11

Although the widely implemented National Council for Prescription Drug Programs’ (NCPDP) SCRIPT e-prescribing standard organizes most new e-prescription content into structured fields, prescribers may add free-text data into certain fields for selected reasons.12-16 The optional, 210-character, free-text Notes field available in the e-prescription message is a well-documented source of potential miscommunication between prescribers and pharmacists.9,17-19 This field is intended to allow prescribers the option of including additional patient-specific information that is relevant to the prescription but for which a dedicated field does not exist in the currently implemented version of the SCRIPT standard (version 10.6).20

In practice, the Notes field may be populated with irrelevant information or data that should have been included in a designated structured field.19 This misallocation may be partly owing to electronic health record (EHR) systems that are overly restrictive or difficult to use, inadequate user training, and/or space limitations, such as the 140-character limit on the Patient Direction (Sig) field in the presently most widely implemented SCRIPT standard (version 10.6).9,11,21,22 Regardless of the reasons, the inclusion of unnecessary or conflicting prescription information in the Notes field can cause confusion at receiving pharmacies and workflow disruptions at prescribing clinics when pharmacists must contact the prescribers to clarify the intent. Unnecessary or inappropriate free-text information can also lead to dispensing delays, medication errors, and adverse patient outcomes.11,12,14,15,23
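To make the space constraint concrete, the sketch below (illustrative only; the constant and function names are ours, not element names from the SCRIPT standard) checks whether a set of patient directions fits within the 140-character Sig field of version 10.6, one reason such content may spill into the 210-character Notes field.

```python
# Illustrative sketch only: simplified length checks for the Sig (140-character)
# and Notes (210-character) limits described above for SCRIPT, version 10.6.
# Names here are hypothetical, not the standard's actual field identifiers.
SIG_MAX_LEN = 140
NOTES_MAX_LEN = 210

def fits_in_sig(directions: str) -> bool:
    """Return True if the patient directions fit the 140-character Sig field."""
    return len(directions) <= SIG_MAX_LEN

directions = (
    "Take 1 tablet by mouth every morning for blood pressure; if systolic "
    "pressure remains above 150 mm Hg after 2 weeks, increase to 2 tablets "
    "every morning and recheck in 1 month."
)
# Directions this long exceed 140 characters, so a prescriber may be tempted
# to continue them in the free-text Notes field instead.
print(len(directions), fits_in_sig(directions))
```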

The SCRIPT standard continues to be revised regularly with the addition of new structured data segments and fields and refinements to existing ones. At the time of the study analysis, SCRIPT, version 2015071, had been approved by the NCPDP membership. In addition, an important feature that allows prescription changes and discontinuations (ie, change and cancel request/response message types) is available for industry-wide adoption in the widely implemented version 10.6 of the SCRIPT standard, although few EHR and pharmacy vendors currently support its use. The objectives of this study were to evaluate the appropriateness of free-text notes entered by prescribers in new ambulatory care e-prescriptions and to use the findings to inform recommendations to improve current e-prescribing practices.

Methods

We conducted a retrospective, qualitative analysis of free-text content in the Notes field of new e-prescription messages transmitted through the Surescripts Health Information Network during a 7-day period from November 10 to November 16, 2013. Data analysis was conducted from February 23, 2014, to November 4, 2015. Approximately 67% of all new e-prescriptions in the United States are transmitted over the Surescripts network, a secure network used by pharmacies, prescribers, benefit managers, and health information exchanges.24

An initial random sample of e-prescriptions that contained free-text content in the Notes field was drawn from all new e-prescriptions transmitted through the network during the sampling period. The sample size was determined using the Raosoft sample size calculator to yield a margin of error of 0.98% with a confidence level of 99.9%.25 The number of messages selected for the study was weighted to reflect networkwide e-prescription volume during each day of the 7-day sampling period. For example, 22.7% of e-prescriptions transmitted through the Surescripts network during the sampling period occurred on Monday; hence, a similar percentage of e-prescriptions in the study sample was drawn from that Monday’s network volume.
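For readers who want to reproduce the sampling arithmetic, the sketch below is a minimal illustration that applies the standard finite-population sample size formula (the kind of calculation tools such as the Raosoft calculator document) and then allocates the sample across the 7 days in proportion to volume; every day-of-week share except Monday’s 22.7% is a hypothetical placeholder.

```python
# A minimal sketch, assuming the standard finite-population sample size formula;
# all day-of-week shares except Monday's 22.7% are hypothetical placeholders.
import math

def sample_size(population: int, margin_pct: float,
                z: float = 3.2905, response_pct: float = 50.0) -> int:
    """Finite-population sample size; margin_pct is in percentage points."""
    x = z ** 2 * response_pct * (100.0 - response_pct)
    return math.ceil(population * x / ((population - 1) * margin_pct ** 2 + x))

n_total = sample_size(3_024_737, margin_pct=0.98)  # roughly 28 000 messages

# Weight the sample by each day's share of networkwide e-prescription volume.
day_share = {"Sun": 0.081, "Mon": 0.227, "Tue": 0.178, "Wed": 0.166,
             "Thu": 0.160, "Fri": 0.142, "Sat": 0.046}  # hypothetical except Mon
per_day = {day: round(n_total * share) for day, share in day_share.items()}
print(n_total, per_day)
```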

Data elements extracted for the analysis included (1) prescriber’s identification number, (2) drug description, (3) patient directions, (4) free-text notes, (5) prescribed quantity, (6) quantity qualifier or potency unit code, and (7) days’ supply. Prior to analysis, e-prescription data were deidentified by an independent expert and certified to meet the requirements for deidentification as defined by the Health Insurance Portability and Accountability Act Privacy Rule (45 CFR §164.514). No prescriber information, clinical data, or patient demographics were made available to the investigation team for analysis. The analysis of e-prescription notes content was conducted in 4 phases.

Phase 1: Identification of Inappropriate Notes

In phase 1, 3 certified pharmacy technicians independently reviewed the Notes field content of each e-prescription in the sample to distinguish appropriate notes from inappropriate or unnecessary notes. Each reviewer had more than 3 years of experience interpreting and processing prescriptions in community practice settings and extensive familiarity with the SCRIPT standard. Reviewers were trained during multiple sessions with the principal investigator followed by individual assessment to ensure proficiency.

Reviewers first evaluated the notes content of each e-prescription in the sample and identified those containing information that was unclear or indecipherable as a result of having lost essential context information during the deidentification process. These e-prescriptions were eliminated from further analysis.

Reviewers next identified all inappropriate notes in the sample. For the purpose of the study, a note was considered inappropriate if it contained any content for which a designated standard field exists within the new e-prescription message in the widely implemented SCRIPT, version 10.6.12,26 If consensus on a particular note was not reached after initial independent review, the 3 reviewers discussed the note as a group to reconcile differences. A licensed pharmacist with prior experience in community and mail service pharmacy practice served as the final adjudicator when consensus could not be reached.

Agreement among the 3 reviewers prior to group reconciliation was measured using the κ coefficient with Light modification to account for multiple raters.27 The κ coefficient was calculated based on reviewers’ assignment of either “inappropriate” or “other” to each e-prescription note in the sample.
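As an illustration of this agreement statistic (not the study’s actual analysis code), the sketch below computes the Light modification of κ by averaging Cohen κ over all pairs of raters; the handful of note classifications shown is hypothetical.

```python
# A minimal sketch of Light's kappa: the average pairwise Cohen kappa across
# raters. The example notes and their labels below are hypothetical.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

def lights_kappa(ratings):
    """ratings: one equal-length sequence of labels per rater."""
    pairs = combinations(range(len(ratings)), 2)
    kappas = [cohen_kappa_score(ratings[i], ratings[j]) for i, j in pairs]
    return sum(kappas) / len(kappas)

rater_a = ["inappropriate", "other", "inappropriate", "other", "inappropriate", "other"]
rater_b = ["inappropriate", "other", "inappropriate", "inappropriate", "inappropriate", "other"]
rater_c = ["inappropriate", "other", "other", "other", "inappropriate", "other"]
print(round(lights_kappa([rater_a, rater_b, rater_c]), 2))
```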

Phase 2: Classification of Inappropriate Notes

In phase 2, the reviewers classified the content of each inappropriate e-prescription note using a content classification scheme created by the research team (A.A.D., Y.Y., S.W.-C., and J.R.) (Table 1). Again, if consensus on note content classification was not reached after independent review, the 3 reviewers discussed the note as a group to reconcile differences. A residency-trained pharmacist (Y.Y.) with ambulatory care experience served as the final adjudicator when consensus could not be reached. The κ coefficient was used to measure agreement among the reviewers on coding inappropriate notes content according to the classification scheme.

Phase 3: Classification of Appropriate Notes

In phase 3, the content of e-prescription notes that had not been judged to be inappropriate in phase 1 were further examined to determine how prescribers appropriately use this field. A panel of 11 experts representing various e-prescribing stakeholder groups was assembled to evaluate appropriate note content. The panel contained representatives from 2 EHR vendors, a national pharmacy association, a national retail pharmacy chain, a national mail order pharmacy, and the Surescripts Health Information Network and included 7 pharmacists, 3 e-prescribing technologists, and 1 physician.

All panelists had extensive experience in e-prescribing and familiarity with the SCRIPT standard. The 11 panelists were divided into smaller review teams, each of which included at least 1 pharmacist or physician. Prior to review and classification of the remaining notes, members of the expert panel were trained by the study investigators (A.A.D., M.T.R., and J.R.). Training involved written guidance and follow-up group discussions that detailed identification and categorization criteria explained with examples. Each panel team first eliminated notes that, although not meeting the study criteria for being classified as inappropriate, were judged to contain information that was not relevant or useful to the pharmacist in the prescription fulfillment process. These notes were classified as unnecessary and were not subjected to further analysis.

The remaining appropriate notes were judged to contain content that was relevant to and necessary for the dispensing pharmacist and for which a designated structured field is not currently approved and available within the e-prescription message of SCRIPT, version 10.6, although it may be approved for a future version. Expert panel teams were directed to apply a content classification scheme that had been developed by the research team for the purposes of this study (Table 2).

Phase 4: Further Classification of Other Appropriate Note Content

In phase 4, 2 clinical residency-trained pharmacists (Y.Y. and S.W.-C.) from the research team reviewed a subset of notes that had been classified as “Other” during phase 3. When appropriate, these notes were further subcategorized to document the most commonly observed content.

Results

During the 7-day sampling period, 20 260 935 new e-prescriptions were transmitted through the Surescripts Health Information Network, of which 3 024 737 (14.9%) included data in the optional, free-text Notes field. From this sampling frame, 28 002 e-prescriptions were randomly selected. Sampled e-prescriptions had been issued by 22 549 community-based prescribers practicing in all 50 states, the District of Columbia, and all US territories except American Samoa, using 492 different EHR or e-prescribing software applications. During initial review, 1661 of the 28 002 e-prescriptions were excluded from further analysis because their notes content contained unclear or indecipherable information resulting from the deidentification process. Thus, the final analysis sample included 26 341 e-prescriptions that contained free-text notes.

Identification of Inappropriate Notes

As illustrated in Figure 1, the 3 primary reviewers agreed that the content of 15 406 notes (58.5%) met the study’s definition of an inappropriate note, ie, one for which a designated standard field is available within SCRIPT, version 10.6 (κ = 0.83). The reviewers agreed that another 7894 notes (30.0%) did not meet this criterion but were not able to reach consensus on 3041 (11.5%) after team reconciliation. On review of the 3041 disputed notes, the pharmacist adjudicator determined that 2015 (7.6%) were inappropriate, resulting in a total of 17 421 notes (66.1%) that were determined to contain inappropriate content according to the study criteria.

Classification of Inappropriate Notes

The 3 reviewers reached consensus on the assignment of content classification codes for 12 979 inappropriate notes (κ = 0.62). The 4442 remaining notes were subsequently classified by the adjudicating pharmacist reviewer.

As reported in Table 1, a total of 20 192 classification codes were assigned to characterize the content of the 17 421 inappropriate notes because some notes contained more than 1 type of inappropriate content. The most common inappropriate notes content was information relating to benefits, insurance, or coupons (30.9%), followed by quantity and quantity qualifier (23.9%) and patient directions (19.0%). These top 3 categories accounted for 73.8% of all inappropriate notes content.

Classification of Appropriate Notes

The expert panelists reviewed 8920 notes that did not meet the study criteria for an inappropriate note during phase 1 of the analysis. Of these, 1398 (15.7% [5.3% of the total]) were determined to contain unnecessary information that panelists concluded would not provide any benefit to the dispensing pharmacist. Examples of unnecessary notes included comments such as “reviewed, OK,” and “thank you.”

For the remaining 7522 (28.6% of the total) notes that were considered to be appropriate, the reviewers assigned a total of 7785 classification codes (Table 2). Of these codes, 3559 (45.7%) could be communicated using structured fields that, although not appearing in version 10.6, have been approved for a future version of the SCRIPT standard that has yet to be implemented. These codes included APPT (patient needs appointment/office visit/laboratory tests) (25.8%), HOLD (place prescription on hold/do not dispense until later date) (8.5%), ALGY (patient allergy notification/alert to pharmacy) (4.5%), LAN (label in patient’s preferred language) (4.1%), and DEL (deliver this prescription) (2.8%). Another 745 of the 7785 codes (9.6%) were classified as CANC (cancel existing/previous therapy) and contained directions for discontinuation or cancellation of prescriptions, which could be communicated through the use of a separate message that is available in version 10.6 of the standard but is not widely supported by EHR and pharmacy vendors or used by prescribers and pharmacies. Another 9 categories (26.0%) of appropriate notes content in Table 2 are not supported in the current or any future approved version of the SCRIPT standard and may represent a need for the addition of new fields.

In 1213 of the 7785 classification codes (15.6%), the content of the prescriber’s note was classified by the expert panel as Other because it did not fit the classification criteria for any of the established categories. Further examination of these notes by the investigators revealed several common themes, described in Table 3, of which 126 (1.6%) contained information that could be communicated using structured fields already approved in a yet-to-be-implemented version of the standard. Thus, a total of 3685 (47.3%) of the appropriate notes classification codes represented information that could be communicated using structured fields in an approved but yet-to-be-implemented version of the NCPDP SCRIPT e-prescribing standard.

Discussion

We found that 14.9% of e-prescriptions included free-text notes, of which 66.1% contained inappropriate content for which an available standard, structured data-entry field should have been used. Patient directions, included in 19.0% of the inappropriate notes, represent a potential safety concern since this information may conflict with what is transmitted in the standard Directions field. Vague, ambiguous, or conflicting patient directions in the Notes field are also disruptive to pharmacy workflow and can result in dispensing errors if unnoticed, ignored, or misinterpreted by pharmacy staff. An example in our study was an order for Dilantin [phenytoin sodium], 100-mg oral capsule, with directions of “1 capsule every morning” but a free-text note that read “2 capsules QPM [every night],” thereby directly contradicting information contained in the Directions field. In addition, irrelevant information (5.3% of all notes) consumes pharmacy staff time and maintains the potential for misinterpretation of the prescriber’s intent and subsequent patient harm.

Many instances of appropriate use of the free-text Notes field result from delays in implementing newer approved versions of the SCRIPT standard by the e-prescribing industry. Implementation of these standards could improve efficiency by eliminating the time prescribers spend manually entering this content and the time pharmacists spend interpreting it. Although enhancements to the SCRIPT standard are approved and published at least biannually, their adoption and implementation by the industry are lagging owing to federal statutory standards adoption process requirements, long software development and deployment cycles, and competing software development or maintenance priorities.28 Our findings suggest that the e-prescribing industry should address these factors and establish a clear, expeditious adoption roadmap to facilitate more rapid implementation of newer versions.

Free-text notes were also used by prescribers to make adjustments to or discontinue existing medications (2.8% of all notes). Prescribers’ inability to communicate this information to the pharmacy in a standardized fashion can have serious patient safety implications.29 Although the “cancel prescription request/response” messages are available today in SCRIPT, version 10.6, they have not been widely implemented by e-prescribing software vendors or used by prescribers and pharmacies. Our findings suggest the need for accelerated industry-wide adoption of this functionality.

Multiple categories of appropriate e-prescription notes content could not be accommodated in either the current or future approved versions of the SCRIPT standard, suggesting that new structured data fields may be recommended for inclusion in future versions of the standard. A description of possible future recommendations is presented in Figure 2. Alternatively, the communication needs represented by these concepts could be met without modifying the SCRIPT standard if EHRs made them available as standardized text strings in drop-down menus within the user interface. Both options require intuitive product design, robust end user training, rigorous usability testing, and iterative product refinement.

Further analysis of appropriate note content classified as other revealed several themes. The most frequently encountered theme reflected prescribers’ need to communicate prescription formulary status or drug substitution information. If prescribers are presented with current, complete formulary information that is integrated into the prescription-writing workflow, it can alleviate their perceived need to enter the information as free text.

The remaining categories of notes content in Tables 2 and 3 represented somewhat more diverse and infrequent prescriber needs that do not appear to justify the creation of new fields or standardized text strings. Rather, these themes represent a compelling argument for maintaining prescribers’ ability to enter pertinent free-text information to ensure their intent is fully communicated to the pharmacist.11,30-32

Our findings call for changes in premarketing and postmarketing testing and surveillance of e-prescribing software applications. First, better user-interface design that facilitates the use of appropriate designated data fields would help to discourage the inappropriate use of free text in notes. To accomplish this, vendors should consistently apply user-centered design procedures that solicit feedback from diverse cohorts of prescribers.33,34 Second, EHR certification testing is conducted in a controlled environment using a predefined number of test cases and is not intended to replicate a busy prescriber’s practice. Vendors should therefore solicit and consider user feedback following product release and use this feedback to guide system refinements and improvements. Although prescribers might recognize that pharmacy call-backs to their practice are burdensome and disruptive, they might not be aware of the corrective measures available for mitigating these disruptions. Third, vendors, health care systems, and professional societies should raise this awareness through ongoing data content monitoring along with enhanced end user training, support, and feedback.35,36

This study has several limitations. First, the sample consisted of e-prescription messages that were transmitted during a 7-day period. The content of prescriber notes in these e-prescriptions may therefore not be entirely representative of those in all e-prescriptions in the ambulatory care setting. Second, the deidentification process resulted in several hundred notes becoming indecipherable and subsequently being excluded from further analysis, which could have affected the results. Third, the notes classification criteria were based on the collective judgment of a team of informatics pharmacists, 3 reviewers, and our panel of industry experts and have not yet been validated in a broader industry setting.

Conclusions

Our analysis of free-text notes content in ambulatory e-prescriptions provides a better understanding of how prescribers use and misuse this important field. We found that most of the notes content should have been communicated in designated structured fields available in the widely implemented SCRIPT standard, version 10.6, suggesting the need for better prerelease usability and conformance testing, consistent end user training and feedback related to e-prescription content and practices, and rigorous postmarketing evaluation and surveillance of e-prescribing software applications. One specific area of patient safety concern was the use of free text to discontinue medication therapy, which could be reduced by industry-wide implementation of the cancel prescription request/response messages available in SCRIPT, version 10.6. Adoption and implementation of the most recently approved version of the standard could also reduce prescribers’ reliance on free-text notes. Nevertheless, our analysis suggests that prescribers still require free-text notes to communicate content to the pharmacy in some situations.

Article Information

Corresponding Author: Ajit A. Dhavle, PharmD, MBA, Surescripts LLC, 2800 Crystal Dr, Arlington, VA 22202 (ajit.dhavle@surescripts.com).

Accepted for Publication: November 27, 2015.

Published Online: March 7, 2016. doi:10.1001/jamainternmed.2015.7786.

Author Contributions: Drs Dhavle and Yang had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Dhavle, Yang.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Dhavle, Yang, Rupp.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Dhavle, Yang, Ward-Charlerie.

Administrative, technical, or material support: Dhavle, Rupp, Ruiz.

Study supervision: Dhavle, Yang.

Conflict of Interest Disclosures: Drs Dhavle, Yang, and Ward-Charlerie and Mr Ruiz are employees of Surescripts LLC. Dr Rupp reported receiving consulting fees from Surescripts LLC during the conduct of the study. No other disclosures were reported.

Funding/Support: Dr Singh is partially supported by grant CIN 13-413 from the Veterans Affairs (VA) Health Services Research & Development Service to the Houston VA Center for Innovations in Quality, Effectiveness, and Safety.

Role of the Funder/Sponsor: The funding source had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Disclaimer: The content in this article is solely the responsibility of the authors and does not necessarily represent the official view of Surescripts LLC, the Department of Veterans Affairs, Baylor College of Medicine, or Midwestern University.

Previous Presentation: Selected preliminary findings from the analysis reported here were presented in a web-based seminar to the Best Practices Task Group of the National Council for Prescription Drug Programs’ Work Group 11 (e-Prescribing); September 26, 2014.

Additional Contributions: We thank the members of our expert industry review panel for their gracious contribution to this study. They did not receive financial compensation.

References
1. Fischer MA, Vogeli C, Stedman M, Ferris T, Brookhart MA, Weissman JS. Effect of electronic prescribing with formulary decision support on medication use and cost. Arch Intern Med. 2008;168(22):2433-2439.
2. Schiff GD, Rucker TD. Computerized prescribing: building the electronic infrastructure for better medication usage. JAMA. 1998;279(13):1024-1029.
3. Bates DW, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc. 2001;8(4):299-308.
4. Bates DW, Gawande AA. Improving safety with information technology. N Engl J Med. 2003;348(25):2526-2534.
5. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144(10):742-752.
6. Buntin MB, Burke MF, Hoaglin MC, Blumenthal D. The benefits of health information technology: a review of the recent literature shows predominantly positive results. Health Aff (Millwood). 2011;30(3):464-471.
7. Kaushal R, Kern LM, Barrón Y, Quaresimo J, Abramson EL. Electronic prescribing improves medication safety in community-based office practices. J Gen Intern Med. 2010;25(6):530-536.
8. Devine EB, Wilson-Norton JL, Lawless NM, et al. Characterization of prescribing errors in an internal medicine clinic. Am J Health Syst Pharm. 2007;64(10):1062-1070.
9. Donyai P, O’Grady K, Jacklin A, Barber N, Franklin BD. The effects of electronic prescribing on the quality of prescribing. Br J Clin Pharmacol. 2008;65(2):230-237.
10. Palchuk MB, Fang EA, Cygielnik JM, et al. An unintended consequence of electronic prescriptions: prevalence and impact of internal discrepancies. J Am Med Inform Assoc. 2010;17(4):472-476.
11. Rupp MT, Warholak TL. Evaluation of e-prescribing in chain community pharmacy: best-practice recommendations. J Am Pharm Assoc (2003). 2008;48(3):364-370.
12. Bates DW, Boyle DL, Teich JM. Impact of computerized physician order entry on physician time. Proc Annu Symp Comput Appl Med Care. 1994:996.
13. Tierney WM, Miller ME, Overhage JM, McDonald CJ. Physician inpatient order writing on microcomputer workstations: effects on resource utilization. JAMA. 1993;269(3):379-383.
14. Stein HD, Nadkarni P, Erdos J, Miller PL. Exploring the degree of concordance of coded and textual data in answering clinical queries from a clinical data repository. J Am Med Inform Assoc. 2000;7(1):42-54.
15. Hohnloser JH, Fischer MR, König A, Emmerich B. Data quality in computerized patient records: analysis of a haematology biopsy report database. Int J Clin Monit Comput. 1994;11(4):233-240.
16. Johnson SB, Bakken S, Dine D, et al. An electronic health record based on structured narrative. J Am Med Inform Assoc. 2008;15(1):54-64.
17. Dhavle AA, Rupp MT. Towards creating the perfect electronic prescription. J Am Med Inform Assoc. 2015;22(e1):e7-e12.
18. Dhavle AA, Corley ST, Rupp MT, et al. Evaluation of a user guidance reminder to improve the quality of electronic prescription messages. Appl Clin Inform. 2014;5(3):699-707.
19. Grossman JM, Cross DA, Boukus ER, Cohen GR. Transmitting and processing electronic prescriptions: experiences of physician practices and pharmacies. J Am Med Inform Assoc. 2012;19(3):353-359.
20. National Council for Prescription Drug Programs (NCPDP). NCPDP SCRIPT implementation recommendations. http://www.ncpdp.org/NCPDP/media/pdf/SCRIPTImplementationRecommendationsV1-29.pdf. Published December 2014. Accessed January 22, 2015.
21. Wolf MS, Shekelle P, Choudhry NK, Agnew-Blais J, Parker RM, Shrank WH. Variability in pharmacy interpretations of physician prescriptions. Med Care. 2009;47(3):370-373.
22. Singh H, Mani S, Espadas D, Petersen N, Franklin V, Petersen LA. Prescription errors and outcomes related to inconsistent information transmitted through computerized order entry: a prospective study. Arch Intern Med. 2009;169(10):982-989.
23. Hincapie AL, Warholak T, Altyar A, Snead R, Modisett T. Electronic prescribing problems reported to the Pharmacy and Provider ePrescribing Experience Reporting (PEER) portal. Res Social Adm Pharm. 2014;10(4):647-655.
24. Surescripts LLC. 2014 National progress report: more connected than ever before. http://surescripts.com/docs/default-source/national-progress-reports/surescripts-2014-national-progress-report.pdf. Published May 2014. Accessed July 19, 2015.
25. Raosoft.com. Sample size calculator by Raosoft, Inc. http://www.raosoft.com/samplesize.html. Posted 2004. Accessed June 13, 2015.
26. National Council for Prescription Drug Programs. SCRIPT Standard Implementation Guide, Version 10.6. Scottsdale, AZ: National Council for Prescription Drug Programs; October 2014.
27. Light RJ. Measures of response agreement for qualitative data: some generalizations and alternatives. Psychol Bull. 1971;76(5):365-377.
28. Halamka JD. There’s more to eprescribing standards than you think. http://geekdoctor.blogspot.com/2014/07/theres-more-to-eprescribing-standards.html. Published July 28, 2014. Accessed June 5, 2015.
29. Allen AS, Sequist TD. Pharmacy dispensing of electronically discontinued medications. Ann Intern Med. 2012;157(10):700-705.
30. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system–related errors. J Am Med Inform Assoc. 2004;11(2):104-112.
31. McDonald CJ, Overhage JM, Mamlin BW, Dexter PD, Tierney WM. Physicians, information technology, and health care systems: a journey, not a destination. J Am Med Inform Assoc. 2004;11(2):121-124.
32. Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197-1203.
33. Sittig DF, Singh H. Defining health information technology–related errors: new developments since To Err Is Human. Arch Intern Med. 2011;171(14):1281-1284.
34. Ratwani RM, Fairbanks RJ, Hettinger AZ, Benda NC. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc. 2015;22(6):1179-1182.
35. Hansen LB, Fernald D, Araya-Guerra R, Westfall JM, West D, Pace W. Pharmacy clarification of prescriptions ordered in primary care: a report from the Applied Strategies for Improving Patient Safety (ASIPS) collaborative. J Am Board Fam Med. 2006;19(1):24-30.
36. Nanji KC, Rothschild JM, Salzberg C, et al. Errors associated with outpatient computerized prescribing systems. J Am Med Inform Assoc. 2011;18(6):767-773.