Table 1.
Definitions and Examples of Electronic Health Record (EHR) Usability and Clinical Processes Issues Identified in Possible Patient Harm Reports, 2013-2016
Table 2.
Frequency of Clinical Process and Usability Issues Identified in the Possible Patient Harm Reports, 2013-2016
Research Letter
March 27, 2018

Electronic Health Record Usability Issues and Potential Contribution to Patient Harm

Author Affiliations
  • 1National Center for Human Factors in Healthcare, MedStar Health, Washington, DC
JAMA. 2018;319(12):1276-1278. doi:10.1001/jama.2018.1171

Electronic health record (EHR) usability, which is the extent to which EHRs support clinicians in achieving their goals in a satisfying, effective, and efficient manner, is a point of frustration for clinicians and can have patient safety consequences.1,2 However, specific usability issues and EHR clinical processes that contribute to possible patient harm across different health care facilities have not been identified.3 We analyzed reports of possible patient harm that explicitly mentioned a major EHR vendor or product.4

Methods

This study was approved by the MedStar Health Institutional Review Board with an informed consent waiver. Patient safety reports, which are free-text descriptions of safety events, were analyzed from 2013 through 2016. Reports were retrieved from the Pennsylvania Patient Safety Authority database, which collects reports from 571 health care facilities in Pennsylvania, and from a large multihospital academic health care system in the mid-Atlantic, outside of Pennsylvania. Reports were voluntarily entered by health care staff, mostly nurses, and included several sentences describing the safety event, contributing factors, and a categorization of the effect on the patient. This categorization indicates whether the event reached the patient (meaning additional health care services were required), whether there was harm at the time of reporting, and the potential for harm to the patient. The harm categories were (1) reached the patient and potentially required monitoring to preclude harm, (2) potentially caused temporary harm, (3) potentially caused permanent harm, and (4) could have necessitated intervention to sustain life or could have resulted in death.

Reports were included for analysis if 1 of the top 5 EHRs (vendors or products, based on the number of health care organizations attesting to meeting meaningful use requirements with that EHR) was mentioned and if the report was categorized as reaching the patient with possible harm.4 Two usability experts reviewed the reports to determine whether each contained explicit language associating the possible harm with an EHR usability issue. Usability-related reports were further categorized into 1 of 7 usability topics describing the primary usability challenge, based on a synthesis of previous taxonomies, and into 1 of 4 EHR clinical processes, based on existing categories of EHR interactions (Table 1). A subset of the data (15%) was dually coded. Interrater reliability κ scores were 0.90 (usability as a contributing factor), 0.83 (usability category), and 0.87 (EHR clinical process).
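The letter does not specify which κ statistic was used for the dually coded subset; assuming a standard two-rater Cohen's κ on categorical codes, a minimal sketch of the computation is:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels to the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[label] / n) * (freq_b[label] / n) for label in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters code 4 reports as usability-related ("u") or not ("n").
kappa = cohens_kappa(["u", "u", "n", "n"], ["u", "n", "n", "n"])  # 0.5
```

Values near 0.8-0.9, as reported here, are conventionally read as strong agreement beyond chance.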

Results

Of 1.735 million reported safety events, 1956 (0.11%) explicitly mentioned an EHR vendor or product and were reported as possible patient harm, and 557 (0.03%) had language explicitly suggesting that EHR usability contributed to possible patient harm. Among the 557 reports, the harm levels were: reached the patient and potentially required monitoring to preclude harm (84%, n = 468), potentially caused temporary harm (14%, n = 80), potentially caused permanent harm (1%, n = 7), and could have necessitated intervention to sustain life or could have resulted in death (<1%, n = 2).

Across the 7 usability categories, the challenges were data entry (27%, n = 152), alerting (22%, n = 122), interoperability (18%, n = 102), visual display (9%, n = 52), availability of information (9%, n = 50), system automation and defaults (8%, n = 43), and workflow support (7%, n = 36). Across the 4 EHR clinical processes, usability challenges occurred during order placement (38%, n = 213), medication administration (37%, n = 207), review of results (16%, n = 87), and documentation (9%, n = 50) (Table 2).
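Each taxonomy above partitions the same 557 usability-related reports, so the counts within each should sum to 557. A quick arithmetic check, with counts taken from the Results text and category labels abbreviated:

```python
# Counts from the Results section (labels abbreviated for brevity).
harm_levels = {"monitoring": 468, "temporary harm": 80,
               "permanent harm": 7, "life-sustaining/death": 2}
usability = {"data entry": 152, "alerting": 122, "interoperability": 102,
             "visual display": 52, "availability of information": 50,
             "automation and defaults": 43, "workflow support": 36}
processes = {"order placement": 213, "medication administration": 207,
             "review of results": 87, "documentation": 50}

# Each taxonomy should account for all 557 usability-related reports.
for name, tally in [("harm", harm_levels), ("usability", usability),
                    ("process", processes)]:
    assert sum(tally.values()) == 557, name
```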

Discussion

EHR usability may have been a contributing factor in some possible patient harm events. Only a small percentage of potential harm events were associated with EHR usability, but the analysis was conservative: safety reports capture only a small fraction of the actual number of safety incidents, and only reports with explicit mentions of the top 5 vendors or products were included.5

Patient safety reports contain limited information, making it difficult to identify causal factors, and may be subject to reporter bias, inaccuracies, and a tendency to attribute blame for an event to the EHR. Additional research is needed to determine causal relationships between EHR usability and patient harm, and their frequency of occurrence. Although federal policies promote EHR usability and safety, additional research may support policy refinement.6

Section Editor: Jody W. Zylke, MD, Deputy Editor.
Article Information

Accepted for Publication: January 26, 2018.

Corresponding Author: Raj Ratwani, PhD, National Center for Human Factors in Healthcare, MedStar Health, 3007 Tilden St NW, Ste 7L, Washington, DC 20008 (raj.m.ratwani@medstar.net).

Author Contributions: Dr Ratwani had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Howe, Adams, Ratwani.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: All authors.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Howe, Adams, Ratwani.

Obtained funding: Ratwani.

Administrative, technical, or material support: All authors.

Supervision: Ratwani.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

Funding/Support: This project was funded by grant R01 HS023701-02 from the Agency for Healthcare Research and Quality (AHRQ) of the US Department of Health and Human Services (Dr Ratwani).

Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Disclaimer: The opinions expressed in this document are those of the authors and do not reflect the official position of AHRQ or the US Department of Health and Human Services.

References
1.
Institute of Medicine. Health IT and Patient Safety: Building Safer Systems for Better Care. https://www.nap.edu/catalog/13269/health-it-and-patient-safety-building-safer-systems-for-better. Accessed February 19, 2018.
2.
International Organization for Standardization. ISO/IEC 25010:2011—Systems and software engineering—Systems and software quality requirements and evaluation (SQuaRE). https://www.iso.org/obp/ui/#iso:std:iso-iec:25010:ed-1:v1:en. Accessed February 19, 2018.
3.
Ellsworth MA, Dziadzko M, O’Horo JC, Farrell AM, Zhang J, Herasevich V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. 2016;24(1):218-226.
4.
Office of the National Coordinator for Health Information Technology. Certified health IT developers and editions reported by hospitals participating in the Medicare EHR incentive program. https://dashboard.healthit.gov/quickstats/pages/FIG-Vendors-of-EHRs-to-Participating-Hospitals.php. Accessed April 7, 2016.
5.
Cullen DJ, Bates DW, Small SD, Cooper JB, Nemeskal AR, Leape LL. The incident reporting system does not detect adverse drug events: a problem for quality improvement. Jt Comm J Qual Improv. 1995;21(10):541-548.
6.
Ratwani RM, Benda NC, Hettinger AZ, Fairbanks RJ. Electronic health record vendor adherence to usability certification requirements and testing standards. JAMA. 2015;314(10):1070-1071.