Figure. Comparison of System Usability Scale (SUS) Scores for 2014 and 2015 Certified Products by Vendor

Vendor-reported electronic health record (EHR) SUS scores for 2014 and 2015 certified products are compared with the average benchmark (dotted line) and above-average benchmark (solid line) SUS scores.

    Research Letter
    Health Informatics
    December 13, 2019

    Evaluating Improvements and Shortcomings in Clinician Satisfaction With Electronic Health Record Usability

    Author Affiliations
    • 1University of Virginia, Charlottesville, Virginia
    • 2MedStar Health National Center for Human Factors in Healthcare, MedStar Health, Georgetown University School of Medicine, Washington, DC
    JAMA Netw Open. 2019;2(12):e1916651. doi:10.1001/jamanetworkopen.2019.16651
    Introduction

    With the widespread adoption of electronic health records (EHRs), there is increased focus on addressing the challenges of EHR usability, ie, the extent to which the technology enables users to achieve their goals effectively, efficiently, and satisfactorily.1 Poor usability is associated with clinician job dissatisfaction and burnout and could have patient safety consequences.2-4

    The US Department of Health and Human Services Office of the National Coordinator for Health Information Technology established safety-enhanced design certification requirements for EHRs to promote usability. These requirements stipulate that vendors must conduct and report results of formal usability testing, including measuring satisfaction with the EHR system.5 Results are publicly available. While some vendors use a 5-point ease-of-use rating scale, most use the System Usability Scale (SUS), a validated posttest questionnaire that measures user satisfaction with product usability.6 The questionnaire provides a score (range, 0-100) based on a participant's ratings of 10 statements regarding a product's usability, with higher scores indicating greater satisfaction.6 Based on an analysis of more than 200 studies of various products in various industries, an SUS score of 68 is considered the average benchmark, and an SUS score of 80 is considered the above-average benchmark.6 Recognizing the importance of satisfaction with EHR usability to clinician burnout and patient safety, we compared reported SUS scores for EHR products certified to the 2015 requirements with scores for 2014-certified products and with these benchmarks to evaluate whether satisfaction is improving.2-4
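
    For readers unfamiliar with how an SUS score is derived, the following is a minimal sketch in Python of the standard SUS scoring procedure (illustrative only; it is not code from this study). Odd-numbered items are positively worded and even-numbered items negatively worded, and the summed item contributions are scaled to the 0 to 100 range.

        def sus_score(ratings):
            """Standard SUS scoring for ten item ratings on a 1-5 scale.

            Odd-numbered items (positively worded) contribute rating - 1;
            even-numbered items (negatively worded) contribute 5 - rating.
            The sum (0-40) is multiplied by 2.5 to yield a 0-100 score.
            """
            if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
                raise ValueError("SUS requires ten ratings on a 1-5 scale")
            odd = sum(r - 1 for r in ratings[0::2])   # items 1, 3, 5, 7, 9
            even = sum(5 - r for r in ratings[1::2])  # items 2, 4, 6, 8, 10
            return 2.5 * (odd + even)

        # Hypothetical participant who agrees with the positive items and
        # disagrees with the negative ones
        print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0

    A score of 75.0, for example, would fall between the average (68) and above-average (80) benchmarks described above.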

    Methods

    Per Common Rule, institutional review board approval was not required for this study because these are publicly available data sets that do not contain protected human participant information. This report followed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.

    We identified the 70 EHR vendors with the most attestations to meaningful use from health care facilities between July 1, 2016, and April 30, 2018. For inclusion in the analysis, a vendor must have had an EHR product with computerized provider order entry functionality that was certified according to the safety-enhanced design criterion and had reported SUS scores under both the 2014 and 2015 certification requirements. For each vendor, we retrieved the usability report for the most recent version of the product meeting the 2014 certification requirements (ie, certified before January 14, 2016, when the 2015 certification requirements became effective) and the usability report for the most recent version of the product meeting the 2015 certification requirements, and we analyzed the SUS scores. A paired t test, with 2-tailed P < .05 indicating statistical significance, was used to assess differences in SUS scores between 2014 and 2015 products; means and standard deviations are reported. All statistical analyses were performed with SPSS statistical software version 25 (IBM Corp).
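
    The analysis itself was performed in SPSS; purely as an illustration, an equivalent paired t test in Python (scipy) on hypothetical per-vendor scores, standing in for the 27 vendors' publicly reported values, would look like this sketch.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical paired scores for 27 vendors; the real analysis used
        # the vendor-reported SUS scores, not simulated data. The simulated
        # mean shift (1.8) mirrors the reported difference in means.
        sus_2014 = rng.normal(73.2, 16.6, size=27)
        sus_2015 = sus_2014 + rng.normal(1.8, 10.0, size=27)  # correlated pairs

        t_stat, p_value = stats.ttest_rel(sus_2015, sus_2014)  # 2-tailed by default
        print(f"t(26) = {t_stat:.3f}, P = {p_value:.2f}")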

    Results

    A total of 27 vendors met the inclusion criteria. Mean (SD) SUS scores for 2014 and 2015 products did not differ significantly (73.2 [16.6] vs 75.0 [14.2]; t26 = 0.674; P = .51). Compared with the benchmarks, 9 of the 2014 products (33%) were below the average benchmark SUS score of 68, 18 (67%) were at or above it, and 11 (41%) met or exceeded the above-average benchmark score of 80 (Figure). For 2015 products, 7 (26%) were below the average benchmark, 20 (74%) were at or above it, and 12 (44%) met or exceeded the above-average benchmark. Between 2014 and 2015, SUS scores decreased for 12 products (44%), increased for 13 (48%), and were unchanged for 2 (7%).
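
    The benchmark comparisons above reduce to simple counting against the two thresholds; a sketch with hypothetical scores (not the study data):

        AVERAGE, ABOVE_AVERAGE = 68, 80  # Lewis and Sauro benchmarks (reference 6)

        def benchmark_counts(scores):
            """Count products below, at or above, and well above the benchmarks."""
            below = sum(s < AVERAGE for s in scores)
            at_or_above = sum(s >= AVERAGE for s in scores)
            above_avg = sum(s >= ABOVE_AVERAGE for s in scores)
            return below, at_or_above, above_avg

        print(benchmark_counts([55.0, 70.0, 82.5, 90.0]))  # (1, 3, 2)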

    Discussion

    There was no statistically significant improvement in EHR SUS scores between products certified according to the 2014 and 2015 standards. One-third of 2014 products and one-quarter of 2015 products fell below the average benchmark SUS score. Despite the implications of dissatisfaction with EHRs for clinician burnout and patient safety, SUS scores decreased for 44% of vendors from 2014 to 2015.2-4

    This study has limitations. Vendor-reported SUS scores may not reflect satisfaction with implemented EHRs, and only a subset of vendors were analyzed because of differences in methods for measuring satisfaction.

    Based on vendor-reported SUS scores, clinician satisfaction with EHR usability is not improving for many widely used products. An increased focus on clinician end users during product design and development as well as optimized certification requirements are needed to improve usability.

    Article Information

    Accepted for Publication: October 11, 2019.

    Published: December 13, 2019. doi:10.1001/jamanetworkopen.2019.16651

    Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2019 Gomes KM et al. JAMA Network Open.

    Corresponding Author: Raj M. Ratwani, PhD, MedStar Health National Center for Human Factors in Healthcare, MedStar Health, 3007 Tilden St, Ste 7M, Washington, DC 20008 (raj.m.ratwani@medstar.net).

    Author Contributions: Ms Gomes had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: Both authors.

    Acquisition, analysis, or interpretation of data: Both authors.

    Drafting of the manuscript: Both authors.

    Critical revision of the manuscript for important intellectual content: Gomes.

    Statistical analysis: Gomes.

    Obtained funding: Ratwani.

    Administrative, technical, or material support: Ratwani.

    Supervision: Ratwani.

    Conflict of Interest Disclosures: Dr Ratwani reported receiving grants from the Agency for Healthcare Research and Quality outside the submitted work. No other disclosures were reported.

    Funding/Support: This work was supported by grant number R01 HS025136 from the Agency for Healthcare Research and Quality to Dr Ratwani.

    Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

    References
    1. International Organization for Standardization. Ergonomics of human-system interaction, part 11: usability: definitions and concepts. https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:en. Accessed July 22, 2019.
    2. Babbott S, Manwell LB, Brown R, et al. Electronic medical records and physician stress in primary care: results from the MEMO Study. J Am Med Inform Assoc. 2014;21(e1):e100-e106. doi:10.1136/amiajnl-2013-001875
    3. Friedberg MW, Chen PG, Van Busum KR, et al. Factors Affecting Physician Professional Satisfaction and Their Implications for Patient Care, Health Systems, and Health Policy. Santa Monica, CA: The RAND Corporation; 2013.
    4. Howe JL, Adams KT, Hettinger AZ, Ratwani RM. Electronic health record usability issues and potential contribution to patient harm. JAMA. 2018;319(12):1276-1278. doi:10.1001/jama.2018.1171
    5. Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services (HHS). 2015 edition Health Information Technology (Health IT) certification criteria, 2015 edition base electronic health record (EHR) definition and ONC Health IT certification program modifications: final rule. Fed Regist. 2015;80(200):62601-62759.
    6. Lewis JR, Sauro J. Item benchmarks for the system usability scale. J Usability Stud. 2018;13(3):158-167. http://uxpajournal.org/wp-content/uploads/sites/8/pdf/JUS_Lewis_May2018.pdf. Accessed October 25, 2019.