Many electronic health records (EHRs) have poor usability, leading to user frustration and safety risks.1 Usability is the extent to which the technology helps users achieve their goals in a satisfying, effective, and efficient manner within the constraints and complexities of their work environment.2
The US Department of Health and Human Services’ Office of the National Coordinator for Health Information Technology (ONC) has established certification requirements to promote usability practices by EHR vendors as a part of the meaningful use program. To develop a certified EHR, vendors are required to attest to using user-centered design (UCD), a process that places the cognitive and information needs of the frontline user at the forefront of software development, and to conduct formal usability testing on 8 different EHR capabilities to ensure the product meets performance objectives.3
Third-party ONC-authorized certification bodies are responsible for certifying EHR products. Vendors are required to provide a written statement naming the UCD process they used and describing the process if it is not a recognized industry standard.3 Vendors must also provide written results of their usability tests, including the number, clinical background, and demographics of the participants. The ONC has endorsed guidelines from the National Institute of Standards and Technology stipulating that usability testing should include at least 15 representative end-user participants.4 Reports must be made public once the product is certified. We analyzed these reports to determine whether usability certification requirements and testing standards were met.
The ONC maintains a web-based authoritative listing of all certified vendor products and the associated certification reports. Reports meeting the 2014 certification requirements were retrieved for the 50 EHR vendors with the highest number of providers (hospitals and small private practices) attesting to meeting meaningful use requirements with that product between April 1, 2013, and November 30, 2014, representing more than 90% of provider attestations during this period.5 For vendors with multiple certified EHR products, we included the report for the product with the most provider attestations; if that report lacked usability information, the report for the vendor's next most frequently attested product was analyzed.
From each report, we extracted the stated UCD process and the number and clinical background of usability test participants. Participants whose occupation involved direct patient contact were categorized as having a clinical background. Because some vendors use different numbers and types of participants to test different capabilities, we focused on computerized provider order entry because it is used primarily by clinicians and poses significant safety hazards when poorly designed.6
Of 50 certified vendor reports, 41 (82%) were available for review; the remaining 9 (18%) were not publicly available. Of the 41 vendors, 14 (34%) had not met the ONC certification requirement of stating their UCD process (Figure 1), 19 (46%) used an industry-standard UCD process, and 6 (15%) used an internally developed UCD process.
There was variability in the number of participants enrolled in the usability tests (mean [SD], 14 [10] participants; range, 4-51). Of the 41 vendors, 26 (63%) used fewer than the standard of 15 participants (Figure 2), and only 9 (22%) used at least 15 participants with clinical backgrounds. In addition, 1 (2%) of the 41 vendors used no participants with a clinical background, 7 (17%) used no physician participants, and 2 (5%) used their own employees. Of the 41 vendor reports available, 5 (12%) lacked enough detail to determine whether physicians participated and 21 (51%) did not provide the required demographic details.
Our findings reveal a lack of adherence to ONC certification requirements and usability testing standards among several widely used EHR products that were certified as having met these requirements.
Limitations include the restriction to a subset of vendors and products, the sole focus on computerized provider order entry, and the reliance on vendor self-reports.
The lack of adherence to usability testing standards may be a major factor contributing to the poor usability experienced by clinicians. Enforcement of existing standards, specific usability guidelines, and greater scrutiny of vendor UCD processes may be necessary to achieve the functional and safety goals for the next generation of EHRs.
Corresponding Author: Raj M. Ratwani, PhD, MedStar Health, 3007 Tilden St, Washington, DC 20008 (raj.ratwani@medicalhfe.org).
Author Contributions: Dr Ratwani had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Ratwani.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Ratwani.
Critical revision of the manuscript for important intellectual content: Benda, Hettinger, Fairbanks.
Statistical analysis: Ratwani.
Administrative, technical, or material support: All authors.
Study supervision: All authors.
Conflict of Interest Disclosures: The authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Dr Fairbanks reported receiving grant support from the Agency for Healthcare Research and Quality. No other disclosures were reported.
1. Institute of Medicine. Health IT and Patient Safety: Building Safer Systems for Better Care. Washington, DC: National Academies Press; 2012.
3. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 45 CFR part 170 (2014).
5. Office of the National Coordinator for Health Information Technology. Certified health IT product list. http://oncchpl.force.com/ehrcert. Accessibility verified August 7, 2015.
6. Magrabi F, Ong MS, Runciman W, Coiera E. An analysis of computer-related patient safety incidents to inform the development of a classification. J Am Med Inform Assoc. 2010;17(6):663-670.