Table 1. Eight Sociotechnical Dimensions Guiding Data Analysis
Table 2. Health Information Technology (HIT) Safety Domains Guiding Data Analysis
Table 3. Health Information Technology (HIT)–Related Safety Concerns by HIT Safety Domain and Sociotechnical Dimension
Table 4. Health Information Technology–Related High-risk Areas in Root Cause Analysis (RCA) Events Associated With Diagnostic Delays
Table 5. High-risk Areas and Suggested Interventions for Health Information Technology–Related Diagnostic Delays
Original Investigation
Health Informatics
June 25, 2020

Assessment of Health Information Technology–Related Outpatient Diagnostic Delays in the US Veterans Affairs Health Care System: A Qualitative Study of Aggregated Root Cause Analysis Data

Author Affiliations
  • 1Veterans Affairs (VA) National Center for Patient Safety, Ann Arbor, Michigan
  • 2School of Biomedical Informatics, The University of Texas Health Science Center at Houston
  • 3Department of Urology, University of Michigan, Ann Arbor
  • 4Center for Innovations in Quality, Effectiveness, and Safety (IQuESt) at the Michael E. DeBakey VA Medical Center and Baylor College of Medicine, Houston, Texas
JAMA Netw Open. 2020;3(6):e206752. doi:10.1001/jamanetworkopen.2020.6752
Key Points

Question  What can be learned from analyzing health information technology–related outpatient diagnostic delays in a large, integrated health care system?

Findings  In this cohort study of 214 root cause analyses, aggregated root cause analysis data involving health information technology and outpatient diagnostic delays in the Department of Veterans Affairs from 2013 to 2018 suggest that most safety concerns (83%) involved problems with safe use of technology, which were predominantly attributable to sociotechnical factors associated with people, workflow and communication, and a poorly designed human-computer interface.

Meaning  This study suggests multiple interventions may be used to address outpatient diagnostic delays through improved design, configuration, and use of health information technology.

Abstract

Importance  Diagnostic delay in the outpatient setting is an emerging safety priority that health information technology (HIT) should help address. However, diagnostic delays have persisted, and new safety concerns associated with the use of HIT have emerged.

Objective  To analyze HIT-related outpatient diagnostic delays within a large, integrated health care system.

Design, Setting, and Participants  This cohort study involved qualitative content analysis of safety concerns identified in aggregated root cause analysis (RCA) data related to HIT and outpatient diagnostic delays. The setting was the US Department of Veterans Affairs using all RCAs submitted to the Veterans Affairs (VA) National Center for Patient Safety from January 1, 2013, to July 31, 2018.

Main Outcomes and Measures  Common themes associated with the role of HIT-related safety concerns were identified and categorized according to the Health IT Safety framework for measuring, monitoring, and improving HIT safety. This framework includes 3 related domains (ie, safe HIT, safe use of HIT, and using HIT to improve safety) situated within an 8-dimensional sociotechnical model accounting for interacting technical and nontechnical variables associated with safety. Hence, the identified themes enhanced understanding of both the sociotechnical context and the domain of HIT safety involved.

Results  Of 214 RCAs categorized by the terms delay and outpatient submitted during the study period, 88 were identified as involving diagnostic delays and HIT, from which 172 unique HIT-related safety concerns were extracted (mean [SD], 1.97 [1.53] per RCA). Most safety concerns (82.6% [142 of 172]) involved problems with safe use of HIT, predominantly sociotechnical factors associated with people, workflow and communication, and a poorly designed human-computer interface. Fewer safety concerns involved problems with safe HIT (14.5% [25 of 172]) or using HIT to improve safety (2.9% [5 of 172]). The following 5 key high-risk areas for diagnostic delays emerged: managing electronic health record inbox notifications and communication, clinicians gathering key diagnostic information, technical problems, data entry problems, and failure of a system to track test results.

Conclusions and Relevance  This qualitative study of a national RCA data set suggests that interventions to reduce outpatient diagnostic delays could aim to improve test result management, interoperability, data visualization, and order entry, as well as to decrease information overload.

Introduction

Diagnostic delays are a major threat to outpatient safety.1 Health information technology (HIT) can reduce diagnostic delays by reliably transmitting and tracking test results, supporting intelligent test selection, improving information access and display, and facilitating electronic communication.2,3 However, problems persist despite electronic health record (EHR) implementation, and new unintended safety concerns have emerged, spurring efforts to understand the consequences of HIT on diagnosis.2,4-13

For instance, inadequate test result follow-up is a substantial cause of diagnostic delays in EHR-enabled settings.14 Although electronic transmission of test results is more reliable than paper-based transmission, action on test results may still be delayed.15-23 More than one-third of patients with lung cancer experience diagnostic delays, mostly from delayed follow-up of abnormal imaging findings.24 Similar follow-up failures can occur in bladder, gastrointestinal, and breast cancer diagnosis.25-28

The use of EHRs enables electronic notification of test results directly to clinicians’ inboxes. However, primary care practitioners (PCPs) receive excessive notifications that increase the risk of failing to see key information: about one-third of PCPs admit to missing abnormal test results.29,30 Many PCPs spend more than 1 hour daily on inbox management alone.31-33

Addressing diagnostic delays in the context of HIT requires improved understanding of complex systems that accounts for interactions between technology, its users, involved workflows, and organizational policies and procedures.34 In this study, we used the analytic lens of the Health IT Safety framework6 to generate a better understanding of diagnostic delays in the setting of HIT. The Health IT Safety framework provides a conceptual foundation for measuring, monitoring, and improving HIT safety and includes 3 related domains situated within an 8-dimensional sociotechnical model accounting for interacting technical and nontechnical variables associated with safety (Table 1).7 These domains include (1) safe HIT, (2) safe use of HIT, and (3) using HIT to improve safety (Table 2).6 We applied the Health IT Safety framework to a database of aggregated root cause analyses (RCAs) conducted within the Veterans Affairs (VA) health care system to characterize the role of HIT safety in outpatient diagnostic delays and to lay a foundation for potential solutions.

Methods
Design and Setting

In this retrospective cohort study, qualitative content analysis was performed to evaluate the role of HIT in RCAs of outpatient diagnostic delays submitted to the VA National Center for Patient Safety (NCPS). The VA health care system provides care to 9 million veterans at 172 medical centers and 1069 outpatient sites using a comprehensive, in-house–designed EHR that has been integrated into all facilities since 2000.35 The NCPS leads patient safety initiatives within the VA health care system and uses RCAs of adverse events and close calls to promote learning across the system. In addition, the NCPS provides extensive training for local patient safety managers responsible for conducting RCAs, emphasizing a focus on systems rather than individual errors and taking into account principles of just culture, human factors engineering, and EHR usability.36 Patient safety managers are responsible for assembling the local RCA team, which may or may not include an IT professional. On completion of an RCA, the team presents recommendations to local leadership and submits a detailed report to the NCPS database, including a narrative description of the event, root causes, contributing factors, lessons learned, and action plans designed to prevent future adverse events. Previous aggregated reports highlighted process breakdowns in outpatient diagnostic delays.14 Study approval and oversight were provided by the Ann Arbor VA Research and Development Committee. Informed consent was waived because of the use of deidentified data.

Data Collection

The setting was the US Department of Veterans Affairs. All RCAs categorized by the terms delay and outpatient and submitted to the VA NCPS from January 1, 2013, to July 31, 2018, were extracted (n = 214) (eFigure in the Supplement). Methods and findings are reported based on the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.37 We identified RCAs associated with delay in diagnosis (vs treatment or surgery) through manual review (n = 135). One of us (L.P.) reviewed each case in detail to identify only those RCAs involving some aspect of HIT (n = 88) for subsequent detailed analysis. For each RCA, we recorded the diagnosis (if known), diagnostic delay time (ie, time from initial missed finding to diagnosis) if sufficient data were available, and information on diagnostic tests. One of us (L.P.) extracted unique HIT-related safety concerns from each RCA for subsequent coding. We coded safety concerns using content analysis, applying both a deductive approach to code according to the Health IT Safety framework and an inductive approach to allow for emergent codes to provide a richer descriptive analysis and development of subsequent themes.38-42

Qualitative Analysis

To ensure consistent application of the Health IT Safety framework, the first 30 RCAs (involving 77 unique safety concerns) were coded collaboratively by 3 of us (L.P., D.F.S., and H.S.) according to the HIT safety domain that was applicable (safe HIT, safe use of HIT, or using HIT to improve safety) and 1 or more of the 8 sociotechnical dimensions involved (Table 1). Discrepancies were resolved by consensus. After a shared understanding of the approach was achieved through consistent agreement, one of us (L.P.) coded the remaining RCAs, seeking input in complex cases that involved uncertainty in categorization. Throughout the deductive coding process, we also applied open inductive coding to identify emerging patterns. Content of emergent codes was further refined and combined via a collaborative and iterative analysis of the entire data set. Emergent codes were then organized into higher-level themes to identify high-risk areas to target potential solutions. Similar to previous work,9,43 we used consensus methods for coding. In addition, because the categories that we coded were not mutually exclusive, we were not able to calculate interrater reliability. However, such consensus methods improve identification of high-risk areas (compared with independent coding) because of multidisciplinary discussions between clinicians, safety experts, and informaticians. Microsoft Excel software (Microsoft Corp) was used to code and analyze the data.
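As an illustration of how such coded data can be tallied, the sketch below is hypothetical (invented rows and column names, with pandas standing in for the Microsoft Excel workflow described above): each safety concern is stored as one row per tagged sociotechnical dimension, and a cross-tabulation then yields counts of the kind summarized in Table 3.

```python
import pandas as pd

# Hypothetical coded safety concerns: one row per (concern, dimension) pair,
# because a single concern may involve several sociotechnical dimensions.
concerns = pd.DataFrame({
    "concern_id": [1, 1, 2, 3],
    "domain": [
        "safe use of HIT",
        "safe use of HIT",
        "safe HIT",
        "using HIT to improve safety",
    ],
    "dimension": [
        "workflow and communication",
        "people",
        "hardware and software",
        "measurement and monitoring",
    ],
})

# Cross-tabulate dimensions against HIT safety domains, analogous to Table 3.
print(pd.crosstab(concerns["dimension"], concerns["domain"]))
```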

Results

Of 214 RCAs included, 88 involved HIT-related safety factors in diagnostic delays. Delayed diagnoses involved cancer (n = 55), infection (n = 10), cardiovascular disease (n = 6), and other (n = 5) (ie, 2 diabetic ketoacidosis, 1 testicular torsion, 1 amyloidosis, and 1 benign pancreatic mass). In 12 cases, local site investigators did not report a single final diagnosis and instead described breakdowns in the diagnostic process (eg, loss of >1000 test result letters from a printer malfunction and delays in triage, consultations, and processing laboratory specimens). For cases in which sufficient information was available to calculate diagnostic delay time (n = 69), the median diagnostic delay was 6 months (range, 4 days to 60 months; interquartile range, 2-12 months). In most RCAs involving HIT and diagnostic delays (n = 64), the primary process breakdown involved inadequate follow-up of 1 or more abnormal test results, including imaging (n = 42), laboratory tests (n = 15), and biopsies (n = 8). In 1 RCA, 2 results were delayed (both an imaging result and a biopsy result).
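The delay statistics reported above are standard percentile summaries; a minimal sketch follows, with hypothetical delay values chosen so the output mirrors the reported median and interquartile range.

```python
import numpy as np

# Hypothetical diagnostic delay times in months (4 days is about 0.13 months);
# the values are invented to mirror the reported median of 6 months (IQR, 2-12).
delays_months = np.array([0.13, 2, 2, 6, 6, 12, 12, 60])

median = np.median(delays_months)
q1, q3 = np.percentile(delays_months, [25, 75])
print(f"median = {median} months; IQR = {q1}-{q3} months")
# median = 6.0 months; IQR = 2.0-12.0 months
```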

From 88 RCAs, 172 unique HIT-related safety concerns (mean [SD], 1.97 [1.53] per RCA) were extracted. Table 3 summarizes categorization of the safety concerns according to HIT safety domain and sociotechnical dimension. Twenty-five safety concerns (14.5%) involved problems with safe HIT, primarily issues with hardware and software, clinical content, and human-computer interface. Examples included failure of test results to transmit, equipment malfunction, and issues with upgrades. Most safety concerns (142 [82.6%]) involved problems with safe use of HIT, predominantly sociotechnical factors associated with workflow and communication, people, and a poorly designed human-computer interface. Examples included failure to respond to inbox notifications, lack of EHR proficiency training, and failure to assign surrogates for inbox coverage. There were 5 safety concerns (2.9%) involving using HIT to improve safety through system measurement and monitoring. In all 5, HIT was used to generate a list of high-risk patient test results for follow-up by the clinical team (eg, positive hepatitis C test results and biopsy results), but follow-up and diagnoses were substantially delayed.

During the process of analyzing and coding safety concerns according to HIT safety domain and sociotechnical dimension, several distinct (but not mutually exclusive) themes emerged. These themes were classified into the following 5 high-risk areas associated with diagnostic delays: managing EHR inbox notifications and communication, clinicians gathering key diagnostic information, technical problems, data entry problems, and failure of a system to track test results (Table 4).

Managing EHR Inbox Notifications and Communication

Clinicians rely on the EHR inbox for various types of electronic communication associated with test results, referrals, medication refill requests, patient portal messages, and phone calls. The following 2 notable issues occurred.

Notification Sent but Not Acted On

In 1 case, a PCP missed a notification from a specialist to order a mammogram for a patient with abnormal breast examination findings. In another case, a clinician was notified via a note to correct an order but simply signed off the note instead of making the correction. A third clinician processed multiple test results within the EHR inbox all at once, missed a positive stool test result, and thus failed to order a follow-up colonoscopy.

Several factors contributed to failure to act on notifications. The first factor is notification fatigue and/or information overload. One clinician received more than 100 notifications daily, which was associated with notification fatigue and missed test results. The second factor is inadequate surrogate coverage for staff absence. Both overload of covering clinicians and failure to assign coverage for inbox notifications occurred. One covering clinician received more than 200 notifications in 1 day and subsequently missed an abnormal test result. In other cases, no one was assigned to cover staff on extended leave (eg, no one received biopsy results sent to a clinician on maternity leave) or temporary clinicians, such as residents and locum tenens who had left the organization (eg, no one received imaging results sent to a resident who had completed the rotation). Inadequate coverage of nurses, clerks, and coordinators also led to diagnostic delays. The third factor is inadequate system knowledge (new staff or lack of training). Insufficient training contributed to missed notifications, such as when a new clinician did not know how to process notifications efficiently and became overwhelmed. The fourth factor is ambiguous responsibility for follow-up. A dermatology e-consultant recommended to a PCP that a patient be seen face-to-face in the dermatology clinic. Neither the dermatologist nor the PCP placed the consult order because there was local variation in processes and it was unclear who was responsible for taking action. In another case, both the PCP and a specialist were notified of biopsy results, but neither took appropriate action.
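One safeguard implied by the second factor is explicit surrogate routing. The sketch below is a hypothetical illustration (the class and function names are invented and are not part of the VA EHR): notifications fall back to an assigned surrogate during an absence, and the lack of any assigned surrogate is escalated rather than leaving the message unread.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Clinician:
    name: str
    absent: bool = False                     # extended leave, departed, etc.
    surrogate: Optional["Clinician"] = None  # assigned inbox coverage

def route_notification(recipient: Clinician, message: str) -> str:
    """Deliver a notification, falling back to surrogate coverage."""
    if not recipient.absent:
        return f"delivered to {recipient.name}: {message}"
    if recipient.surrogate is not None:
        return f"delivered to surrogate {recipient.surrogate.name}: {message}"
    # The failure mode described above: no coverage was ever assigned,
    # so the result is escalated rather than silently queued.
    return f"ESCALATE: no surrogate assigned for {recipient.name}"

resident = Clinician("resident (rotation ended)", absent=True)
print(route_notification(resident, "abnormal imaging result"))
```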

Inadequate Electronic Communication (Delayed or Miscommunication)

Excessive reliance on EHR documentation for communicating time-sensitive or critical information (eg, through electronic messaging or notes) led to miscommunication and diagnostic delays. In 1 case, a mental health clinician wrote a critical laboratory note to convey important handoff information about a seriously ill patient being transferred to primary care rather than communicating verbally, delaying diagnosis of a life-threatening infection. In another instance, an emergency department (ED) clinician added a PCP as a cosigner on his note but buried abnormal imaging findings in the body of the note rather than in the assessment. The PCP read the note but missed critical information. Other electronic communication problems included using the wrong communication format (eg, placing important information regarding patient symptoms to be triaged in a scheduling tool rather than a triage tool) and relying too heavily on note templates that failed to communicate critical information (ie, low signal-to-noise ratio).

In the context of the Health IT Safety framework, safety concerns in this high-risk area involved mostly problems with safe use of HIT (ie, usability and workflow integration) rather than malfunctions of HIT itself as designed and often involved interactions of multiple sociotechnical domains. For example, 1 case of a missed inbox notification may have been associated with lack of clinician training in how to manage test results (ie, people), too many test results to process (ie, clinical content and workflow and communication), and poor visibility of an abnormal test result (ie, human-computer interface), reflecting the complex characteristics of these safety concerns.

Clinicians Gathering Key Diagnostic Information

There were several problems with clinicians gathering key diagnostic information. The first problem was a lack of interoperability (obtaining and viewing outside records). Issues with gathering information from both VA and non-VA clinicians included delays in obtaining records, missed fax reports, delays in outside organizations posting diagnostic information to web portals created specifically to share records, and failure to alert clinicians to review scanned records.44 The second problem was that necessary information was difficult to find, which often was associated with poor visibility of important data and low signal-to-noise ratio within the EHR. Examples included relevant information buried in hundreds of pages of scanned documents, abnormal findings located in the body rather than impression section of radiology reports, addenda to radiology reports and clinic notes missed because they were at the bottom of the screen, poor visibility of scanned laboratory results, and serious medical conditions buried in clinic notes rather than documented on the problem list. The third problem was that patients were seen in clinic without review of abnormal test results. Even though a patient was seen in clinic multiple times, prior abnormal test results were not reviewed. In the context of the Health IT Safety framework, safety concerns were mostly problems with safe use of HIT, in particular issues with human-computer interface (eg, cluttered screens with poor visibility of important data) and workflow and communication (eg, delays in obtaining records associated with missed fax reports).

Technical Problems

Technical safety concerns involved 6 problems with safe HIT, mainly malfunctioning hardware and software. The first safety concern was a failure to generate notifications. In some cases, no notifications were generated to cue clinicians to review outside records scanned into the medical record. In another case, the clinician had altered settings so that only abnormal test results would generate a notification. One laboratory test did not have an abnormal cutoff value listed and so was not flagged as abnormal even though it was. The clinician did not receive a notification and missed the test result. The second concern was malfunctioning radiology codes. The use of inactivated radiology codes failed to trigger notifications. The third concern was that notification disappeared on opening. Clinicians lost track of test results if they were interrupted when processing them as notifications disappeared after opening. The fourth concern was equipment malfunction. A malfunctioning printer failed to print more than 1000 test result notification letters. Laboratory processing equipment broke. The fifth concern was hidden dependencies. Orders were inadvertently left active in some places in the EHR when they were deactivated elsewhere. The sixth concern involved issues with software upgrades. For example, recall appointments were lost during an EHR software upgrade.
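The missing-cutoff case illustrates a general coding trap; the hedged sketch below (invented field names, not the VA laboratory software) shows how a comparison against an absent reference limit silently reports "normal," and how routing results without a configured limit to manual review avoids the trap.

```python
from typing import Optional

def flag_unsafe(value: float, upper_limit: Optional[float]) -> bool:
    # Buggy pattern: when no cutoff is configured, nothing is ever flagged,
    # so an abnormal result generates no notification.
    return upper_limit is not None and value > upper_limit

def flag_safe(value: float, upper_limit: Optional[float]) -> str:
    # Defensive pattern: a result with no configured cutoff is routed for
    # manual review rather than being reported as normal by default.
    if upper_limit is None:
        return "NEEDS REVIEW: no reference limit configured"
    return "ABNORMAL" if value > upper_limit else "normal"

print(flag_unsafe(250.0, None))  # False: the abnormal result is missed
print(flag_safe(250.0, None))    # NEEDS REVIEW: no reference limit configured
```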

Data Entry Problems
Order Entry

Subspecialty consultations were discontinued or delayed because of inadequate information in the electronic consultation order. In 1 case, a consultation for a new finding was transmitted to the urology service. However, the consultation was discontinued administratively because the patient was already being followed up in urology for another issue, and the new finding on the consultation order was missed. In another case, even though it was against the organization’s local policy, an ED clinician was able to place a consultation for an outpatient subspecialist directly from the ED, which was subsequently discontinued without the PCP being notified to reorder it. Another problem was the inability to communicate priority for an urgent order because the only categories available were stat and routine, associated with both overuse and underuse of stat for orders with an urgent need. Other issues involved outdated tests listed in order menus and poor visibility of existing decision support to help clinicians order the correct test. Certain cases involved lack of bundling of required orders (eg, a consult order to an outside institution to perform magnetic resonance imaging was not bundled with the required imaging order, allowing a clinician to order the consult but not the imaging).

Missing Documentation

Clinic notes were missing, and attempts to notify patients of test results were not documented properly. In the context of the Health IT Safety framework, most safety concerns involved problems with safe use of HIT, primarily associated with human-computer interface (poor design of order entry and decision support), workflow and communication (failure to document attempts to notify patients of test results), and people (missing clinic notes).

Failure of a System to Track Test Results

Only a few RCAs specifically mentioned problems with systems for tracking test results, although this likely involved most cases of missed test results. In certain cases, failure or lack of an established tracking system was the main safety concern, whereas in others an established tracking system broke down, such as when a melanoma finding was not entered into the biopsy tracking system and when recall software for colonoscopies malfunctioned. In 5 cases, a tracking system eventually followed up on test results (eg, a nurse reviewing a registry noted a positive stool test result and alerted the clinician to order a colonoscopy), but the diagnosis was delayed. Although in certain cases tracking systems were safety nets to eventually help detect missed test results, they were not always widely used, timely, or error-proof. Safety concerns in this high-risk area involved problems with using HIT to improve safety through system measurement and monitoring.

Discussion

In this retrospective cohort study, outpatient diagnostic delays involving HIT were analyzed, and many sociotechnical problems with safe use of HIT were found, primarily including issues with people (eg, lack of training and failure to act on notifications), workflow and communication (eg, inadequate surrogate coverage and electronic miscommunication), and human-computer interface (eg, order entry design and poorly visible information). Problems involving safe HIT were less common and primarily involved hardware and software and clinical content. Despite the use of test result tracking systems to improve safety, diagnoses were still delayed in a few cases. The following 5 key high-risk areas led to diagnostic delays: managing EHR inbox notifications and communication, clinicians gathering key diagnostic information, technical problems, data entry problems, and failure of a system to track test results.

Study findings confirm the presence of delays in serious diagnoses, including cancers, infections, and cardiovascular disease, because of missed follow-up of test results.24-28 Our study builds on prior evidence of high inbox notification burden29-33 and suggests harm from diagnostic delays directly attributable to information overload from excessive notifications. RCA data support previous literature highlighting the hazards of inadequate surrogate coverage and ambiguous responsibility in dual-alert communication (ie, notification of both ordering clinician and PCP).15,45,46 In addition, although the EHR facilitates asynchronous electronic communication between clinicians through both electronic messaging and note-based communication, this discouraged verbal communication in several situations and increased reliance on EHR templates, with subsequent risk of misunderstanding.10,12 Application of the Health IT Safety framework suggests that many problems with diagnostic delays described herein were associated with usability, design, and workflow integration.

Analysis of aggregated RCAs provided meaningful information even though experts have recently questioned the value of RCA investigations for improving patient safety.47-49 Experts point to reasons like the singular focus on finding the root cause, questionable quality of investigations, hindsight bias, poorly functioning feedback loops, and failure to aggregate learning across incidents.47 Rather than implementing design or structural changes, many RCAs suggest weak actions, such as additional training and policy reinforcement, which are unlikely to decrease event occurrence.48,49 We attempted to overcome these limitations by aggregating analysis across the entire VA health care system. Such aggregate analysis of similar types of patient safety issues is rarely done at an individual organization level but is useful to focus attention on common and broader themes (Table 5) invisible to local site investigators,50,51 who tend to focus on weaker interventions, such as policy reinforcement and training, rather than high-level system changes with larger consequences.

Although it appears that a large number of safety concerns were associated with people using HIT, these cannot be considered as faults of the individuals involved. Cognitive lapses often occur even when the EHR is used as designed and are symptoms of broader system problems with clinical and administrative workflows and EHR design. A poorly designed system increases cognitive demands on individuals and heightens opportunity for human error. This complex interplay between human cognition and the system is well recognized within the discipline of human factors, including the “application of what we know about people, their abilities, characteristics, and limitations to the design of equipment they use, environments in which they function, and jobs they perform.”52 Therefore, interventions to reduce diagnostic delays will need to draw on principles from human factors engineering to design the EHR and work system so that it provides clinicians with the cognitive support they need to do their jobs.

Several interventions could address this multifactorial problem. The first intervention is to redesign EHR inboxes and message workflows to better prioritize, display, and sort messages; track high-risk test results; and allow messages to be easily reassigned to support staff to reduce overload.53 Several recommendations for improvement exist, such as increasing message processing efficiency and decreasing clicks, redesigning the inbox interface, reducing cognitive load, and limiting duplicate or low-value messages.53 All clinicians should be competent in optimal test result management strategies that increase efficiency and decrease errors.54-56 Adequate inbox coverage should be ensured for clinicians who are out of the office or have recently left the organization.45 Efforts should be made to reduce the number of inbox notifications.57 New initiatives that rely on sending additional notifications to clinicians who are already overwhelmed should be avoided. Electronic communication could be streamlined to include only relevant information, and "FYI," low-value, and duplicate communication should be minimized.32
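A minimal sketch of two of these recommendations, deduplication and suppression of low-value messages, follows; the message fields and categories are hypothetical, not a vendor API.

```python
def filter_inbox(messages):
    """Drop duplicate and low-value ("FYI") notifications, then surface
    the highest-priority messages first (0 = most urgent)."""
    seen, kept = set(), []
    for msg in messages:
        key = (msg["patient_id"], msg["result_id"])
        if msg["category"] == "FYI" or key in seen:
            continue  # suppress low-value or duplicate notifications
        seen.add(key)
        kept.append(msg)
    return sorted(kept, key=lambda m: m["priority"])

inbox = [
    {"patient_id": 1, "result_id": "cbc", "category": "result", "priority": 0},
    {"patient_id": 1, "result_id": "cbc", "category": "result", "priority": 0},
    {"patient_id": 2, "result_id": "note", "category": "FYI", "priority": 2},
]
print(filter_inbox(inbox))  # only the first result notification survives
```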

The second intervention is to develop safety nets to identify missed test results. One example is Kaiser Permanente’s SureNet system to identify test results that still need action.58-60 Electronic trigger tools have been developed to selectively identify missed test results that have not received expected follow-up actions, and additional development and implementation could address diagnostic delays in high-risk conditions, such as cancer.26,61 These innovations are already being tested in the VA health care system. Encouraging patients to access test results directly through online portals may provide another safeguard.62
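The core logic of such trigger tools can be sketched simply. The example below is a hypothetical illustration under an assumed 30-day follow-up window (invented record layout, not the published algorithms): flag abnormal results that have no follow-up order within the window.

```python
from datetime import date, timedelta

FOLLOW_UP_WINDOW = timedelta(days=30)  # assumed window for illustration

abnormal_results = [
    {"patient_id": 1, "test": "chest imaging", "date": date(2018, 1, 5)},
    {"patient_id": 2, "test": "stool test", "date": date(2018, 2, 1)},
]
follow_up_orders = [
    {"patient_id": 1, "order": "pulmonology consult", "date": date(2018, 1, 20)},
]

def trigger_missed_follow_up(results, orders, today):
    """Flag abnormal results with no follow-up order inside the window."""
    flagged = []
    for r in results:
        acted_on = any(
            o["patient_id"] == r["patient_id"]
            and r["date"] <= o["date"] <= r["date"] + FOLLOW_UP_WINDOW
            for o in orders
        )
        if not acted_on and today > r["date"] + FOLLOW_UP_WINDOW:
            flagged.append(r)
    return flagged

# Patient 2's stool test has had no follow-up order for more than 30 days.
print(trigger_missed_follow_up(abnormal_results, follow_up_orders,
                               today=date(2018, 4, 1)))
```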

The third intervention is to improve display of diagnostic information. Research should focus on improving usability of interfaces that are difficult to use or those that obscure important patient information.8,63-67 Clinicians should be included in user interface design processes that strive to improve visibility of critical information and facilitate more efficient information review.68

The fourth intervention is to track referrals. Organizations should develop tracking systems for electronic specialty referrals to reduce breakdowns in the referral process and “close the loop” to referring clinicians.69

The fifth intervention is to optimize order entry design. Order sets could be redesigned to provide helpful, noninterruptive decision support and automatically pull in required information rather than relying on manual clinician entry for basic information.70 Adequate IT resources are needed to report and fix technical issues expeditiously.

The sixth intervention is to pursue interoperability. Lack of interoperability limits the availability of diagnostic information when patients transition care to a new clinician.71 Pursuing interoperability between different VA and non-VA community settings could improve access to important diagnostic information and reduce diagnostic delays. The recent Office of the National Coordinator for Health Information Technology’s Cures Act Final Rule is a step in the right direction.72

Limitations

This study has several limitations. All incidents involved the use of different configurations of the same EHR within a single, large, geographically distributed delivery system and might not be generalizable to other EHRs or health systems. However, other EHRs also have inbox-like notification mechanisms,73,74 electronic communication of test results,46 electronic referrals,75 and screen designs that are difficult to use.76,77 Evidence of order entry problems, diagnostic information that is difficult to find, information overload, and limited physician time to process EHR notifications has emerged from other health systems.67,78-80 Although our sample size was small, case descriptions were rich, spanned a period of 5 years, and involved multiple geographic locations across the United States. In addition, our sample was limited by voluntary reporting and may not be representative of all types of diagnostic delays. Although all events had high actual or potential harm ascribed to them by local safety personnel, assigning harm, particularly potential harm, is subjective. Most safety personnel are not specifically trained to evaluate EHR usability, and some usability issues may have gone undetected. Indeed, our findings of predominantly people and workflow problems may reflect the tendency of RCA teams to complete their analysis after identifying a human error rather than digging deeper into system and design problems.47-49 Finally, reports are biased because of voluntary reporting, with no controls or noncases for comparison, and do not reflect the true underlying epidemiology of these errors. Nevertheless, cases identified high-risk areas associated with diagnostic delays that can be further explored in epidemiologic studies.

Conclusions

In this qualitative content analysis, the Health IT Safety framework was used as a lens to identify several high-risk areas in outpatient diagnostic delays, many of which are applicable to other health systems using EHRs. These aggregated RCA data provide evidence that high-yield interventions could be aimed at improving test result management, interoperability, data visualization, and order entry, as well as reducing information overload and overreliance on electronic documentation for communicating critical information. The complexity of the association between HIT and diagnostic delays described herein underscores the need for collaboration between clinicians, health system leaders, safety professionals, and HIT designers in the testing and implementation of interventions to improve outpatient safety.

Article Information

Accepted for Publication: March 18, 2020.

Published: June 25, 2020. doi:10.1001/jamanetworkopen.2020.6752

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2020 Powell L et al. JAMA Network Open.

Corresponding Author: Hardeep Singh, MD, MPH, Center for Innovations in Quality, Effectiveness, and Safety (IQuESt) at the Michael E. DeBakey VA Medical Center and Baylor College of Medicine, 2002 Holcombe Blvd, Ste 152, Houston, TX 77030 (hardeeps@bcm.edu).

Author Contributions: Dr Powell had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: All authors.

Acquisition, analysis, or interpretation of data: Powell, Singh.

Drafting of the manuscript: Powell, Singh.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Powell.

Obtained funding: Singh.

Administrative, technical, or material support: Singh.

Supervision: Sittig, Chrouser, Singh.

Conflict of Interest Disclosures: Dr Singh reported receiving grants from the Department of Veterans Affairs and the Agency for Healthcare Research and Quality (AHRQ). No other disclosures were reported.

Funding/Support: This work was supported in part by the Center for Innovations in Quality, Effectiveness and Safety (CIN13-413) (Houston, Texas). In addition, Dr Singh is supported by the Veterans Affairs Health Services Research and Development Service (CRE17-127), a Presidential Early Career Award for Scientists and Engineers (USA 14-274), the AHRQ (R01HS27363), the Veterans Affairs (VA) National Center for Patient Safety, and the Gordon and Betty Moore Foundation.

Role of the Funder/Sponsor: The funders had no role in design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the US government.

References
1. Bates DW, Singh H. Two decades since To Err Is Human: an assessment of progress and emerging priorities in patient safety. Health Aff (Millwood). 2018;37(11):1736-1743. doi:10.1377/hlthaff.2018.0738
2. El-Kareh R, Hasan O, Schiff GD. Use of health information technology to reduce diagnostic errors. BMJ Qual Saf. 2013;22(suppl 2):ii40-ii51. doi:10.1136/bmjqs-2013-001884
3. Singh H, Naik AD, Rao R, Petersen LA. Reducing diagnostic errors through effective communication: harnessing the power of information technology. J Gen Intern Med. 2008;23(4):489-494. doi:10.1007/s11606-007-0393-z
4. Gawande A. Why doctors hate their computers. The New Yorker. Published November 5, 2018. Accessed March 12, 2020. https://www.newyorker.com/magazine/2018/11/12/why-doctors-hate-their-computers
5. Campione JR, Mardon RE, McDonald KM. Patient safety culture, health information technology implementation, and medical office problems that could lead to diagnostic error. J Patient Saf. 2019;15(4):267-273. doi:10.1097/PTS.0000000000000531
6. Singh H, Sittig DF. Measuring and improving patient safety through health information technology: the Health IT Safety framework. BMJ Qual Saf. 2016;25(4):226-232. doi:10.1136/bmjqs-2015-004486
7. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care. 2010;19(suppl 3):i68-i74. doi:10.1136/qshc.2010.042085
8. Meeks DW, Smith MW, Taylor L, Sittig DF, Scott JM, Singh H. An analysis of electronic health record–related patient safety concerns. J Am Med Inform Assoc. 2014;21(6):1053-1059. doi:10.1136/amiajnl-2013-002578
9. Menon S, Singh H, Giardina TD, et al. Safety huddles to proactively identify and address electronic health record safety. J Am Med Inform Assoc. 2017;24(2):261-267.
10. Graber ML, Byrne C, Johnston D. The impact of electronic health records on diagnosis. Diagnosis (Berl). 2017;4(4):211-223. doi:10.1515/dx-2017-0012
11. Schiff GD, Bates DW. Can electronic clinical documentation help prevent diagnostic errors? N Engl J Med. 2010;362(12):1066-1069. doi:10.1056/NEJMp0911734
12. Upadhyay DK, Sittig DF, Singh H. Ebola US Patient Zero: lessons on misdiagnosis and effective use of electronic health records. Diagnosis (Berl). 2014;1(4):283-287. doi:10.1515/dx-2014-0064
13. Liebovitz D. Next steps for electronic health records to improve the diagnostic process. Diagnosis (Berl). 2015;2(2):111-116. doi:10.1515/dx-2014-0070
14. Giardina TD, King BJ, Ignaczak AP, et al. Root cause analysis reports help identify common factors in delayed diagnosis and treatment of outpatients. Health Aff (Millwood). 2013;32(8):1368-1375. doi:10.1377/hlthaff.2013.0130
15. Singh H, Thomas EJ, Mani S, et al. Timely follow-up of abnormal diagnostic imaging test results in an outpatient setting: are electronic medical records achieving their potential? Arch Intern Med. 2009;169(17):1578-1586. doi:10.1001/archinternmed.2009.263
16. Singh H, Thomas EJ, Sittig DF, et al. Notification of abnormal lab test results in an electronic medical record: do any safety concerns remain? Am J Med. 2010;123(3):238-244. doi:10.1016/j.amjmed.2009.07.027
17. Wahls T, Haugen T, Cram P. The continuing problem of missed test results in an integrated health system with an advanced electronic medical record. Jt Comm J Qual Patient Saf. 2007;33(8):485-492. doi:10.1016/S1553-7250(07)33052-3
18. Callen JL, Westbrook JI, Georgiou A, Li J. Failure to follow-up test results for ambulatory patients: a systematic review. J Gen Intern Med. 2012;27(10):1334-1348. doi:10.1007/s11606-011-1949-5
19. Casalino LP, Dunham D, Chin MH, et al. Frequency of failure to inform patients of clinically significant outpatient test results. Arch Intern Med. 2009;169(12):1123-1129. doi:10.1001/archinternmed.2009.130
20. Wahls TL, Cram PM. The frequency of missed test results and associated treatment delays in a highly computerized health system. BMC Fam Pract. 2007;8:32. doi:10.1186/1471-2296-8-32
21. Schiff GD, Kim S, Krosnjar N, et al. Missed hypothyroidism diagnosis uncovered by linking laboratory and pharmacy data. Arch Intern Med. 2005;165(5):574-577. doi:10.1001/archinte.165.5.574
22. Cram P, Rosenthal GE, Ohsfeldt R, Wallace RB, Schlechte J, Schiff GD. Failure to recognize and act on abnormal test results: the case of screening bone densitometry. Jt Comm J Qual Patient Saf. 2005;31(2):90-97. doi:10.1016/S1553-7250(05)31013-0
23. Litchfield I, Bentham L, Lilford R, McManus RJ, Hill A, Greenfield S. Test result communication in primary care: a survey of current practice. BMJ Qual Saf. 2015;24(11):691-699. doi:10.1136/bmjqs-2014-003712
24. Singh H, Hirani K, Kadiyala H, et al. Characteristics and predictors of missed opportunities in lung cancer diagnosis: an electronic health record–based study. J Clin Oncol. 2010;28(20):3307-3315. doi:10.1200/JCO.2009.25.6636
25. Murphy DR, Meyer AN, Bhise V, et al. Computerized triggers of big data to detect delays in follow-up of chest imaging results. Chest. 2016;150(3):613-620. doi:10.1016/j.chest.2016.05.001
26. Murphy DR, Meyer AN, Vaghani V, et al. Application of electronic algorithms to improve diagnostic evaluation for bladder cancer. Appl Clin Inform. 2017;8(1):279-290. doi:10.4338/ACI-2016-10-RA-0176
27. Murphy DR, Meyer AN, Vaghani V, et al. Electronic triggers to identify delays in follow-up of mammography: harnessing the power of big data in health care. J Am Coll Radiol. 2018;15(2):287-295. doi:10.1016/j.jacr.2017.10.001
28. Murphy DR, Meyer AND, Vaghani V, et al. Development and validation of trigger algorithms to identify delays in diagnostic evaluation of gastroenterological cancer [published correction appears in Clin Gastroenterol Hepatol. 2019;17(6):1218]. Clin Gastroenterol Hepatol. 2018;16(1):90-98.
29. Singh H, Spitzmueller C, Petersen NJ, Sawhney MK, Sittig DF. Information overload and missed test results in electronic health record–based settings. JAMA Intern Med. 2013;173(8):702-704. doi:10.1001/2013.jamainternmed.61
30. Singh H, Spitzmueller C, Petersen NJ, et al. Primary care practitioners’ views on test result management in EHR-enabled health systems: a national survey. J Am Med Inform Assoc. 2013;20(4):727-735. doi:10.1136/amiajnl-2012-001267
31. Murphy DR, Meyer AN, Russo E, Sittig DF, Wei L, Singh H. The burden of inbox notifications in commercial electronic health records. JAMA Intern Med. 2016;176(4):559-560. doi:10.1001/jamainternmed.2016.0209
32. Murphy DR, Reis B, Kadiyala H, et al. Electronic health record–based messages to primary care providers: valuable information or just noise? Arch Intern Med. 2012;172(3):283-285. doi:10.1001/archinternmed.2011.740
33. Murphy DR, Reis B, Sittig DF, Singh H. Notifications received by primary care practitioners in electronic health records: a taxonomy and time analysis. Am J Med. 2012;125(2):209.e1-209.e7. doi:10.1016/j.amjmed.2011.07.029
34. Karsh BT, Weinger MB, Abbott PA, Wears RL. Health information technology: fallacies and sober realities. J Am Med Inform Assoc. 2010;17(6):617-623. doi:10.1136/jamia.2010.005637
35. US Department of Veterans Affairs. Veterans Health Administration. Accessed March 12, 2020. https://www.va.gov/health/
36. VA National Center for Patient Safety. Root cause analysis. Accessed May 18, 2020. https://www.patientsafety.va.gov/professionals/onthejob/rca.asp
37. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP; STROBE Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies. Int J Surg. 2014;12(12):1495-1499. doi:10.1016/j.ijsu.2014.07.013
38. Ash JS, Singh H, Wright A, Chase D, Sittig DF. Essential activities for electronic health record safety: a qualitative study. Health Informatics J. Published online March 8, 2019. doi:10.1177/1460458219833109
39. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107-115. doi:10.1111/j.1365-2648.2007.04569.x
40. Ibrahim S, Donelle L, Regan S, Sidani S. A qualitative content analysis of nurses’ comfort and employment of workarounds with electronic documentation systems in home care practice. Can J Nurs Res. 2020;52(1):31-44. doi:10.1177/0844562119855509
41. Pope C, Ziebland S, Mays N. Qualitative research in health care: analysing qualitative data. BMJ. 2000;320(7227):114-116. doi:10.1136/bmj.320.7227.114
42. Montini T, Noble AA, Stelfox HT. Content analysis of patient complaints. Int J Qual Health Care. 2008;20(6):412-420. doi:10.1093/intqhc/mzn041
43. Wright A, Ash J, Erickson J, et al. A qualitative study of the activities performed by people involved in clinical decision support: recommended practices for success. J Am Med Inform Assoc. 2014;21(3):464-472. doi:10.1136/amiajnl-2013-001771
44. Legler A, Price M, Parikh M, et al. Effect on VA patient satisfaction of provider’s use of an integrated viewer of multiple electronic health records. J Gen Intern Med. 2019;34(1):132-136. doi:10.1007/s11606-018-4708-z
45. Menon S, Smith MW, Sittig DF, et al. How context affects electronic health record–based test result follow-up: a mixed-methods evaluation. BMJ Open. 2014;4(11):e005985. doi:10.1136/bmjopen-2014-005985
46. Yackel TR, Embi PJ. Unintended errors with EHR-based result management: a case series. J Am Med Inform Assoc. 2010;17(1):104-107. doi:10.1197/jamia.M3294
47. Peerally MF, Carr S, Waring J, Dixon-Woods M. The problem with root cause analysis. BMJ Qual Saf. 2017;26(5):417-422.
48. Kellogg KM, Hettinger Z, Shah M, et al. Our current approach to root cause analysis: is it contributing to our failure to improve patient safety? BMJ Qual Saf. 2017;26(5):381-387.
49. Trbovich P, Shojania KG. Root-cause analysis: swatting at mosquitoes versus draining the swamp. BMJ Qual Saf. 2017;26(5):350-353. doi:10.1136/bmjqs-2016-006229
50. Neily J, Ogrinc G, Mills P, et al. Using aggregate root cause analysis to improve patient safety. Jt Comm J Qual Saf. 2003;29(8):434-439, 381. doi:10.1016/S1549-3741(03)29052-3
51. Corwin GS, Mills PD, Shanawani H, Hemphill RR. Root cause analysis of ICU adverse events in the Veterans Health Administration. Jt Comm J Qual Patient Saf. 2017;43(11):580-590. doi:10.1016/j.jcjq.2017.04.009
52. Human Factors and Ergonomics Society. Definitions of HF/E. Accessed March 12, 2020. https://www.hfes.org/resources/educational-and-professional-resources/new-item
53. Murphy DR, Giardina TD, Satterly T, Sittig DF, Singh H. An exploration of barriers, facilitators, and suggestions for improving electronic health record inbox-related usability: a qualitative analysis. JAMA Netw Open. 2019;2(10):e1912638. doi:10.1001/jamanetworkopen.2019.12638
54. Singh H, Wilson L, Reis B, Sawhney MK, Espadas D, Sittig DF. Ten strategies to improve management of abnormal test result alerts in the electronic health record. J Patient Saf. 2010;6(2):121-123. doi:10.1097/PTS.0b013e3181ddf652
55. Castellucci M. One-on-one EHR training improves physician satisfaction, saves time. Accessed March 12, 2020. https://www.modernhealthcare.com/article/20180915/TRANSFORMATION02/180919952/one-on-one-ehr-training-improves-physician-satisfaction-saves-time
56. Robinson KE, Kersey JA. Novel electronic health record (EHR) education intervention in large healthcare organization improves quality, efficiency, time, and impact on burnout. Medicine (Baltimore). 2018;97(38):e12319. doi:10.1097/MD.0000000000012319
57. Shah T, Patel-Teague S, Kroupa L, Meyer AND, Singh H. Impact of a national QI programme on reducing electronic health record notifications to clinicians. BMJ Qual Saf. 2019;28(1):10-14. doi:10.1136/bmjqs-2017-007447
58. Danforth KN, Smith AE, Loo RK, Jacobsen SJ, Mittman BS, Kanter MH. Electronic clinical surveillance to improve outpatient care: diverse applications within an integrated delivery system. EGEMS (Wash DC). 2014;2(1):1056. doi:10.13063/2327-9214.1056
59. Sim JJ, Rutkowski MP, Selevan DC, et al. Kaiser Permanente creatinine safety program: a mechanism to ensure widespread detection and care for chronic kidney disease. Am J Med. 2015;128(11):1204-1211.e1. doi:10.1016/j.amjmed.2015.05.037
60. Danforth KN, Hahn EE, Slezak JM, et al. Follow-up of abnormal estimated GFR results within a large integrated health care delivery system: a mixed-methods study. Am J Kidney Dis. 2019;74(5):589-600. doi:10.1053/j.ajkd.2019.05.003
61. Murphy DR, Wu L, Thomas EJ, Forjuoh SN, Meyer AN, Singh H. Electronic trigger-based intervention to reduce delays in diagnostic evaluation for cancer: a cluster randomized controlled trial. J Clin Oncol. 2015;33(31):3560-3567. doi:10.1200/JCO.2015.61.1301
62. Pillemer F, Price RA, Paone S, et al. Direct release of test results to patients increases patient engagement and utilization of care. PLoS One. 2016;11(6):e0154743. doi:10.1371/journal.pone.0154743
63. Horsky J, Kuperman GJ, Patel VL. Comprehensive analysis of a medication dosing error related to CPOE. J Am Med Inform Assoc. 2005;12(4):377-382. doi:10.1197/jamia.M1740
64. Sittig DF, Murphy DR, Smith MW, Russo E, Wright A, Singh H. Graphical display of diagnostic test results in electronic health records: a comparison of 8 systems. J Am Med Inform Assoc. 2015;22(4):900-904. doi:10.1093/jamia/ocv013
65. Middleton B, Bloomrosen M, Dente MA, et al; American Medical Informatics Association. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J Am Med Inform Assoc. 2013;20(e1):e2-e8. doi:10.1136/amiajnl-2012-001458
66. Mazur LM, Mosaly PR, Moore C, Marks L. Association of the usability of electronic health records with cognitive workload and performance levels among physicians. JAMA Netw Open. 2019;2(4):e191709. doi:10.1001/jamanetworkopen.2019.1709
67. Howe JL, Adams KT, Hettinger AZ, Ratwani RM. Electronic health record usability issues and potential contribution to patient harm. JAMA. 2018;319(12):1276-1278. doi:10.1001/jama.2018.1171
68. Ratwani RM, Fairbanks RJ, Hettinger AZ, Benda NC. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc. 2015;22(6):1179-1182. doi:10.1093/jamia/ocv050
69. Institute for Healthcare Improvement/National Patient Safety Foundation. Closing the loop: a guide to safer ambulatory referrals in the EHR era. Accessed May 18, 2020. http://www.ihi.org/resources/Pages/Publications/Closing-the-Loop-A-Guide-to-Safer-Ambulatory-Referrals.aspx
70. Rayo MF, Kowalczyk N, Liston BW, Sanders EB, White S, Patterson ES. Comparing the effectiveness of alerts and dynamically annotated visualizations (DAVs) in improving clinical decision making. Hum Factors. 2015;57(6):1002-1014. doi:10.1177/0018720815585666
71. Sittig DF, Belmont E, Singh H. Improving the safety of health information technology requires shared responsibility: it is time we all step up. Healthc (Amst). 2018;6(1):7-12. doi:10.1016/j.hjdsi.2017.06.004
72. ONC’s Cures Act Final Rule. 21st Century Cures Act: interoperability, information blocking, and the ONC Health IT Certification Program. Accessed March 12, 2020. https://www.healthit.gov/curesrule/
73. Kroth PJ, Morioka-Douglas N, Veres S, et al. The electronic elephant in the room: physicians and the electronic health record. JAMIA Open. 2018;1(1):49-56. doi:10.1093/jamiaopen/ooy016
74. Cutrona SL, Fouayzi H, Burns L, et al. Primary care providers’ opening of time-sensitive alerts sent to commercial electronic health record InBaskets. J Gen Intern Med. 2017;32(11):1210-1219. doi:10.1007/s11606-017-4146-3
75. Barnett ML, Mehrotra A, Frolkis JP, et al. Implementation Science Workshop: implementation of an electronic referral system in a large academic medical center. J Gen Intern Med. 2016;31(3):343-352. doi:10.1007/s11606-015-3516-y
76. Ratwani RM, Savage E, Will A, et al. A usability and safety analysis of electronic health records: a multi-center study. J Am Med Inform Assoc. 2018;25(9):1197-1201. doi:10.1093/jamia/ocy088
77. Rizvi RF, Marquard JL, Hultman GM, Adam TJ, Harder KA, Melton GB. Usability evaluation of electronic health record system around clinical notes usage: an ethnographic study. Appl Clin Inform. 2017;8(4):1095-1105. doi:10.4338/ACI-2017-04-RA-0067
78. Arndt BG, Beasley JW, Watkinson MD, et al. Tethered to the EHR: primary care physician workload assessment using EHR event log data and time-motion observations. Ann Fam Med. 2017;15(5):419-426. doi:10.1370/afm.2121
79. Tai-Seale M, Dillon EC, Yang Y, et al. Physicians’ well-being linked to in-basket messages generated by algorithms in electronic health records. Health Aff (Millwood). 2019;38(7):1073-1078. doi:10.1377/hlthaff.2018.05509
80. Roman LC, Ancker JS, Johnson SB, Senathirajah Y. Navigation in the electronic health record: a review of the safety and usability literature. J Biomed Inform. 2017;67:69-79. doi:10.1016/j.jbi.2017.01.005