Invited Commentary
Health Policy
September 18, 2019

Electronic Health Records—A System Only as Beneficial as Its Data

Author Affiliations
  • 1Department of Neurology, Icahn School of Medicine at Mount Sinai, New York, New York
  • 2Department of Population Health Science and Policy, Icahn School of Medicine at Mount Sinai, New York, New York
JAMA Netw Open. 2019;2(9):e1911679. doi:10.1001/jamanetworkopen.2019.11679

Health care innovations can influence patient care, and enhancements in health information technology have been promoted as a means to improve patient safety and reduce medical errors.1 Studies to improve safety and decrease medical errors have been identified as research priorities by the National Academy of Medicine since it published its report on building a safer health system in 1999.2 Electronic health records (EHRs) can help achieve this vision by improving the efficiency and quality of health care and by streamlining health care processes.3 Despite advances in health information technology, we are still limited by the data that are elicited at the bedside and recorded in the EHR by health care professionals.

In 1995, the Centers for Medicare & Medicaid Services (CMS) established policies tying reimbursement for evaluation and management services to documentation requirements. Since then, CMS rules and regulations have influenced the documentation structure of EHRs. To encourage the implementation of EHRs and align financial incentives, the 2009 Health Information Technology for Economic and Clinical Health Act launched the Medicare and Medicaid Meaningful Use reward and incentive program. From 2008 to 2012, adoption of EHRs among general acute care hospitals increased rapidly from 9% to 44%.4 In principle, the benefits of health information technology seem obvious, eg, improved quality of care, increased patient safety, fewer medical errors, and stronger interaction between physician and patient. However, even with the wide adoption of EHRs, results have been mixed.3 It is not uncommon to see studies showing that EHRs may lead to unintended consequences, resulting in new safety risks and medical errors.5

Since its inception, CMS has tried to find ways to reduce the documentation burden associated with evaluation and management services, stating that requirements were often outdated with respect to the practice of medicine and that coding nuances were too complex and ambiguous. Very few peer-reviewed studies have evaluated the veracity of physician documentation within EHRs.6 The study by Berdahl et al7 reports on the concordance of EHR documentation with emergency physicians’ observed behaviors. In November 2018, CMS released the 2019 Medicare Physician Fee Schedule Final Rule,8 which comprises new documentation rules and regulations and details a new payment methodology for evaluation and management services that will take effect on January 1, 2021.

There are 7 elements within emergency department evaluation and management service standards, of which the first 3 are deemed key factors: (1) history, (2) examination, (3) medical decision-making, (4) counseling, (5) coordination of care, (6) nature of presenting concern, and (7) time. Certain sections of emergency physician documentation, such as the review of systems (ROS) and the physical examination (PE), may be more susceptible to errors owing to the widespread use of autopopulated text. The study by Berdahl et al7 evaluated how well EHR documentation represented the ROS and PE performed by a small group of emergency medicine residents. As part of this study, 12 observers (10 undergraduate students and 2 attending emergency physicians) watched 10 real-time patient-physician encounters per resident (9 final-year emergency medicine residents) per site (2 sites) and quantified the percentage of ROS and PE documentation that they could confirm on subsequent EHR review. Observers confirmed approximately 40% of ROS and 56% of PE documentation, demonstrating the need for improvement in physician documentation. Unsubstantiated documentation was more common for elements that seemed to be less clinically relevant.
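
To make the arithmetic behind these percentages concrete, the sketch below tallies, for a single hypothetical encounter, how many EHR-documented PE elements an observer confirmed at the bedside. This is an illustrative sketch only, not the authors' analysis code; the element names and observation data are invented for illustration.

```python
# Minimal illustrative sketch of a documentation-concordance tally.
# Assumption: element names and observation data are hypothetical,
# not taken from Berdahl et al.

def concordance_rate(documented: set[str], confirmed: set[str]) -> float:
    """Fraction of EHR-documented elements that an observer confirmed."""
    if not documented:
        return float("nan")
    return len(documented & confirmed) / len(documented)

# One hypothetical encounter: PE elements charted in the EHR vs elements
# the observer recorded as actually performed at the bedside.
ehr_pe = {"cardiac auscultation", "lung auscultation",
          "abdominal palpation", "neurologic screening"}
observed_pe = {"cardiac auscultation", "lung auscultation"}

print(f"PE concordance: {concordance_rate(ehr_pe, observed_pe):.0%}")  # -> 50%
```

Averaged over many encounters, a tally of this form is conceptually what underlies summary confirmation rates such as the approximately 40% (ROS) and 56% (PE) figures above.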

There are limitations and strengths to this study. One of the most significant limitations is that the observed physicians were residents rather than practicing attending physicians. Residents may not be aware of compliance requirements, and this may highlight the need for earlier compliance education in residency training. Another concern is that most observers were undergraduate students who did not have any medical training. Although they were trained by the study team, it is possible that they could have missed some of the information or not realized that a particular PE maneuver was the same as another and, as a result, noted it as missing from the examination (eg, a variation in testing for Babinski sign). Despite these limitations, the authors conducted 1 of the few investigations of EHR information accuracy using concurrent observation. It is a well-thought-out observational study that overcomes the challenges of data collection, physician resistance to auditing, and the desire to preserve an image of physician infallibility.

It is of vital importance for clinical health services and legal purposes that clinicians document medical record information consistent with the level of care given. Although these findings raise the possibility that some documentation did not reflect physician actions, further studies are necessary to determine how widespread this occurrence is across specialties and whether it also applies to attending physicians. There is also a need to investigate differing health systems. An improved understanding of the root causes of discrepancies between patient report and physician documentation will help identify ways to prevent them in the future. Health system accountability is becoming increasingly important to health policy makers and is ultimately necessary to ensure that patients receive the best possible care. Further studies will be vital to better understand physician reporting behaviors and to identify optimal approaches to improve them.

Article Information

Published: September 18, 2019. doi:10.1001/jamanetworkopen.2019.11679

Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2019 Jetté N et al. JAMA Network Open.

Corresponding Author: Nathalie Jetté, MD, MSc, FRCPC, Department of Neurology, Icahn School of Medicine at Mount Sinai, One Gustave L. Levy Place, New York, NY 10029 (nathalie.jette@mssm.edu).

Conflict of Interest Disclosures: Dr Jetté reported receiving grant funding paid to her institution from the National Institute of Neurological Disorders and Stroke and Alberta Health outside the submitted work and receiving grant funding from the Patient-Centered Outcomes Research Institute for research on epilepsy e-learning health systems, which, although outside this work, is aimed at using electronic health records for quality improvement and comparative effectiveness research. No other disclosures reported.

Additional Information: Drs Jetté and Kwon contributed equally to this work.

References
1. Bates DW, Gawande AA. Improving safety with information technology. N Engl J Med. 2003;348(25):2526-2534. doi:10.1056/NEJMsa020847
2. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 1999.
3. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144(10):742-752. doi:10.7326/0003-4819-144-10-200605160-00125
4. DesRoches CM, Charles D, Furukawa MF, et al. Adoption of electronic health records grows rapidly, but fewer than half of US hospitals had at least a basic system in 2012. Health Aff (Millwood). 2013;32(8):1478-1485. doi:10.1377/hlthaff.2013.0308
5. Sittig DF, Singh H. Defining health information technology-related errors: new developments since To Err Is Human. Arch Intern Med. 2011;171(14):1281-1284. doi:10.1001/archinternmed.2011.327
6. Kuhn T, Basch P, Barr M, Yackel T; Medical Informatics Committee of the American College of Physicians. Clinical documentation in the 21st century: executive summary of a policy position paper from the American College of Physicians. Ann Intern Med. 2015;162(4):301-303. doi:10.7326/M14-2128
7. Berdahl CT, Moran GJ, McBride O, Santini AM, Verzhbinsky IA, Schriger DL. Concordance between electronic clinical documentation and physicians’ observed behavior. JAMA Netw Open. 2019;2(9):e1911390. doi:10.1001/jamanetworkopen.2019.11390
8. Centers for Medicare & Medicaid Services. Medicare program; revisions to payment policies under the Physician Fee Schedule and other revisions to part B for CY 2019. https://www.govinfo.gov/content/pkg/FR-2018-11-23/pdf/2018-24170.pdf. Accessed August 19, 2019.
    1 Comment for this article
    Quality of data in electronic medical records
    Peter Goldschmidt, MD, DrPH, DMS | President, WORLD DEVELOPMENT GROUP, Inc
    The adoption of electronic health records (EHR) was accompanied by the expectation that they would somehow improve the quality of care, for example, through 1) better documentation, 2) use of EHR-based clinical decision support systems (CDSS) that operationalize clinical practice guidelines (CPG), and 3) the secondary use of real-world data for all manner of purposes, including machine learning. This expectation rests on the fundamental assumptions that 1) documentation captures what was done in practice and 2) practice reflects what should have been done and documented. In their article, Berdahl et al describe inconsistencies between the documentation of findings in the EHR and observation reports of physician actions [1]. The quality of EHR data is further degraded when 1) clinicians fail to elicit necessary data or make errors in observations or 2) patients provide false reports that are accepted and documented. As the authors note, an EHR is only as beneficial as its data. Only what clinicians document can be known to others. Moreover, data quality problems are compounded when data are transported from their source devoid of context. Despite its obvious importance, 1) documenting patient care is a tedious and sometimes burdensome task, 2) ensuring the quality of medical records is a thankless undertaking, and 3) researching the subject is unglamorous. There are few studies of the quality of medical record documentation and fewer still on how to improve it. Some 30 years ago, QMS - the first commercially available computerized system for assessing the quality of medical care - routinely monitored the completeness of medical record documentation. Benchmarking showed 1) large variation in documentation completeness, 2) that some hospitals were routinely achieving good documentation, and 3) that documentation improved through feedback [2]. While EHR systems can enable documentation, they do not guarantee good-quality data. In the digital age, CDSS integrated seamlessly into workflows and tightly coupled with patient registries can create learning loops that 1) can materially improve CDSS utility by evaluating current performance and suggesting future improvements, 2) can help to resolve many of the current problems with CPG, and 3) can lead to improved EHR systems and hence promote more appropriate documentation. Uncertainty about data quality is one of the greatest obstacles to the use of artificial intelligence in medicine. The time has come to fully utilize the capabilities afforded by the digitization of medicine to drive improvement in clinical care and patient outcomes; that requires 1) using technology to enable clinicians to document care without undue burden and 2) assuring the quality of data in medical records.

    References
    1. Berdahl CT, Moran GJ, McBride O, et al. Concordance between electronic clinical documentation and physicians' observed behavior. JAMA Network Open 2019; 2(9): e1911390.
    2. Wilson LL, Goldschmidt PG. Quality management in health care.
    CONFLICT OF INTEREST: None Reported