Figure. Screen brought up after clinician clicks the “P” alert on the electronic tracking board. Estimated likelihood of pneumonia as well as relevant clinical features are presented to the clinician, who may agree or disagree with the diagnosis.
Dean NC, Jones BE, Ferraro JP, Vines CG, Haug PJ. Performance and Utilization of an Emergency Department Electronic Screening Tool for Pneumonia. JAMA Intern Med. 2013;173(8):699-701. doi:10.1001/jamainternmed.2013.3299
Author Affiliations: Division of Pulmonary and Critical Care Medicine (Drs Dean and Jones) and Emergency Department (Dr Vines), Intermountain Medical Center, and Department of Internal Medicine, University of Utah (Drs Dean and Jones), Salt Lake City; and Homer Warner Center for Informatics Research, Salt Lake City, Utah (Mr Ferraro and Dr Haug).
Appropriate treatment of pneumonia begins with accurate diagnosis. However, clinicians have difficulty integrating data for clinical decision making.1 Significant variability in pneumonia management exists in the emergency department (ED).2 Decision support might decrease variability and improve care, but physician utilization is historically low.3 An alerting tool is therefore needed to prompt physicians to use computer-based pneumonia decision support.
We developed a real-time electronic screening tool that identifies patients with pneumonia by applying Bayesian probabilistic logic to the electronic medical record, and we implemented the tool in 4 EDs. A “P” appears on the ED electronic tracker board when pneumonia likelihood reaches 40%. Clicking on the icon displays pneumonia likelihood and the relevant data (Figure). After confirming the diagnosis, the ED physician proceeds with a linked decision support tool that provides management recommendations.
We measured tool sensitivity, specificity, and predictive values against physician panel case review. We also measured physician acknowledgment of the screening tool alert and utilization of the linked pneumonia management tool.
In 1995, Intermountain Healthcare began deploying an electronic medical record in Salt Lake County, Utah EDs, currently staffed by 80 attending physicians and 24 emergency medicine residents. A paper-based pneumonia guideline was implemented with moderate success in 1996.4 Electronic screening tool development began in 1998,5 but implementation was impractical until dictated radiology reports became electronically available a median of 20 minutes after study completion.
The pneumonia screening tool is based on a medical screening and diagnosis framework within Intermountain Healthcare's computerized environment. The tool monitors ED patient data, extracts relevant clinical features, and calculates the probability of pneumonia. The reference data set for tool development included 48 449 Intermountain ED patient encounters from 2008 through 2010, including 2413 patients with pneumonia. Data used include 6 vital sign variables, 6 laboratory values, 25 nursing assessment variables, patient age, the patient's chief complaint, and findings extracted from the chest imaging report using natural language processing. For example, complaints of fever or cough increase the likelihood of pneumonia. Because pneumonia is “a constellation of suggestive clinical features with a demonstrable infiltrate,”6(pS39) the tool runs when any ED patient's dictated chest imaging report becomes available.
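The letter does not publish the internals of the Bayesian model, but the described behavior (a prior probability updated by clinical features, alerting at 40%) can be sketched as a naive-Bayes update in which prior odds are multiplied by one likelihood ratio per observed feature. The feature names and likelihood ratios below are invented for illustration only.

```python
# Hypothetical naive-Bayes sketch of the pneumonia screen: prior odds are
# multiplied by one likelihood ratio (LR) per observed feature, then
# converted back to a probability. All LRs below are invented.
PRIOR = 0.082           # pneumonia incidence among imaged ED patients (from the letter)
ALERT_THRESHOLD = 0.40  # "P" icon appears when likelihood reaches 40%

FEATURE_LRS = {         # illustrative values, not the tool's actual parameters
    "fever": 2.0,
    "cough": 1.8,
    "infiltrate_on_imaging": 15.0,
    "normal_wbc": 0.6,
}

def pneumonia_probability(features):
    """Combine prior odds with the LRs of observed features (naive-Bayes style)."""
    odds = PRIOR / (1 - PRIOR)
    for f in features:
        odds *= FEATURE_LRS.get(f, 1.0)  # unseen features are uninformative
    return odds / (1 + odds)

def should_alert(features):
    return pneumonia_probability(features) >= ALERT_THRESHOLD
```

Under these invented numbers, fever and cough alone stay below the 40% threshold, while an NLP-extracted infiltrate finding pushes the probability well past it, which mirrors the letter's point that the tool runs only once the imaging report is available.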
The study population included all ED patients with chest imaging during two 2-month periods in 2011, immediately following implementation in May and then 6 months later. We thereby studied tool performance during early-summer and early-winter periods of pneumonia incidence and measured physician utilization over time. From each time period we randomly selected 300 patients, plus 60 additional patients for whom the tool had alerted for possible pneumonia. Three physician authors independently reviewed patient records to determine the presence of pneumonia. Initial disagreements were resolved by discussion and consensus. The Intermountain Healthcare institutional review board approved tool development, implementation, and data collection as quality improvement.
Random sample data allowed unbiased estimates and CIs using standard formulas. To correct for verification bias in the combined sample, sensitivity and specificity estimates were computed using the Begg and Greenes approach7 with 95% CIs by bias-corrected bootstrap. Tool utilization was analyzed by χ2 analysis.
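Because tool-positive patients were oversampled for verification, in-sample sensitivity and specificity are biased; the Begg and Greenes approach rebuilds them from the overall test-positive rate and the predictive values estimated in the verified subsample. A minimal sketch of that correction follows; the input values are invented for illustration, not the study's actual intermediate quantities.

```python
# Begg and Greenes verification-bias correction for a binary test:
# sensitivity and specificity are reconstructed from the overall
# test-positive rate and the predictive values in the verified subsample.

def begg_greenes(p_test_pos, ppv, npv):
    """Return (sensitivity, specificity) corrected for verification bias."""
    p_test_neg = 1 - p_test_pos
    se = (p_test_pos * ppv) / (p_test_pos * ppv + p_test_neg * (1 - npv))
    sp = (p_test_neg * npv) / (p_test_neg * npv + p_test_pos * (1 - ppv))
    return se, sp

# Hypothetical example: a tool that alerts on 6% of all patients, with
# PPV 0.51 and NPV 0.95 in the verified subsample. The corrected
# sensitivity falls far below the naive in-sample value because alerted
# patients were preferentially verified.
se, sp = begg_greenes(p_test_pos=0.06, ppv=0.51, npv=0.95)
```

This illustrates why a tool that captured 74% of verified cases can have a much lower corrected sensitivity, as reported below.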
Chest imaging was performed on 13 859 ED patients during the 2 study intervals. Pneumonia incidence was 8.2%. Among patients with pneumonia confirmed by physician panel review (n = 109), mean (SD) age was 56.2 (20.7) years; 56% were women; and hospital admission rate was 48%.
The screening tool alerted in 81 of the 109 cases of pneumonia confirmed by physician review (74%). The tool falsely alerted in 78 of 612 patients without pneumonia (13%). Sensitivity was 40.9% (95% CI, 32.0%-52.0%); specificity, 96.6% (95% CI, 95.9%-97.3%); positive predictive value, 50.9% (95% CI, 42.9%-58.9%); and negative predictive value, 95.0% (95% CI, 92.9%-96.7%). Imaging reports incorrectly read as positive by the tool caused 59% of false-positive cases. Most false-negative cases were caused by a lack of clinical pneumonia features other than the chest imaging findings.
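The predictive values can be checked directly from the counts reported above; sensitivity and specificity cannot, because those were additionally corrected for verification bias.

```python
# Rebuild the 2x2 table from the verified counts reported in the letter.
tp = 81          # tool alerted, pneumonia present
fn = 109 - 81    # pneumonia present, no alert
fp = 78          # tool alerted, no pneumonia
tn = 612 - 78    # no alert, no pneumonia

ppv = tp / (tp + fp)  # 81 / 159, about 50.9%
npv = tn / (tn + fn)  # 534 / 562, about 95.0%
```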
Among true-positive pneumonia cases, physicians' agreement with the tool alert increased nonsignificantly over the first 6 months, from 37% (15 of 41) to 53% (21 of 40) (P = .18). Among true pneumonia cases, utilization of the linked management tool increased from 12% (6 of 49) to 48% (29 of 60) (P < .001).
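The utilization comparison is a standard 2 × 2 χ² test, which can be reproduced from the reported counts with the shortcut formula for a fourfold table (this is a sketch of the kind of test reported, not the authors' code):

```python
# Pearson chi-square for a 2x2 table via the shortcut formula
# chi2 = n (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Tool utilization among true pneumonia cases: used vs did not use,
# early period (6 of 49) vs 6 months later (29 of 60).
chi2 = chi2_2x2(6, 49 - 6, 29, 60 - 29)
# chi2 is about 16.1, above 10.83 (the 1-df critical value for P = .001),
# consistent with the reported P < .001.
```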
We developed and implemented an electronic screening tool for pneumonia with moderate accuracy compared with physician review. We have not found a similar tool reported previously.6 Our tool's greatest challenge has been natural language processing translation of dictated, unstructured radiology reports into discrete conclusions. Variability in radiology reports makes interpretation difficult for both computers and clinicians. Changing to structured reporting would improve tool performance (eg, “new parenchymal opacity consistent with pneumonia in the appropriate clinical setting: yes/no/possibly”).
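The production NLP system is not described in the letter; to make the difficulty concrete, a toy keyword-plus-negation extractor over a free-text impression might look like the following. All patterns are invented, and the sentence-level negation scope is deliberately naive, which is exactly the kind of shortcut that misreads variable report language.

```python
import re

# Toy sketch of extracting a pneumonia-relevant imaging finding from a
# free-text radiology impression. Patterns are illustrative only.
FINDING = re.compile(r"(infiltrate|consolidation|opacity)", re.IGNORECASE)
NEGATION = re.compile(r"\b(no|without|negative for)\b", re.IGNORECASE)

def imaging_positive(report: str) -> bool:
    """True if any sentence mentions a finding and contains no negation cue.

    Naive: a negation anywhere in the sentence suppresses the finding,
    even if it actually modifies a different phrase.
    """
    for sentence in report.split("."):
        if FINDING.search(sentence) and not NEGATION.search(sentence):
            return True
    return False
```

A structured yes/no/possibly field, as suggested above, would remove the need for this kind of fragile parsing entirely.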
Choosing the alert threshold of 40% was another challenge. Screening tools usually maximize sensitivity over specificity, but we emphasized specificity to avoid “alert fatigue.”8 Because lower sensitivity resulted, we allowed clinicians to launch the management tool via a desktop icon in a parallel process.
Accuracy and utilization will likely improve with iterative improvements. The next performance test will be to determine if the tool helps physicians diagnose pneumonia and improve clinical outcomes.6
Correspondence: Dr Dean, Intermountain Medical Center, 5121 S Cottonwood St, Murray, UT 84107 (firstname.lastname@example.org).
Published Online: March 18, 2013. doi:10.1001/jamainternmed.2013.3299
Author Contributions: Study concept and design: Dean, Jones, and Haug. Acquisition of data: Dean, Jones, Ferraro, and Vines. Analysis and interpretation of data: Dean, Jones, and Ferraro. Drafting of the manuscript: Dean and Jones. Critical revision of the manuscript for important intellectual content: Dean, Ferraro, Vines, and Haug. Statistical analysis: Jones. Administrative, technical, and/or material support: Ferraro. Study supervision: Haug. Tool implementation: Dean.
Conflict of Interest Disclosures: None reported.
Funding/Support: This study was funded in part by Intermountain Healthcare and the Intermountain Research and Medical Foundation, Salt Lake City, Utah, and by Pfizer US through an unrestricted, investigator-initiated competitive grant.
Role of the Sponsors: The sponsors had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. None of the authors have relevant financial interests, activities, relationships, or affiliations.
Previous Presentation: This work was presented in abstract form at the European Respiratory Society International Conference; September 3, 2012; Vienna, Austria.
Additional Contributions: The authors thank Al Jephson, BS, for data extraction; Gregory J. Stoddard, MS, for statistical analysis; Dominik Aronsky, MD, for his original work on the electronic tool and many suggestions since; Todd L. Allen, MD, for helping design and implement the tool in the ED; Herman B. Post, BS, Kumar Mynam, MS, and Darin Wilcox, BS, for their design and programming of the tool; and the ED physicians providing care at the Salt Lake County Intermountain Hospitals for their interest in quality improvement and helpful suggestions during development of the electronic decision support tool.