Table. Distribution of Order Indication Quality by Month
Research Letter
Health Care Reform
June 13, 2011

Effect of Computerized Physician Order Entry on Radiologic Examination Order Indication Quality

Author Affiliations: Imaging Institute (Drs Schneider and Obuchowski, Mr Franz, and Ms Spitznagel), Otolaryngology–Head and Neck Surgery Information Technology Division (Dr Bascom), and Quantitative Health Sciences (Dr Obuchowski), The Cleveland Clinic, Cleveland, Ohio.

Arch Intern Med. 2011;171(11):1036-1038. doi:10.1001/archinternmed.2011.234

Order information is critical for performing the appropriate examination1 and for interpretation.1-3 One study of 58 paper radiograph requisitions found that 91% lacked appropriate indications: 10% were not in the medical records, 20% carried no indication, 34% had inadequate or incomplete information, and 27% contained information that differed from the medical records.4 Another study of 150 inpatient paper orders for chest radiographs found that 29% were missing current indications and 31% were missing appropriate indications.5 The large volume of diagnostic radiologic examinations performed annually (approximately 599 million in the United States in 2006), the potential cost of incomplete and repeated examinations, and the potential for decreased quality of care motivated an investigation of the improvements afforded by computerized physician order entry (CPOE).

Methods

Until spring 2008, our inpatient practice was paper-based, and clerks entered radiologic examination orders. After implementation of CPOE, indications were required in 2 free-text fields (signs and symptoms, and presumed diagnosis), with review and approval by the attending physician; a minimal sketch of such an order record follows.
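The sketch below illustrates the 2 required free-text indication fields plus attending approval; the field names are hypothetical, as the underlying CPOE data model is not described in this letter.

```python
# Minimal sketch of a CPOE radiology order record with 2 required
# free-text indication fields; field names are hypothetical, not the
# actual data model of the system described above.
from dataclasses import dataclass

@dataclass
class RadiologyOrder:
    examination: str
    signs_and_symptoms: str        # required free-text indication field
    presumed_diagnosis: str        # required free-text indication field
    approved_by_attending: bool = False

    def is_submittable(self) -> bool:
        """Both indication fields must be nonblank and the attending must approve."""
        return (bool(self.signs_and_symptoms.strip())
                and bool(self.presumed_diagnosis.strip())
                and self.approved_by_attending)
```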

Order indications were assessed 2 months prior to, 1 month immediately after, and 3 to 4 months after implementation of CPOE. We assigned medical abbreviations a low score because they are not unique and are prone to typographical error. Other classifications included the following: incomplete (blank fields, or terms such as routine or doctor's orders without additional information); inadequate (one field contained prior surgery status or line or intubation checks, and the other contained terms such as per protocol or routine); nonapplicable (indication not relevant to the anatomic site, eg, an abdominal radiograph with “change in mental status”); reasonable (sufficient information); or excellent (relevant or valid reason). A rule-based sketch of this rubric is given below.
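The rubric can be illustrated as a rule-based classifier; the term lists below are hypothetical examples of each category, not the study's actual scoring criteria, and the letter does not describe an automated scorer.

```python
# Illustrative rule-based version of the rubric; the term lists are
# hypothetical examples. Distinguishing nonapplicable, reasonable, and
# excellent indications requires clinical context, not string matching.
INCOMPLETE_TERMS = {"", "routine", "doctor's orders"}
INADEQUATE_TERMS = {"per protocol", "status-post", "line check", "intubation check"}

def classify_indication(signs_symptoms: str, presumed_diagnosis: str) -> str:
    """Classify a pair of free-text indication fields against the rubric."""
    fields = [signs_symptoms.strip().lower(), presumed_diagnosis.strip().lower()]
    if any(f in INCOMPLETE_TERMS for f in fields):
        return "incomplete"
    if any(f in INADEQUATE_TERMS for f in fields):
        return "inadequate"
    # Nonapplicable vs reasonable vs excellent would require knowing the
    # anatomic site and the clinical question; default shown here only.
    return "reasonable"
```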

Data were retrospectively collected from all inpatients, 18 years or older, with radiologic examination orders during January, May, or August 2008. This study was approved by The Cleveland Clinic institutional review board and granted a waiver of informed consent.

Results

The frequency of “reasonable” or “excellent” indications was calculated. Time before or after CPOE implementation, hospital unit, specialty of the ordering health care provider, and imaging modality were evaluated, along with 2-way interactions with order month. Logistic regression analysis assessed the outcome variable as a function of the CPOE intervention, with test times as the predictor variable; generalized estimating equations were used to account for multiple studies per patient. A doubling of the percentage of complete and meaningful indications was considered clinically significant; P ≤ .05 was considered statistically significant.
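A minimal sketch of this analysis using the generalized estimating equations implementation in Python's statsmodels; the column names (good_indication, month, patient_id) and input file are hypothetical.

```python
# Sketch of the analysis described above: logistic regression on a binary
# outcome with GEE to account for multiple studies per patient.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per order: outcome is 1 if the indication was "reasonable" or
# "excellent", month is the order month, patient_id is the clustering unit.
orders = pd.read_csv("orders.csv")  # hypothetical input file

model = smf.gee(
    "good_indication ~ C(month)",        # outcome as a function of order month
    groups="patient_id",                 # multiple studies per patient
    data=orders,
    family=sm.families.Binomial(),       # logistic model for the binary outcome
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())                  # month effects with cluster-robust SEs
```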

A total of 37 494 imaging orders from 6332 inpatients were investigated: 15 081 from January, 8734 from May, and 13 679 from August 2008 (Table). Not all units had CPOE implemented by May 1; hence, the number of orders is smaller in May. A reduction in hospital census likely contributed to the 8% decrease in the number of inpatient radiologic examination orders for August compared with January. All units had had CPOE in place for at least 3 full months as of August 1. Following CPOE implementation, the proportion of “reasonable” and “excellent” quality indications increased from 6.4% (965 of 15 081 orders) in January to 21.6% (2955 of 13 679 orders) in August (overall, P < .001), a statistically significant increase across all levels of patient care units, imaging modalities, and provider specialties (eTables 1-4). While statistically significant and in agreement with other studies,1,3,6 the overall level of 22% remained unacceptably low.

Comment

Computerized physician order entry can be implemented using free-text fields, discrete diagnoses, discrete signs and symptoms, and order decision support.6 We attribute this unsatisfactory result to the free-text fields. While obvious in hindsight, discrete fields based on a clinical terminology such as SNOMED CT (International Health Terminology Standards Development Organisation) should automatically eliminate the “incomplete” and “medical abbreviations” categories by providing more complete documentation (eg, a single transposition turns SOB [shortness of breath] into SBO [small-bowel obstruction]). Depending on the discrete selections available, “inadequate” (“status-post”) and “nonapplicable” reasons can remain as causes of poor-quality indications. Once discrete fields are available, order validation (appropriateness assessment) can be performed, as sketched below, and should result in only relevant and valid indications. Order decision support software6 using evidence-based guidelines, such as those provided by the American College of Radiology or the American College of Cardiology, along with physician-to-physician communication for nonstandard orders, may be one solution to achieve improved communication and patient care.
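As an illustration of such order validation, the sketch below checks a discrete, terminology-coded indication against the examination ordered; the examinations, codes, and mapping are hypothetical examples, not an actual SNOMED CT subset or appropriateness rule set.

```python
# Illustrative sketch of order validation over discrete, coded indication
# fields. The mapping below is a hypothetical example, not an actual
# SNOMED CT subset or evidence-based guideline rule set.
VALID_INDICATIONS = {
    "radiograph_abdomen": {"21522001", "62315008"},   # hypothetical accepted codes
    "radiograph_chest": {"267036007", "49727002"},
}

def validate_order(examination: str, indication_code: str) -> bool:
    """Reject indication codes that are not applicable to the anatomic site."""
    return indication_code in VALID_INDICATIONS.get(examination, set())
```

Under such a check, “change in mental status” coded against an abdominal radiograph would be flagged for physician-to-physician review rather than performed as ordered.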

Poor radiologic examination order indications are analogous to poor patient hand-offs; their consequences include performance of incorrect or incomplete examinations, overlooked patient-specific issues, generic examination interpretations that do not address the clinical question, safety and compliance issues, billing delays, and rejections.2,5,6 Computerized physician order entry significantly and clinically improved the quality and information content of indications. Despite this improvement, 78% of orders continued to have inadequate, incomplete, or nonapplicable indications. We believe the required text fields and the restriction of ordering privileges to licensed health care providers directly contributed to improvements in order quality. However, these free-text fields (an implementation choice) failed to communicate adequate information. Implementation of discrete signs and symptoms, tracking of health care provider ordering patterns, and educational programs targeted at improved documentation of patient indications should further improve order quality. In spite of these efforts, work-arounds are possible in any system; “gaming” of order validation and appropriateness assessment systems is possible. Given the pivotal role radiologic examinations play in health care decision making,1-3 improving the quality of order indications for radiologic examinations should improve the quality of patient care.

Correspondence: Dr Schneider, Imaging Institute, The Cleveland Clinic, 9500 Euclid Ave, Mailing Code HB-6, Cleveland, OH 44195 (schneie1@ccf.org).

Author Contributions: Study concept and design: Schneider and Obuchowski. Acquisition of data: Schneider, Franz, and Spitznagel. Analysis and interpretation of data: Schneider, Bascom, and Obuchowski. Drafting of the manuscript: Schneider and Obuchowski. Critical revision of the manuscript for important intellectual content: Franz, Spitznagel, Bascom, and Obuchowski. Statistical analysis: Obuchowski. Administrative, technical, and material support: Schneider, Franz, Spitznagel, and Bascom. Study supervision: Schneider.

Financial Disclosure: None reported.

References
1. Cohen MD. Accuracy of information on imaging requisitions: does it matter? J Am Coll Radiol. 2007;4(9):617-621.
2. Loy CT, Irwig L. Accuracy of diagnostic tests read with and without clinical information: a systematic review. JAMA. 2004;292(13):1602-1609.
3. Leslie A, Jones AJ, Goddard PR. The influence of clinical information on the reporting of CT by radiologists. Br J Radiol. 2000;73(874):1052-1055.
4. Cohen MD, Curtin S, Lee R. Evaluation of the quality of radiology requisitions for intensive care unit patients. Acad Radiol. 2006;13(2):236-240.
5. Gunderman RB, Phillips MD, Cohen MD. Improving clinical histories on radiology requisitions. Acad Radiol. 2001;8(4):299-303.
6. Sistrom CL, Dang PA, Weilburg JB, Dreyer KJ, Rosenthal DI, Thrall JH. Effect of computerized order entry with integrated decision support on the growth of outpatient procedure volumes: seven-year time series analysis. Radiology. 2009;251(1):147-155.