Table 1.  
Clinical Decision Support (CDS) Appropriateness Ratings of Advanced Diagnostic Imaging Procedures
Table 2.  
Order Cancellations and Changes Following Feedback of an Inappropriate Rating
Research Letter
June 2, 2015

Appropriateness of Advanced Diagnostic Imaging Ordering Before and After Implementation of Clinical Decision Support Systems

Author Affiliations
  • 1RAND, Boston, Massachusetts
  • 2RAND, Arlington, Virginia
  • 3RAND-UCLA, Los Angeles, California
  • 4Centers for Medicare & Medicaid Services, Baltimore, Maryland
JAMA. 2015;313(21):2181-2182. doi:10.1001/jama.2015.5089

Computerized clinical decision support (CDS) systems that match patient characteristics against appropriateness criteria to produce algorithmic treatment recommendations are a potential means of improving care.1,2 The Protecting Access to Medicare Act of 2014 mandates use of CDS systems for the ordering of advanced diagnostic imaging in the Medicare program starting in 2017.3

In a descriptive observational study, we used data from the Medicare Imaging Demonstration to evaluate the relationship of CDS system use with the proportion of imaging orders matched to appropriateness criteria, the appropriateness of ordered images, and the proportion of orders changed following feedback.

Methods

Between October 2011 and November 2013, clinicians used computerized radiology order entry systems and CDS systems for selected magnetic resonance imaging, computed tomography, and nuclear medicine procedures.4 During a 6-month baseline period, the CDS systems tracked whether orders were linked with appropriateness criteria but did not provide clinicians with feedback on appropriateness of orders.

During the 18-month intervention period, the CDS systems provided feedback indicating whether the order was linked to appropriateness criteria and, if so, the appropriateness rating, any recommendations for alternative orders, and a link to documentation supporting each rating. National medical specialty societies developed the appropriateness criteria using expert panels that reviewed evidence and completed a structured rating process.5

Participating organizations located in 8 states included 3 academic medical centers, 2 integrated delivery systems, 1 group of independent practices in a single geographic area, and 1 group of independent practices recruited by a radiology benefits management organization. Although the CDS systems differed across participating organizations, they were programmed with the same appropriateness criteria. All clinicians affiliated with enrolled practices who ordered advanced imaging for Medicare beneficiaries participated, including primary care and specialist physicians and nonphysicians.

We compared the proportion of orders matched to appropriateness criteria and the appropriateness of rated orders between the baseline and intervention periods to assess the relationship between CDS system feedback and clinician ordering behavior.

The study was approved, with need for informed consent waived, by RAND’s institutional review board.

Results

The 3340 participating clinicians placed 117 348 orders for advanced diagnostic imaging procedures. The CDS systems did not identify relevant appropriateness criteria for 63.3% of orders during the baseline period and for 66.5% during the intervention period (Table 1).

During the baseline period, 11.1% of final rated orders were inappropriate vs 6.4% during the intervention period. During the baseline period, 73.7% of final rated orders were appropriate vs 81.0% during the intervention period.

Of orders initially rated as inappropriate, 4.8% were changed and 1.9% were canceled (Table 2). Of inappropriate initial orders for which the CDS systems suggested an alternative, 9.9% were changed and 0.4% were canceled. Of inappropriate initial orders for which no alternative was suggested, 1.4% were changed and 2.8% were canceled.

Discussion

Most orders could not be matched by the CDS systems to appropriateness criteria. Among matched orders, the percentage rated appropriate increased modestly between the baseline and intervention periods, although few inappropriate orders were changed or canceled immediately following feedback from the CDS systems.

Therefore, improvements in appropriate imaging ordering do not appear to be related to immediate feedback and may instead reflect physician learning or secular changes.

A strength of this study is the scale and diversity of settings included. Previous studies of CDS systems for advanced diagnostic imaging ordering were mostly conducted within single organizations or settings and reported mixed results, with some showing an association with increased appropriateness and some reporting low uptake by clinicians.1

Limitations included reliance on clinical data entered by ordering clinicians, who may have imprecisely specified patient characteristics, and variation in how the CDS systems incorporated characteristics into their algorithms. The pre-post design could lead to selection bias or reflect concomitant initiatives or secular trends.

Participants may not be representative of all clinicians and settings using CDS systems. The descriptive presentation of results does not indicate statistical significance, and the heterogeneity of participating organizations would require a more detailed statistical analysis to address.

Implementing CDS systems in real-world settings has many challenges that must be addressed to meaningfully affect patient care.

Article Information
Section Editor: Jody W. Zylke, MD, Deputy Editor.

Corresponding Author: Peter S. Hussey, PhD, RAND, 20 Park Plaza, Boston, MA 02116 (hussey@rand.org).

Author Contributions: Dr Hussey had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Hussey, Timbie, Nyweide, Kahn.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Hussey, Timbie, Burgette.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Hussey, Timbie, Burgette.

Obtained funding: Hussey, Kahn.

Administrative, technical, or material support: Hussey, Nyweide, Kahn.

Study supervision: Hussey, Nyweide, Kahn.

Conflict of Interest Disclosures: The authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

Funding/Support: Funding for this project was provided by the Centers for Medicare & Medicaid Services under contract HHSM-500-2005-00028I/T0003.

Role of the Funder/Sponsor: Dr Nyweide (an employee of the Centers for Medicare & Medicaid Services) was involved in design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, and approval of the manuscript; and decision to submit the manuscript for publication.

Disclaimer: The findings and views expressed are solely those of the authors and are not meant to reflect the views or policies of the US government.

Additional Contributions: We thank Ian Brantley, MPH (Remedy Partners), for his assistance in the analysis of clinical decision support data while a paid employee of RAND.

References
1.
Garg  AX, Adhikari  NK, McDonald  H,  et al.  Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223-1238.
2.
Blumenthal  D, Tavenner  M.  The “meaningful use” regulation for electronic health records. N Engl J Med. 2010;363(6):501-504.
3.
Protecting Access to Medicare Act of 2014, HR 4302, 113th Congress, 2nd Session, §218 (2014).
4.
Centers for Medicare & Medicaid Services. Medicare Imaging Demonstration. http://innovation.cms.gov/initiatives/Medicare-Imaging/. Accessed February 25, 2015.
5.
Lewin Group Inc. Medicare Appropriateness of Use Imaging Demonstration: Implementation Report. Falls Church, VA: Lewin Group Inc; 2014.