Table 1. Characteristics of Clinicians With Quality Data on Physician Compare Website for Fiscal Year 2016
Table 2. Characteristics of the Quality Data on Physician Compare Website
Research Letter
Health Care Reform
May 6, 2019

Assessing the Quality of Public Reporting of US Physician Performance

Author Affiliations
  • 1School of Public Health, University of Michigan, Ann Arbor
  • 2Department of Medicine, University of Chicago, Chicago, Illinois
  • 3Division of General Medicine, Department of Internal Medicine, University of Michigan, Ann Arbor
JAMA Intern Med. 2019;179(8):1133-1135. doi:10.1001/jamainternmed.2019.0398

To empower patients and their caregivers, federal policymakers have recommended greater transparency about the quality of care delivered by the health care sector.1 Physician Compare is the Centers for Medicare & Medicaid Services’ (CMS’s) flagship website on the quality of care provided by US physicians and other clinicians. The website is in its final phase of expansion, the focus of which has been the addition of clinician-level performance data to existing practice-level data to further help patients and their caregivers choose high-quality clinicians.2 However, it is unclear whether Physician Compare is comprehensive: reporting is voluntary (although failure to submit data incurs a 2% reimbursement penalty3); clinicians choose the measures4; and CMS displays only a subset of measures.2 Therefore, using current data from Physician Compare, we addressed 3 questions: How many and what types of clinicians have quality information? How comprehensive are the quality data? How well do the measures differentiate the performance of included clinicians?

Methods

We included 1 025 015 US clinicians caring for Medicare beneficiaries. We created this sample from the Physician Compare National Downloadable File (https://data.medicare.gov/data/physician-compare) (1 023 552 individuals), supplemented with the 2015 Medicare Data on Provider Practice and Specialty database to add individuals missing from the National Downloadable File (1463 individuals). Because clinicians report as individuals and/or as part of a group practice, we obtained quality information from Physician Compare’s 2016 individual and group files. We estimated the prevalence of clinicians with quality information available across specialties, the number of quality domains reported, and the distribution of quality performance. The University of Michigan Institutional Review Board approved the study and, because it involved secondary analysis of existing data, waived the requirement for informed consent under the Health Insurance Portability and Accountability Act. All analyses were conducted using Stata statistical software, version 15 SE (StataCorp).
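As a rough illustration of this linkage, the sketch below combines the two clinician rosters and flags who has public quality information. It is written in Python with pandas rather than the Stata used for the actual analysis, and the file paths and column names (eg, npi, group_pac_id) are illustrative assumptions, not the exact layout of the CMS files.

```python
# Minimal sketch, assuming hypothetical file names and columns ("npi", "group_pac_id").
import pandas as pd

# Physician Compare National Downloadable File: one row per clinician-practice pair.
ndf = pd.read_csv("physician_compare_national_downloadable_file.csv", dtype=str)

# 2015 Medicare Data on Provider Practice and Specialty: adds clinicians who bill
# Medicare but are missing from the National Downloadable File.
mdppas = pd.read_csv("mdppas_2015.csv", dtype=str)

# Build the clinician roster (unique National Provider Identifiers).
clinicians = pd.concat([ndf[["npi"]], mdppas[["npi"]]]).drop_duplicates("npi")

# 2016 Physician Compare quality files: individual- and group-level measure scores.
ind = pd.read_csv("physician_compare_2016_individual_measures.csv", dtype=str)
grp = pd.read_csv("physician_compare_2016_group_measures.csv", dtype=str)

# Flag clinicians with individual quality data.
clinicians["has_individual_data"] = clinicians["npi"].isin(ind["npi"])

# Flag clinicians affiliated with a group practice that has quality data,
# linking through the practice identifier carried in the National Downloadable File.
ndf_groups = ndf[["npi", "group_pac_id"]].dropna()
in_reporting_group = ndf_groups.loc[
    ndf_groups["group_pac_id"].isin(grp["group_pac_id"]), "npi"
]
clinicians["has_group_data"] = clinicians["npi"].isin(in_reporting_group)

# Share of clinicians with any public quality information (individual or group).
share_any = (clinicians["has_individual_data"] | clinicians["has_group_data"]).mean()
print(f"Clinicians with any quality information: {share_any:.1%}")
```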

Results

Although 238 936 of 1 025 015 clinicians (23.3%) had quality information on Physician Compare, only 2563 (0.3%) had individual quality information (Table 1). Among clinicians with quality data, individual reporters had a median performance score of 98.0% (interquartile range [IQR], 94.5%-100.0%), and group reporters had a median performance score of 68.1% (IQR, 64.5%-72.2%) (Table 1). Scores were based on performance in few domains (individual reporters: median, 2 of 6 possible domains; group reporters: median, 3 of 7 possible domains). Within each domain, clinicians reported on few of the available measures (Table 2). For example, 72.2% of individual reporters had patient safety data, but the median individual reporter had information on only 1 of the 15 available patient safety measures (6.7%). Results were generally similar for group reporting.
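For concreteness, the sketch below shows how such summary statistics would be computed: per-clinician performance summarized as a median and interquartile range, and measure coverage within a domain (reporting 1 of 15 patient safety measures corresponds to 1/15 ≈ 6.7%). Column names ("npi", "score", "measure_id", "domain") are again assumptions for illustration, not the actual Physician Compare file layout.

```python
# Hedged sketch of the summary statistics, with assumed column names.
import pandas as pd

ind = pd.read_csv("physician_compare_2016_individual_measures.csv")

# Per-clinician mean performance, then the median and IQR across clinicians.
per_clinician = ind.groupby("npi")["score"].mean()
q1, median, q3 = per_clinician.quantile([0.25, 0.5, 0.75])
print(f"Median performance {median:.1f}% (IQR, {q1:.1f}%-{q3:.1f}%)")

# Measure coverage within one domain, eg, patient safety (15 available measures).
n_available = 15  # number of available patient safety measures (Table 2)
safety = ind[ind["domain"] == "Patient Safety"]
coverage = safety.groupby("npi")["measure_id"].nunique() / n_available
print(f"Median share of patient safety measures reported: {coverage.median():.1%}")
```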

Discussion

In this study, 76.7% of clinicians had no performance data on Physician Compare, 99.7% had no clinician-level performance data, and among clinicians with data, performance reflected only a few measures and the quality performance was generally high. As currently configured, Physician Compare fell short of its goal of providing information that is widely useful to patients and their caregivers for choosing clinicians.

Several factors may explain our findings. First, only 50.1% of eligible clinicians submitted quality data (at the individual or practice level) that might have been used in Physician Compare.4 This may reflect weak incentives for clinicians to submit quality data3; thus, CMS could consider providing larger incentives for reporting. Second, while 27.8% of clinicians submitted individual-level data to CMS, only 0.3% had individual data on the Physician Compare website.4 This difference may reflect CMS’s selection of measures for public reporting, which was based on criteria such as measure reliability and utility to patients and caregivers.5 CMS does not release specific information about why many individual reporters are excluded from the website; such information would help policymakers assess the feasibility of clinician-level public reporting. Third, clinicians are not required to report on all of their patients and can choose which measures to submit to CMS.6 These factors may have contributed to the high performance rates observed on Physician Compare for individual clinicians.

Physician Compare’s weaknesses suggest that it will be important for policymakers to consider if and how major revisions to the website would help achieve the Department of Health and Human Services’ goals of increased transparency, or whether a different approach is needed altogether.

Article Information

Accepted for Publication: February 2, 2019.

Corresponding Author: Jun Li, MSPH, School of Public Health, University of Michigan, 1415 Washington Heights, Ann Arbor, MI 48109 (lijununi@umich.edu).

Published Online: May 6, 2019. doi:10.1001/jamainternmed.2019.0398

Author Contributions: Ms Li and Dr Chen had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: All authors.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Li, Chen.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Li, Chen.

Obtained funding: Chen.

Administrative, technical, or material support: Chen.

Supervision: Das, Chen.

Conflict of Interest Disclosures: Dr Chen reported receiving grants from the Agency for Healthcare Research and Quality, the National Institute on Aging, the University of Michigan Institute for Healthcare Policy, and the American Heart Association during the conduct of the study; receiving personal fees as a contractor for the US Department of Health and Human Services and honoraria from the National Institutes of Health and the Robert Wood Johnson Foundation outside the submitted work; and taking part in an Intergovernmental Personnel Act assignment at the US Department of Health and Human Services with effort support from the University of Michigan Institute for Healthcare Policy. No other disclosures were reported.

Funding/Support: This project was supported by grant R01HS024698 from the Agency for Healthcare Research and Quality (Dr Chen).

Role of Funder/Sponsor: The funding organization had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Additional Contributions: We thank Bailey Reale, MPH (University of Michigan Division of General Medicine), for providing invaluable administrative support; and Ju-Chen Hu, MHSA (Emory University Rollins School of Public Health), and Melinda Song, BA (University of Michigan School of Medicine), for their diligent help with background research for this manuscript. They were paid for their effort.

References
1.
Azar  AM  II. Value-based transformation of America’s healthcare system [published online March 8, 2018]. https://www.hhs.gov/about/leadership/secretary/speeches/2018-speeches/value-based-transformation-of-americas-healthcare-system.html. Accessed September 17, 2018.
2.
Burwell  SM. Centers for Medicare & Medicaid Services. Physician Compare report to Congress. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/physician-compare-initiative/downloads/physician-compare-report-to-congress.pdf. Published 2014. Accessed December 4, 2018.
3.
Centers for Medicare & Medicaid Services. Payment adjustment information. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/pqrs/payment-adjustment-information.html. Published 2018. Accessed December 20, 2018.
4.
Centers for Medicare & Medicaid Services. 2016 Physician Quality Reporting System reporting experience and trends. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/2016_PQRS_Experience_Report.docx. Published 2018. Accessed December 10, 2018.
5.
Centers for Medicare & Medicaid Services. Quality data and Physician Compare. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/physician-compare-initiative/quality-data-and-physician-compare-.html. Published 2016. Accessed December 20, 2018.
6.
Centers for Medicare & Medicaid Services. 2016 Physician Quality Reporting System (PQRS) payment adjustment fact sheet. https://www.cms.gov/newsroom/fact-sheets/2016-physician-quality-reporting-system-pqrs-payment-adjustment-fact-sheet. Published 2015. Accessed December 20, 2018.