Li J, Das A, Chen LM. Assessing the Quality of Public Reporting of US Physician Performance. JAMA Intern Med. 2019;179(8):1133–1135. doi:10.1001/jamainternmed.2019.0398
To empower patients and their caregivers, federal policymakers have recommended greater transparency about the quality of care delivered by the health care sector.1 Physician Compare is the Centers for Medicare & Medicaid Services’ (CMS’s) flagship website about the quality of care provided by US physicians and other clinicians. The website is in its final phase of expansion, the focus of which has been the addition of clinician-level performance data to existing practice-level data, to further help patients and their caregivers choose high-quality clinicians.2 However, it is unclear whether Physician Compare is comprehensive: reporting is voluntary (although failure to submit data incurs a 2% reimbursement penalty3); clinicians choose the measures4; and CMS displays only a subset of measures.2 Therefore, using current data from Physician Compare, we addressed 3 questions: How many and what types of clinicians have quality information? How comprehensive are the quality data? How well do the measures differentiate the performance of included clinicians?
We included 1 025 015 US clinicians caring for Medicare beneficiaries. We created this sample from the Physician Compare National Downloadable File (https://data.medicare.gov/data/physician-compare) (1 023 552 individuals) and the 2015 Medicare Data on Provider Practice and Specialty database to include individuals missing from the National Downloadable File (1463 individuals). Because clinicians report as individuals and/or as part of a group practice, we obtained quality information from Physician Compare’s 2016 individual and group files. We estimated the prevalence of clinicians with quality information available across specialties, the number of quality domains reported, and the distribution of quality performance. The University of Michigan Institutional Review Board approved the study. Because this study involves secondary analysis of existing data, the University of Michigan Institutional Review Board provided a waiver of the Health Insurance Portability and Accountability Act requirement for informed consent. All analyses were conducted using Stata statistical software, version 15 SE (StataCorp).
Although 238 936 of 1 025 015 clinicians (23.3%) had quality information on Physician Compare, only 2563 (0.3%) had individual quality information (Table 1). Among clinicians with quality data, individual reporters had a median performance score of 98.0% (interquartile range [IQR], 94.5%-100.0%), and group reporters had a median performance score of 68.1% (IQR, 64.5%-72.2%) (Table 1). Scores were based on performance in few domains (individual reporters: median, 2 of 6 possible domains; group reporters: median, 3 of 7 possible domains). Within each domain, clinicians reported on few of the available measures (Table 2). For example, 72.2% of individual reporters had patient safety data, but the median individual reporter had information on only 1 of the 15 (6.7%) available patient safety measures. Results were generally similar for group reporting.
In this study, 76.7% of clinicians had no performance data on Physician Compare, 99.7% had no clinician-level performance data, and among clinicians with data, performance reflected only a few measures and the quality performance was generally high. As currently configured, Physician Compare fell short of its goal of providing information that is widely useful to patients and their caregivers for choosing clinicians.
Several factors may explain our findings. First, only 50.1% of eligible clinicians submitted quality data (at the individual or practice level) that might have been used in Physician Compare.4 This may reflect weak incentives for clinicians to submit quality data3; thus, CMS could consider providing larger incentives for reporting. Second, although 27.8% of clinicians submitted individual-level data to CMS, only 0.3% had individual data on the Physician Compare website.4 This difference may reflect CMS’s selection of measures for public reporting, which was based on criteria such as measure reliability and utility to patients and caregivers.5 The Centers for Medicare & Medicaid Services does not release specific information about why many individual reporters are excluded from the website; such information would help policymakers assess the feasibility of clinician-level public reporting. Third, clinicians are not required to report on all of their patients and can choose which measures to submit to CMS.6 These factors may have contributed to the high performance rates observed on Physician Compare for individual clinicians.
Physician Compare’s weaknesses suggest that it will be important for policymakers to consider whether and how major revisions to the website would help achieve the Department of Health and Human Services’ goal of increased transparency, or whether a different approach is needed altogether.
Accepted for Publication: February 2, 2019.
Corresponding Author: Jun Li, MSPH, School of Public Health, University of Michigan, 1415 Washington Heights, Ann Arbor, MI 48109 (email@example.com).
Published Online: May 6, 2019. doi:10.1001/jamainternmed.2019.0398
Author Contributions: Ms Li and Dr Chen had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: All authors.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Li, Chen.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Li, Chen.
Obtained funding: Chen.
Administrative, technical, or material support: Chen.
Supervision: Das, Chen.
Conflict of Interest Disclosures: Dr Chen reported receiving grants from the Agency for Healthcare Research and Quality, the National Institute on Aging, the University of Michigan Institute for Healthcare Policy, and the American Heart Association during the conduct of the study; receiving personal fees as a contractor for the US Department of Health and Human Services and honoraria from the National Institutes of Health and the Robert Wood Johnson Foundation outside the submitted work; and taking part in an Intergovernmental Personnel Act assignment at the US Department of Health and Human Services and receiving effort support from the University of Michigan Institute for Healthcare Policy. No other disclosures were reported.
Funding/Support: This project was supported by grant R01HS024698 from the Agency for Healthcare Research and Quality (Dr Chen).
Role of Funder/Sponsor: The funding organization had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.
Additional Contributions: We thank Bailey Reale, MPH (University of Michigan Division of General Medicine), for providing invaluable administrative support; and Ju-Chen Hu, MHSA (Emory University Rollins School of Public Health), and Melinda Song, BA (University of Michigan School of Medicine), for their diligent help with background research for this manuscript. They were paid for their effort.