Christie Riemer, Monica Doctor, Robert P. Dellavalle. Analysis of Online Ratings of Dermatologists. JAMA Dermatol. 2016;152(2):218–219. doi:10.1001/jamadermatol.2015.4991
Online physician rating sites (PRSs) allow patients to recommend, grade, and publicly comment on physician performance. In 2015, PRSs received up to 6.4 million hits.1 Despite the growing popularity of PRSs, little information exists about the online ratings of dermatologists. We investigated rating patterns for dermatologists on commonly used PRSs to better understand the information available to patients online. We hypothesized that mean online ratings for dermatologists are high, consistent with ratings reported in the literature for other subspecialties.2
From August 2 to 28, 2015, one hundred dermatologists were randomly selected from a public list of 11 848 members of the American Academy of Dermatology. Institutional review board approval was not obtained because no patients were involved, data were obtained from public sources, and data are presented in aggregate. Five popular websites were searched for physician ratings: ZocDoc.com, Yelp.com, RateMDs.com, Vitals.com, and Healthgrades.com.3 The mean overall rating (all websites used a 5-star scale), total number of ratings, and number of negative comments were recorded for each dermatologist on each website. A repeated-measures design was used to determine whether mean 5-star ratings were consistent across websites, and unpaired 2-sided t tests were used to assess whether sex or subspecialty training affected ratings. The numbers of negative written comments were compared using a χ2 test (critical value, 7.82; α = .05) to determine whether certain websites had significantly fewer negative comments than others. Data analysis was conducted from August 19 to October 10, 2015.
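As a rough illustration (not the authors' code), the comparisons described above can be sketched in Python with scipy; the rating values and negative-comment counts below are hypothetical, chosen only to show the form of each test.

```python
# Sketch of the statistical comparisons described in the Methods,
# run on synthetic (hypothetical) data -- not the study dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Unpaired 2-sided t test: mean ratings by sex (55 men, 45 women in the study).
ratings_men = rng.normal(4.0, 0.6, 55)    # hypothetical 5-star ratings
ratings_women = rng.normal(4.0, 0.6, 45)  # hypothetical 5-star ratings
t_stat, p_sex = stats.ttest_ind(ratings_men, ratings_women)  # two-sided

# Chi-square test on negative-comment counts across the 4 comment-enabled
# websites (df = 3; critical value ~7.82 at alpha = .05, as reported).
observed = np.array([30, 25, 28, 5])           # hypothetical counts
chi2_stat, p_neg = stats.chisquare(observed)   # equal expected counts

print(f"t = {t_stat:.2f}, P = {p_sex:.2f}")
print(f"chi2 = {chi2_stat:.2f}, P = {p_neg:.4f}")
```

With these hypothetical counts the expected value per site is 22, so the χ2 statistic is the sum of squared deviations over 22; a statistic above the critical value 7.82 indicates a significant difference in negative comments across sites.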
Of the 100 dermatologists included, 55 (55%) were men and 25 were subspecialists (pediatric dermatology, dermatopathology, or Mohs surgery). Individual dermatologists appeared on a mean of 2.41 websites. Across all websites, mean ratings for dermatologists were high, exceeding 3.5 stars (Table 1). No significant differences were found among the ratings on the 3 PRSs with the most dermatologist profiles (N = 37; P = .33). Results of t tests showed no significant effect of sex (P = .32) or subspecialty training (P = .89) on mean ratings. Four of the 5 websites offer users the option to write comments. Only 1 website (ZocDoc.com) had significantly fewer negative comments than the others (χ2 = 12.02; P = .007) (Table 2).
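The reported P value can be checked against the χ2 statistic directly; with 4 comment-enabled websites the test has 3 degrees of freedom, and the survival function of the χ2 distribution gives the tail probability.

```python
# Sanity check on the reported result: chi2 = 12.02 with df = 3
# (4 comment-enabled websites) should give P = .007.
from scipy import stats

p = stats.chi2.sf(12.02, df=3)  # P(X > 12.02) for chi-square, df = 3
print(round(p, 3))              # ≈ 0.007, matching the reported P value
```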
Patients are increasingly using social media to make health care decisions. A 2014 study found that 61% of patients used PRSs before choosing a physician, and 20% used online reviews to evaluate their current physician.4 While many PRSs exist, their structure and purpose differ. Only 1 website discourages physicians from soliciting reviews for fear of creating bias; this website had the lowest number of reviews and the lowest mean rating per physician. In contrast, websites that offer features to increase patient reviews (email notifications or postcards) had the highest volume of reviews and the highest mean overall ratings. More important, ratings for the same dermatologists were consistently high across these most-used websites. When the total number of reviews for a physician is low, one outlier may have a disproportionately large effect on the overall rating, creating apparent bias. Interestingly, the website with the most ratings per dermatologist (Table 1) had significantly fewer negative comments (Table 2). Prompting patients to provide reviews may encourage all patients to participate, not just those who had an extremely good or bad experience, thereby creating more transparent communication between patients and physicians.
The data presented are limited by the subjective quality of patient reviews; therefore, it is not possible to draw conclusions about correlations between ratings and actual quality of care. However, the data we gathered from PRSs are easily accessible to patients making health care decisions. Overall, we confirmed our hypothesis that, as with other subspecialties, online ratings of dermatologists are consistently high. Furthermore, we conclude that, while a range of reviews helps improve practice, websites that prompt more patient feedback are less susceptible to outlier bias. We therefore encourage dermatologists to familiarize themselves with the features of various PRSs to better use this social media resource to reach their patient population and improve patient satisfaction.
Accepted for Publication: October 19, 2015.
Corresponding Author: Robert P. Dellavalle, MD, PhD, MSPH, Dermatology Service, US Department of Veterans Affairs, Eastern Colorado Health Care System, 1055 Clermont St, PO Box 165, Denver, CO 80220.
Published Online: December 16, 2015. doi:10.1001/jamadermatol.2015.4991.
Author Contributions: Dr Dellavalle and Ms Riemer had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: All authors.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Riemer, Doctor.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Riemer, Doctor.
Administrative, technical, or material support: Riemer, Doctor.
Study supervision: Dellavalle.
Conflict of Interest Disclosures: None reported.
Additional Contributions: Abigail R. Ness, MA, and Brian Fox, PhD, Department of Psychology, University of Missouri, Kansas City, provided statistical analysis. They were not compensated for their contributions.