Figure 1. Distribution of mean review scores for a sample of dermatologists on Yelp (n = 45) and ZocDoc (n = 45).
Figure 2. Distribution of individual reviews for dermatology practices on Yelp (n = 518) and ZocDoc (n = 4921) compared with the distribution of reviews across all of Yelp.
aP < .05 for difference in proportion of scores compared with ZocDoc.
Smith RJ, Lipoff JB. Evaluation of Dermatology Practice Online Reviews: Lessons From Qualitative Analysis. JAMA Dermatol. 2016;152(2):153–157. doi:10.1001/jamadermatol.2015.3950
Importance
Patient satisfaction is an increasingly important component of health care quality measures. Online reviews of physicians represent a promising platform for capturing patient perspectives of care.
Objective
To identify qualitative themes associated with patient reviews of dermatologic care on consumer reporting websites.
Design, Setting, and Participants
A qualitative analysis was conducted of patient-generated reviews of dermatology practices on 2 consumer review platforms. Yelp is an online consumer portal for users to review their experience with local businesses; ZocDoc is an online patient-scheduling portal that provides opportunity for patients to write reviews of physician practices. A total of 518 reviews from 45 dermatology practices on Yelp and 4921 reviews from 45 dermatology providers on ZocDoc were collected from 3 geographically diverse cities: Philadelphia, Pennsylvania; Houston, Texas; and Seattle, Washington. The study was conducted from January 15 to July 15, 2015.
Main Outcomes and Measures
Reviews were separated into high-scoring and low-scoring groups. An inductive qualitative method was used to code and identify key themes associated with positive and negative patient experiences. Analysis was completed upon reaching thematic saturation.
Results
Reported as mean (95% CI), the overall Yelp score for the 45 selected practices was 3.46 of 5 stars (3.17-3.75) and the overall ZocDoc score for the 45 selected practices was 4.72 of 5 stars (4.47-4.80). The proportion of individual reviews giving a score of 5.0 was significantly higher on ZocDoc (3986 [81.0%]) than on Yelp (229 [44.2%]) (P < .001). Qualitative themes centered on characteristics of the physician and the practice. Themes that emerged from the high-scoring and low-scoring reviews were similar in content but opposite in valence. Physician-specific themes included temperament, knowledge and competency, physical examination, communication abilities, and mindfulness of cost. Practice-specific themes included scheduling, staff temperament, office cleanliness, waiting room, and insurance. Patients appreciated physicians who are kind, respectful, and thorough with the physical examination; empathetic about the emotional difficulty of skin disease; and cognizant of cost. Negative experiences were frequently affected by considerations outside of the physician-patient interaction, such as curt interactions with staff, difficulty with scheduling, practice cleanliness, and insurance problems. Patients reported relying on consumer websites to identify dermatology providers.
Conclusions and Relevance
Online consumer review websites are designed to facilitate instantaneous and public communication among patients. These platforms provide elaborate and timely data for dermatologists to garner insight into their patients’ experiences. The themes identified in this study are consistent with past satisfaction studies and may aid dermatologists in optimizing the patient care experience.
Leadership in health care reform has emphasized the “triple aim”: improving patient experience with health care, improving quality of care, and decreasing costs.1 Patient perspectives constitute a critical component of the first aim, with patient-centered outcomes now tied to Medicare reimbursement.2 Traditionally, standardized surveys, such as the Hospital Consumer Assessment of Healthcare Providers and Systems and Press Ganey, capture patient feedback.3 However, these survey data are rarely accessed by patients. The era of online connectivity allows patients to share feedback in public Internet forums. Given its public nature, this content may offer novel opportunities for health care professionals to identify domains of care that matter most to patients.4
A 2014 report by Hanauer et al5 indicated that 59% of patients describe online patient reviews as “somewhat” or “very” important when selecting physicians for care. Physicians have voiced concerns that online reviews will merely serve as platforms for disgruntled patients; however, current trends suggest that patient online reviews are overwhelmingly positive.6 Many individuals have decried the practice of online reviews for health care altogether, with critics arguing that patients are not qualified to evaluate the quality of the medical care they receive.7,8 Regardless, the practice of both using and writing online reviews appears to be expanding.9
As a primarily outpatient-based specialty with high volume, dermatology may be well informed from the study of these online perspectives. Previous qualitative studies10-15 have identified general factors associated with high patient satisfaction within medicine. These factors include care access, physician demeanor, quality of medical care processes, continuity of care, quality of health care facilities, and office staff characteristics. Other academic subspecialties, such as urology16 and orthopedics,17 have also engaged online reviews to identify key patient-relevant features important to care. Our objective was to qualitatively evaluate online reviews of dermatology practices to identify features of positive and negative reviews.
The University of Pennsylvania institutional review board approved this study. The study was conducted from January 15 to July 15, 2015. We selected 2 online portals with different features for qualitative analysis: Yelp, a general consumer review platform for businesses, and ZocDoc, a physician scheduling and review website. Other platforms were excluded because they (1) lacked comments in their patients’ reviews, thereby making qualitative analysis impossible (eg, Healthgrades); (2) required a subscription for viewing the reviews, thereby limiting access (eg, Angie’s List); or (3) were redundant in scope and purpose (eg, Vitals and RateMDs).
Founded in 2004, Yelp is a free online platform for individuals to rate businesses and share experiences through written reviews.18 With more than 71 million individual reviews, Yelp is the 35th most frequented website in the United States.19 Although Yelp is traditionally associated with the reviews of local businesses, Bardach et al20 recently showed that a quarter of all US hospitals have Yelp pages. In addition, among the websites available to review physicians, Yelp has been rated by patients as the most used and most trusted.21 In contrast, ZocDoc, founded in 2007, is an online medical scheduling service that provides opportunities for patients to post physician reviews.22 Unlike Yelp, physicians must elect to host a ZocDoc account. Patients may select to write additional comments about their experiences, or they may solely provide quantitative feedback.
Using the search functions on Yelp.com, we identified dermatology practices within 3 major metropolitan cities in the United States chosen for their geographic diversity: Philadelphia, Pennsylvania; Houston, Texas; and Seattle, Washington. For each city, dermatology offices were identified by searching for practices initially closest to the geographic center of the city. The first 15 practices with at least 5 reviews within each city were selected. The following data were collected for each practice: total number of reviews, mean overall rating (on a scale of 1-5 stars, with 1 as the lowest level), and all individual reviews and ratings. One-way analysis of variance was used to determine statistical differences in quantitative means between cities using an α level of .05. The same method was performed using the website ZocDoc.com.
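The between-city comparison described above (one-way analysis of variance on mean practice ratings, α level of .05) can be sketched in Python with SciPy. This is a minimal illustration of the method only; the rating lists below are placeholder values, not the study data.

```python
from scipy import stats

# Illustrative mean practice ratings for 15 practices per city
# (placeholder values, not the study data).
philadelphia = [2.5, 3.0, 2.0, 3.5, 2.5, 3.0, 2.0, 3.5, 3.0, 2.5, 3.0, 2.5, 3.5, 3.0, 2.5]
houston = [3.5, 4.0, 3.0, 4.5, 3.5, 4.0, 3.5, 3.0, 4.0, 4.5, 3.5, 4.0, 3.0, 3.5, 4.0]
seattle = [4.0, 3.5, 4.5, 4.0, 3.5, 4.0, 4.5, 3.5, 4.0, 3.0, 4.5, 4.0, 3.5, 4.0, 4.5]

# One-way ANOVA: do mean ratings differ between the 3 cities?
f_stat, p_value = stats.f_oneway(philadelphia, houston, seattle)
significant = p_value < 0.05  # alpha level used in the study
```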
The reviews from each city were separated into high-scoring (5-star) and low-scoring (1-star) categories. We used an inductive qualitative method to identify key themes associated with high- and low-scoring reviews. After completing a line-by-line reading of the Yelp reviews from Philadelphia, we convened to develop a consensus on the key topics that emerged from the data. The remaining reviews from Seattle and Houston were then coded with the identified topics. We continued coding reviews until reaching consensus thematic saturation, a validated qualitative measure indicating that no novel themes were emerging from the data.23 Once the coding process was completed, we combined the topics into the most salient themes. We then reviewed the qualitative comments from ZocDoc using the themes derived from the initial analysis. Data analysis was conducted from February 1 to July 31, 2015.
Qualitative themes centered on characteristics of the physicians and practices (Table). Themes that emerged from the 5- and 1-star reviews were similar in content but opposite in valence. Physician-specific themes included temperament, knowledge and competency, physical examination, communication abilities, and mindfulness of cost. Themes related to the practice included scheduling, staff temperament, office cleanliness, waiting room time, and insurance.
Within the physician-patient experience, patients appreciated physicians who were kind, attentive to patient concerns, empathetic about the emotional difficulty of skin disease, and cognizant of cost. Physicians were highly rated for being both thorough and efficient with their visits and for answering patients’ questions in a manner that was complete and understandable. Patients regarded care quality and competence as only one of several factors important to their visit. Patients described a sense of vulnerability with dermatology office visits, stemming largely from the full-body examination. As such, they appreciated physicians who approached the physical examination with respect and thoroughness. The positive reviews revealed that patients were moved when physicians discussed their own struggles with skin-related diseases. Physicians’ willingness to develop a care plan in collaboration with the patient was praised. Positive reviews also described physicians who were transparent about the costs of medications and procedures.
In contrast, 1-star reviews described physicians who had poor bedside manner (eg, failed to maintain eye contact and did not knock on doors), made medical errors (eg, prescribed a medication to which the patient was allergic), and failed to provide follow-up, leaving patients feeling disregarded. Patients also expected to have their skin signs and symptoms fully examined, and they described a sense of feeling cheated when this did not occur.
Patients frequently discussed their experience with the waiting room and check-in process. Positive reviews described the clinic staff as professional and the physical setting as clean and comfortable. These reviews also commented on the timeliness with which they were checked in, seen by a medical professional, and checked out of the office. Patients also noted elements of practice convenience, such as online appointment scheduling, the availability of multiple clinical sites, and free parking. In addition, these reviews highlighted a practice’s ability to accommodate complex health insurance situations.
Negative reviews were affected by considerations outside of the physician-patient relationship: curt interactions with staff, difficulty with scheduling appointments, extended times in the waiting room, uncleanliness of the setting, and confusion regarding insurance, referrals, and billing practices.
Patients noted relying on review websites to select dermatology practices. For example, “Thanks again to Yelp for helping me find this dermatologist!” and “Very pleased with the physician as well as front desk staff; found this clinic on ZocDoc.”
The mean (95% CI) overall Yelp score for the 45 selected practices was 3.46 of 5 stars (3.17-3.75). Seventeen (37.8%) of the practices received a mean Yelp score of at least 4.0 of 5.0, and 6 practices (13.3%) received a mean Yelp score of at most 2.0 of 5.0 (Figure 1). There was a significant difference in mean practice rating between cities (P = .005), with Philadelphia practices having a lower mean score (2.82 [2.30-3.33]) compared with Houston (3.70 [3.29-4.11]) and Seattle (3.87 [3.46-4.28]). These 45 practices had a total of 518 individual reviews, with a mean of 12 (9-14) reviews per practice. The mean number of reviews per practice varied significantly by city (P < .001), with Seattle practices having the highest mean number of reviews (15 [9-20]), followed by Philadelphia (10 [8-13]) and Houston (9 [6-11]). The most frequent scores at the level of the individual review were 5 (44%) and 1 (30%) (Figure 2).
The mean overall ZocDoc score for the 45 selected physicians was 4.72 of 5 stars (95% CI, 4.47-4.80). In contrast to Yelp, on ZocDoc, all practices received a mean score of at least 4.0 of 5.0 (Figure 1). There were no significant differences in mean practice rating between the 3 cities. These practices had a total of 4921 individual reviews. Of these, 2026 reviews (41.2%) contained qualitative comments. The mean number of qualitative reviews per practice did not vary significantly by city (45 [95% CI, 33-57] per practice). By far, a score of 5 was the most frequent at the level of the individual review on ZocDoc (3986 [81.0%]). Practices evaluated on ZocDoc received a significantly higher proportion of reviews with a score of 5 than did practices on Yelp (229 [44.2%]; P < .001) (Figure 2).
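The article reports that ZocDoc practices received a significantly higher proportion of 5-star reviews than Yelp practices (P < .001) but does not name the test used. One plausible check is a chi-square test of independence on the reported counts, sketched below with SciPy; the grouping into 5-star vs all other reviews is an assumption for illustration.

```python
from scipy.stats import chi2_contingency

# Counts reported in the article: 5-star reviews vs all other reviews.
zocdoc = [3986, 4921 - 3986]  # 81.0% of 4921 ZocDoc reviews were 5-star
yelp = [229, 518 - 229]       # 44.2% of 518 Yelp reviews were 5-star

# 2x2 contingency table: platform x (5-star vs other)
chi2, p, dof, expected = chi2_contingency([zocdoc, yelp])
```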
Online review platforms provide insight into patient experiences in outpatient dermatology practices. The identified themes are consistent with past satisfaction studies.11,12 However, these online portals provide instantaneous and public communication among patients, allowing more elaborate and timely data on health care professionals than traditional surveys. Patients in such forums write reviews for other potential patients rather than for practices. As such, these platforms demonstrate notable potential for patient engagement in the care review process.
Across both websites, the 5-star and 1-star reviews revealed themes that were common in content and binary in valence, suggesting that the quantitative scale captures both positive and negative patient experiences. The consistency in qualitative content across the 2 platforms suggests that, even though these websites serve different purposes, the content of patient reviews is generalizable.
Regardless of whether it is appropriate, some patients are using the perspectives within online reviews to make decisions about where to pursue dermatologic care. As such, physicians should take advantage of this opportunity to learn about patient concerns and develop initiatives to improve patient-centered care. Although there is resistance in medicine to thinking of patients as consumers, it is evident that, at least in some respects, patients approach outpatient practices as care vendors. As the number of reviews for any one physician may be low, it would be inappropriate to draw conclusions about the quality of a specific practice from these online platforms. However, the broader pool of data may provide insights into the general tenets that contribute to patient satisfaction.
Of note, many negative experiences are exacerbated by aspects of the patient’s visit outside of the physician-patient interaction. A focus on quality improvement within areas of appointment scheduling, practice cleanliness, professionalism of staff, and insurance handling could greatly improve the patient experience.
This study has several limitations. First, individual reviewers are anonymous. It is impossible to validate the identity of the individuals posting their comments or confirm their interactions with the dermatologists who provided care. However, Yelp has a proprietary algorithm that works to selectively hide reviews that may be falsified.24 We can presume that most reviews stem from legitimate physician-patient experiences. The reviews on ZocDoc are all given by “verified” patients. However, practices must choose to host an account for patients to post reviews; therefore, ZocDoc cannot capture all dermatology practices and dermatologists.
As with all qualitative research, these findings are exploratory and hypothesis generating. These reviews reflect a small sample of the total patient population seen by the selected dermatology practices. However, our intent was not to validate the representative accuracy of these reviews but rather to derive insight from the patient-generated content that these reviews provide on a public forum. By focusing on the comparison of high-scoring and low-scoring reviews within this set of dermatology practices, we have attempted to provide internal validity to the results.
Reviews of dermatology practices on public consumer review websites reveal a broad range of factors that affect the patient’s experience. In striving to achieve the broader aim of improved patient experiences in health care, dermatologists may use themes from these review data to derive insights on patient perspectives of the clinical experience and guide improvements in care.
Corresponding Author: Jules B. Lipoff, MD, Department of Dermatology, University of Pennsylvania, Penn Presbyterian Medical Center, Medical Arts Bldg, Ste 106, 51 N 39th St, Philadelphia, PA 19104 (firstname.lastname@example.org).
Accepted for Publication: September 23, 2015.
Published Online: November 25, 2015. doi:10.1001/jamadermatol.2015.3950.
Author Contributions: Mr Smith and Dr Lipoff had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Both authors.
Acquisition, analysis, or interpretation of data: Both authors.
Drafting of the manuscript: Both authors.
Critical revision of the manuscript for important intellectual content: Both authors.
Statistical analysis: Smith.
Administrative, technical, or material support: Both authors.
Study supervision: Lipoff.
Conflict of Interest Disclosures: None reported.