Table 1. Readability Analysis Algorithms
Table 2. Readability Assessment Scores
Research Letter
August 2016

Patient Education Materials in Dermatology: Addressing the Health Literacy Needs of Patients

Author Affiliations
  • 1. Medical student at University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania
  • 2. Rutgers New Jersey Medical School, Newark
  • 3. Department of Radiology, Thomas Jefferson University Hospitals, Philadelphia, Pennsylvania
  • 4. Department of Neurosurgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
  • 5. Department of Dermatology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
JAMA Dermatol. 2016;152(8):946-947. doi:10.1001/jamadermatol.2016.1135

With the increasing availability of digital educational resources and the growing number of media users, the Internet has become an invaluable resource for the dissemination of health care information to the general public. Seventy percent of American adults who use the Internet to obtain health information have reported that it influenced their decision about how to treat an illness or condition.1 Medical practitioners have the responsibility to develop and distribute materials that are readable and comprehensible to patients across different communities.1 The mean reading ability of US adults is at the 8th-grade level,2 and thus the American Medical Association and US National Institutes of Health recommend presenting patient education materials at a reading level between the 3rd and 7th grade.3 Herein, we assess the readability of more than 700 online dermatologic patient education resources published by a range of dermatologic organizations, and we use 10 widely accepted readability algorithms to determine whether these materials meet the national guidelines. This is a comprehensive analysis of publicly available Internet-based dermatology information using multiple readability assessments. We hope to build on prior research that compared the readability of selected dermatologic patient education materials from the American Academy of Dermatology and other common sources of patient education material (including WebMD.com and Wikipedia.org).4

Methods

Institutional review board approval was not required for this study because all data were publicly available online. In January and February 2016, a total of 706 dermatology-related Internet-based patient education materials were downloaded from 20 professional websites (Table 1). These articles were reformatted to plain text using word processing software (Microsoft Corp), and any text unrelated to patient education, including figure legends and web page navigation text, was removed. The final edited articles were assessed for readability using Readability Studio (Oleander Software, Ltd), which employs 10 quantitative readability scales that are widely used and accepted in the medical literature (Table 1).
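
The grade-level estimate behind each of these scales is a simple function of sentence and word statistics. The sketch below is a minimal Python approximation of two of the scales listed in Table 1 (Flesch Reading Ease and Flesch-Kincaid Grade Level) applied to a plain-text article; the vowel-group syllable counter, the punctuation-based sentence splitter, and the function names are illustrative simplifications rather than the method used by Readability Studio, which performed the actual analysis.

import re

def count_syllables(word: str) -> int:
    # Rough vowel-group heuristic; commercial readability tools use
    # dictionary-based syllable counts, so treat this as an approximation.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1  # discount a likely silent final "e"
    return max(count, 1)

def readability_scores(text: str) -> dict:
    # Flesch Reading Ease (0-100, lower = harder) and
    # Flesch-Kincaid Grade Level (approximate US school grade).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / max(len(sentences), 1)
    syllables_per_word = syllables / max(len(words), 1)
    return {
        "flesch_reading_ease": 206.835 - 1.015 * words_per_sentence
                               - 84.6 * syllables_per_word,
        "flesch_kincaid_grade": 0.39 * words_per_sentence
                                + 11.8 * syllables_per_word - 15.59,
    }

if __name__ == "__main__":
    sample = ("Apply a thin layer of the ointment to the affected skin twice "
              "daily. Wash your hands after each application.")
    print(readability_scores(sample))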

Results

The Readability Studio composite of the 9 grade-level readability assessments found that the 706 dermatology patient-oriented education materials were written at a mean 12th-grade reading level (mean [SD], 12.1 [2.1]; range, 8.9-14.3). Specifically, 691 of the 706 articles (98%) were written above the recommended 7th-grade level. The Flesch reading ease scale, a common readability scale used in the health literacy literature, further identified the articles as being “difficult” to read, with a mean (SD) score of 44.8 (14.4) out of a possible 100 (with lower scores denoting more complex articles) (Table 2).
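
For context, the Flesch reading ease score is computed as 206.835 - (1.015 × words per sentence) - (84.6 × syllables per word), and scores between 30 and 50 are conventionally interpreted as “difficult,” roughly college-level text; the mean score of 44.8 observed here falls within that band, consistent with the 12th-grade composite reading level.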

Discussion

Low health literacy is associated with poor adherence to medication use, increased hospitalization, and increased mortality.5 Improving patient health literacy, of which readability is 1 component, could boost patients’ confidence to play a more active role in the health care decision-making process.2

The lack of consistent oversight of online content can result in inappropriately technical patient education resources.6 Given that a majority of the patient education sources from the 20 dermatology organizations were written above the National Institutes of Health–recommended reading level, these existing resources should be revised to reach a broader patient audience.

A limitation of this study is the lack of patient feedback in assessing the quality of online health care resources. Using quality assessment metrics to subjectively evaluate the online user experience may capture criticisms missed by our quantitative approach. Participants could reflect on the quality of nontextual components such as graphics, website design, interactivity, and user friendliness. A greater emphasis must be placed on developing simpler online dermatologic education resources, given the influence these materials can have on patient decision making in the clinical setting.

Article Information

Accepted for Publication: March 25, 2016.

Corresponding Author: Arpan V. Prabhu, BS, University of Pittsburgh School of Medicine, 518 Scaife Hall, 3550 Terrace St, Pittsburgh, PA 15261 (prabhuav2@upmc.edu).

Published Online: May 18, 2016. doi:10.1001/jamadermatol.2016.1135.

Author Contributions: Mr Prabhu and Dr Koch had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Prabhu, Kashkoush, Hansberry, Agarwal, Koch.

Acquisition, analysis, or interpretation of data: Prabhu, Gupta, Kim, Hansberry, Agarwal.

Drafting of the manuscript: Prabhu, Gupta, Kashkoush.

Critical revision of the manuscript for important intellectual content: Prabhu, Gupta, Kim, Hansberry, Agarwal, Koch.

Statistical analysis: Prabhu, Gupta, Hansberry.

Administrative, technical, or material support: Koch.

Study supervision: Prabhu, Hansberry, Koch.

Conflict of Interest Disclosures: None reported.

References
1. Rainie L, Fox S. The online health care revolution. Pew Research Center website. November 26, 2000. http://www.pewinternet.org/2000/11/26/the-online-health-care-revolution/. Accessed February 7, 2016.
2. Walsh TM, Volsko TA. Readability assessment of internet-based consumer health information. Respir Care. 2008;53(10):1310-1315.
3. How to write easy-to-read health materials. MedlinePlus website. https://www.nlm.nih.gov/medlineplus/etr.html. Accessed April 8, 2016.
4. Tulbert BH, Snyder CW, Brodell RT. Readability of patient-oriented online dermatology resources. J Clin Aesthet Dermatol. 2011;4(3):27-33.
5. Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155(2):97-107.
6. Diaz JA, Griffith RA, Ng JJ, Reinert SE, Friedmann PD, Moulton AW. Patients’ use of the Internet for medical information. J Gen Intern Med. 2002;17(3):180-185.