Table 1. Questionnaire About Diabetic Retinopathy With Mean and Total Scores
Table 2. Mean and Total Scores for Websites on Diabetic Retinopathy
Table 3. JAMA Benchmarks
Table 4. Readability Analysis
Original Investigation
August 22, 2019

Assessment of the Quality, Content, and Readability of Freely Available Online Information for Patients Regarding Diabetic Retinopathy

Author Affiliations
  • 1Bascom Palmer Eye Institute, Department of Ophthalmology, University of Miami Miller School of Medicine, Miami, Florida
  • 2Flaum Eye Institute, Department of Ophthalmology, University of Rochester Medical Center, Rochester, New York
JAMA Ophthalmol. Published online August 22, 2019. doi:10.1001/jamaophthalmol.2019.3116
Key Points

Question  What are the quality, accuracy, and readability of online information on diabetic retinopathy?

Findings  In this cross-sectional study, 11 diabetic retinopathy websites were analyzed. All were of poor quality, with substantial variation in content accuracy and readability.

Meaning  These data suggest that available online information on diabetic retinopathy typically is not sufficient to support the patient in making appropriate medical decisions.

Abstract

Importance  Diabetic retinopathy is a leading cause of blindness worldwide. Patients increasingly use the internet to search for health-related information that may affect medical decision-making, but to date, no quality standard exists across published websites.

Objective  To assess the quality, content, and readability of information found online for diabetic retinopathy.

Design and Setting  This cross-sectional study analyzed 11 medical sites with information on diabetic retinopathy. Twenty-six questions were composed to include information most relevant to patients, and each website was independently evaluated by 1 vitreoretinal surgeon and 2 vitreoretinal fellows. Readability was analyzed using an online readability tool. The JAMA benchmarks were used to evaluate the quality of each site. Data were collected from December 2018 to January 2019 and analyzed in February 2019.

Main Outcomes and Measures  A 26-question survey, JAMA benchmarks, Flesch reading ease score, Flesch-Kincaid grade level, Gunning Fog Index, Coleman-Liau Index, and Simple Measure of Gobbledygook Index.

Results  The mean (SD) questionnaire score for all websites was 55.76 (13.38; 95% CI, 47.85-63.67) of 104 possible points. There was a difference in content quality among the websites (H = 25.811, P = .004). The mean (SD) reading grade for all websites was 11.30 (1.79; 95% CI, 10.24-12.36). No correlation was found between content accuracy and the mean reading grade (r = 0.445, P = .17) or Google rank (r = −0.260, P = .43). No website achieved all 4 JAMA benchmarks, and only 1 website achieved 3 of the 4 JAMA benchmarks. No correlation was found between the accuracy of the content of the website and JAMA benchmarks (r = 0.422, P = .20). The interobserver reproducibility was similar among the 3 observers (r = 0.87 between observers 1 and 2, r = 0.83 between observers 1 and 3, and r = 0.84 between observers 2 and 3; P < .001).

Conclusions and Relevance  These findings suggest that freely available online information about diabetic retinopathy varies by source but is generally of low quality. The material presented seems difficult to interpret and exceeds the recommended reading level for health information. Based on the grading scheme used, most websites reviewed did not provide sufficient information to support patients in making medical decisions.

Introduction

Diabetic retinopathy (DR) is the leading cause of vision loss in US adults.1 It is projected that by 2050, the number of people with DR and vision-threatening DR will triple.2 Vision loss is associated with lower quality of life, progression of disabilities, and increased functional dependency.3

Patients at risk for DR or with a new diagnosis of DR may attempt to supplement information from their ophthalmologist and primary care physicians by conducting research on the internet. The Pew Research Center found that 72% of adults who use the internet search for health-related material.4 Studies5,6 have found that information presented online can have an important role in patient decision-making and treatment. However, no standard exists to date regarding the accuracy or quality of the content displayed. Online information can therefore be incomplete, be inaccurate, or have an underlying commercial bias. Thus, the usefulness of these resources can vary greatly by source.

Another challenge associated with online health information is the readability of the text. Information must strike a balance between being factually correct and being comprehensible to laypeople without a background in medicine. Health literacy, as defined by the National Assessment of Adult Literacy, refers to “the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions.”7 The latest report from the National Assessment of Adult Literacy states that the average adult can read and comprehend at the eighth-grade level.7 Thus, the US Department of Health and Human Services recommends that information targeted for patients should be written at or below the sixth-grade level. However, online health information can exceed the recommended reading level.8,9 Lower health literacy has been associated with more hospitalizations and poorer outcomes.10 It is essential that information accessed by patients is not only complete and accurate but also easy to read and comprehend.

Previous studies11,12 have found that online materials in ophthalmology exceed the recommended reading grade level. However, these analyses were limited to information provided by major ophthalmologic associations or the top 10 ranked patient-oriented websites, which were not named. Furthermore, these studies did not investigate the quality of the information provided. One group13 analyzed the quality of online sources on rhegmatogenous retinal detachment and found significant differences in the content provided across websites. Because that study centered on a specific ophthalmic condition, its findings cannot be extrapolated to all diseases in ophthalmology. However, it highlights the importance of assessing the information patients are accessing, especially for a disease as prevalent as DR. Given the important role of health-related internet searches and the increasing rates of DR, the aim of this study was to evaluate freely available information for patients about DR by analyzing the content, quality, and readability of commonly used medical websites.

Methods
Website Selection and Content Analysis

The keyword diabetic retinopathy was entered in a Google.com search, and major medical sites were selected for analysis. These sites included the American Academy of Ophthalmology (https://www.aao.org/), All About Vision (https://www.allaboutvision.com/), American Optometric Association (https://www.aoa.org/), American Society of Retina Specialists (https://www.asrs.org/), EyeWiki (https://eyewiki.org/Main_Page), Mayo Clinic (https://www.mayoclinic.org/), Medical News Today (https://www.medicalnewstoday.com/), MedicineNet (https://www.medicinenet.com/), National Eye Institute (https://nei.nih.gov/), WebMD (https://www.webmd.com/), and Wikipedia (https://www.wikipedia.org/). The rank of each website was recorded. Twenty-six questions were composed based on questions frequently asked by patients in the authors' practice. They were designed to encompass information generally conveyed during a patient evaluation and were used to assess the accuracy and completeness of the content on each website (Table 1). The evaluation was conducted by 1 vitreoretinal surgeon (J.S.) and 2 vitreoretinal surgery fellows (N.A.Y. and N.A.P.). A grading scheme with a scale of 0 to 4 was created to assess each question: 0 indicates that no information for that question was available; 1, an answer that is unclear, is inaccurate, or omits significant information and displays poor organization; 2, an answer that is partially complete and somewhat addresses the concept but has gaps in information and organization; 3, an answer that provides the essential elements and addresses the most relevant points in a focused and organized way; and 4, an answer that is accurate and thorough, explaining the information in a clear, focused, and organized manner. Each question was independently assessed by each observer, and interobserver reproducibility was measured using Spearman correlation. The mean (SD) score among the 3 observers was used to compare the quality of each site. The mean score of each site was correlated with its rank on Google.com.
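
To make the scoring workflow concrete, the sketch below (ours, not the authors'; the study used SPSS) aggregates hypothetical 0-to-4 question scores from 3 observers into per-site totals out of 104 points and runs the Spearman correlations described above. All data are simulated.

```python
import numpy as np
from scipy.stats import spearmanr

N_SITES, N_OBSERVERS, N_QUESTIONS = 11, 3, 26

rng = np.random.default_rng(0)
# Hypothetical scores[site, observer, question], each graded 0-4
scores = rng.integers(0, 5, size=(N_SITES, N_OBSERVERS, N_QUESTIONS))

# Per-observer totals out of 104 (26 questions x 4 points), then mean (SD) per site
totals = scores.sum(axis=2)                  # shape (11, 3)
site_mean = totals.mean(axis=1)
site_sd = totals.std(axis=1, ddof=1)

# Interobserver reproducibility: Spearman r between two observers' question scores
r12, p12 = spearmanr(scores[:, 0, :].ravel(), scores[:, 1, :].ravel())

# Correlation of each site's mean score with its Google rank (1 = top result)
google_rank = np.arange(1, N_SITES + 1)      # hypothetical: sites in rank order
r_rank, p_rank = spearmanr(site_mean, google_rank)
print(f"rank vs content score: r = {r_rank:.3f}, P = {p_rank:.2f}")
```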

Accountability Analysis

The JAMA benchmarks were used to assess the accountability of each website.14 This instrument evaluates the presence of 4 key components: authorship, attribution, disclosure, and currency. To comply with the authorship requirement as outlined by JAMA, a website must include the authors and contributors as well as their affiliations and relevant credentials. All attributions, or references, should be clearly listed, and any disclosures and the currency (date of posting or last update) should be reported.
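
As a minimal sketch, the 4-item checklist can be represented and scored as below; the benchmark names follow Silberg et al,14 while the example website entry is hypothetical.

```python
# The 4 JAMA benchmarks from Silberg et al (reference 14)
JAMA_BENCHMARKS = ("authorship", "attribution", "disclosure", "currency")

def benchmark_score(site: dict) -> int:
    """Count how many of the 4 benchmarks a website meets (0-4)."""
    return sum(bool(site.get(b)) for b in JAMA_BENCHMARKS)

# Hypothetical website: names its authors and a last-updated date, but lists
# no references and carries no disclosure statement
example_site = {"authorship": True, "attribution": False,
                "disclosure": False, "currency": True}
print(benchmark_score(example_site))  # -> 2
```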

Readability Analysis

Readability was analyzed by Flesch reading ease score, Flesch-Kincaid grade level, Gunning Fog Index, Coleman-Liau Index, and Simple Measure of Gobbledygook (SMOG) Index using an online readability tool (Readable).15 The Flesch reading ease score uses the total words, sentences, and syllables in a text to provide a score of 0 to 100, with a higher score indicating an easier-to-read text. It is generally accepted that a score of 70 to 80 indicates a passage that can be understood by a seventh grader. The Flesch-Kincaid grade level, Gunning Fog Index, Coleman-Liau Index, and SMOG Index each report a US reading grade level.
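
Each of these indices is a published formula over simple text counts. The sketch below shows three of them; the tokenizing and syllable counting that produce the inputs are tool-dependent and omitted here, and the sample counts are hypothetical.

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    # 0-100 scale; higher scores mean easier text (70-80 ~ seventh grade)
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    # Reports a US school grade level
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog(words: int, sentences: int, complex_words: int) -> float:
    # complex_words = words with three or more syllables
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

# Hypothetical counts for a 1000-word patient handout
print(flesch_reading_ease(1000, 60, 1550))   # ~58.8: "fairly difficult"
print(flesch_kincaid_grade(1000, 60, 1550))  # ~9.2: ninth-grade level
print(gunning_fog(1000, 60, 120))            # ~11.5
```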

Statistical Analysis

Data were collected from December 2018 to January 2019 and analyzed in February 2019. All statistical analyses were performed with IBM SPSS Statistics for Windows, version 25.0 (IBM Corp). For both the content and readability analyses, the data were treated as ordinal variables and compared across websites with the Kruskal-Wallis test (with H as the test statistic), and post hoc Dunn-Bonferroni tests were performed to evaluate pairwise comparisons. Spearman correlation tests were used to evaluate the correlations of content accuracy with Google.com ranking and with readability. Statistical significance was set at P ≤ .005 for the main comparisons and Spearman correlations and at P ≤ .05 for pairwise comparisons. All P values were 2-sided.
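
The same tests are available outside SPSS; as a hedged sketch, the SciPy calls below reproduce the omnibus Kruskal-Wallis test and a Spearman correlation on simulated ordinal data. The post hoc Dunn-Bonferroni step is omitted because it requires an additional package (e.g., scikit-posthocs).

```python
import numpy as np
from scipy.stats import kruskal, spearmanr

rng = np.random.default_rng(1)
# 11 groups of 26 ordinal question scores (0-4), one group per website
site_scores = [rng.integers(0, 5, size=26) for _ in range(11)]

# Kruskal-Wallis omnibus test across websites (H statistic, 2-sided P)
H, p = kruskal(*site_scores)
print(f"H = {H:.3f}, P = {p:.3f}")  # significant if P <= .005 per the study

# Spearman correlation, e.g., content accuracy vs mean reading grade
accuracy = np.array([s.mean() for s in site_scores])
reading_grade = rng.uniform(7, 14, size=11)  # hypothetical grade levels
r, p_r = spearmanr(accuracy, reading_grade)
```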

Results
Website Selection and Content Analysis

Eleven websites were analyzed in our study. The interobserver reproducibility was similar among the 3 observers (r = 0.87 between J.S. and N.A.Y., r = 0.83 between J.S. and N.A.P., and r = 0.84 between N.A.Y. and N.A.P.; P < .001). The mean (SD) questionnaire score for all websites was 55.76 (13.38; 95% CI, 47.85-63.67) of 104 possible points. A statistically significant difference in content accuracy and completeness was found among the websites (H = 25.811, P = .004). Wikipedia was the top-scoring website, with a mean (SD) of 76.67 (0.98) points, representing 74% of total possible points. WebMD was the lowest-scoring website, with a mean (SD) of 33.00 (1.10) points, which represents 32% of total possible points (Table 2). A significant difference was found between the lowest-scoring website, WebMD, and the top-scoring website, Wikipedia (H = −26.667, P = .04). No significant correlation was found between the rank in Google.com and the content quality of the website (r = −0.260, P = .43).

Accountability Analysis

No website achieved all 4 JAMA benchmarks, and only 1 of 11 websites (9.1%) achieved 3 of the 4 JAMA benchmarks (Table 3). The most commonly displayed attributes were authorship and currency. No correlation was found between the content quality of the website and JAMA benchmarks (r = 0.422, P = .20).

Readability Analysis

The mean (SD) Flesch reading ease score was 46.05 (11.92; 95% CI, 39.01-53.10). The mean (SD) reading grade for all websites was 11.30 (1.79; 95% CI, 10.24-12.36) (Table 4). A significant correlation was found between the Flesch reading ease score and mean reading grade level (r = −0.955, P < .001). A significant difference was found in the mean reading grade level of the websites analyzed (H = 33.828, P < .001). WebMD had a significantly lower reading level than EyeWiki (H = 34.500, P = .008) and Wikipedia (H = −33.125, P = .02). The American Academy of Ophthalmology also had a significantly lower reading level than EyeWiki (H = −30.500, P = .04). No significant correlation was found between website quality and the mean reading grade (r = 0.445, P = .17).

Discussion

Patients often turn to the internet to find health-related information.4 Resources must be accurate, reliable, and readable to aid patients in medical decision-making. The lack of standardization and regulation of available health care information often leads to marked variation in the quality of the material published.16 The aim of this study was to analyze the accuracy and completeness of the content displayed, as well as the quality and readability, of selected websites regarding DR for patients.

We detected significant variation in the accuracy and completeness of patient information on DR. Of note, Wikipedia received the highest score of all 11 websites, with a mean (SD) score of 76.67 (0.98), which represents 74% of total possible points. This finding indicates that although Wikipedia is more informative than other sites, it is not an ideal reference for patients. Our study also found significant differences among websites, highlighting the importance of directing patients to certain sources while avoiding others. No website achieved all 4 JAMA benchmarks, and even EyeWiki, a website launched by the American Academy of Ophthalmology, met only 3 of the 4 benchmarks. Of note, EyeWiki may be designed to educate medical professionals rather than laypeople. On average, only 1 benchmark was displayed, indicating that the overall accountability of the websites was poor. Of interest, no correlation was found between the accountability of a website as defined by JAMA and the accuracy of the content displayed. This finding suggests that resources with inaccurate information may nonetheless appear to be of high quality and therefore reliable.

Websites aim to have the top rank in Google.com searches because this is associated with higher traffic and authority.17 A previous study18 found that the number of visitors that a website receives markedly changes with position in Google.com searches, with nearly 30% of users clicking on the first link and only 10% clicking on the third link. In this study, no correlation was found between the content quality of a website and its rank in Google.com searches, suggesting that this metric cannot be used by patients to gauge the reliability of the information shown.

The mean reading grade level across the 11 websites was equivalent to an 11th-grade reading level, in contrast to the 6th-grade reading level recommended by the US Department of Health and Human Services guidelines. WebMD was the least complex website, with a Flesch reading ease score of 71.8, which corresponds to a seventh-grade reading level, and a mean (SD) reading grade level of 7.60 (1.13). Although this website scored the lowest in our content analysis, no overall correlation was found between the mean reading level and the mean questionnaire score. This analysis suggests that websites about DR are written at a level too complex for the average adult. Previous studies11,12 have found that online information regarding ophthalmic diseases is written at a reading level greater than what is recommended. This study further supports those findings while providing material to aid in the counseling of patients with DR. The readability results, the poor content quality, and the lack of accountability displayed by most websites suggest that physicians should counsel patients on online health information–seeking behavior.

Websites that are freely available and do not require a membership or subscription were analyzed. However, some websites targeted to physicians can also be accessed by patients, and some target both audiences. For example, EyeWiki is a site primarily intended for ophthalmologists that also seeks to inform the layperson.19 In our analysis, this site received the highest reading grade level and the lowest Flesch reading ease score, indicating that it uses complex language. Therefore, the usability and actionability of the information presented, when read by a layperson, may be less than the authors intended.

Limitations

This study has some inherent limitations. The analysis may not be applicable to other diseases in ophthalmology because it was limited to information regarding DR. The keyword diabetic retinopathy was used to select websites, which may differ from the terms that patients use. Another limitation is that patients’ understanding of DR after reading the material presented was not surveyed and compared against our assessment of each website. Further studies should include patients, comparing how they search for information about DR online and what knowledge they gain after reading each website.

Conclusions

The freely available information online about DR varies by source but appears to be generally of low quality. The findings suggest that the material presented is difficult to read and exceeds the recommended reading level for health information. Most websites reviewed did not provide sufficient information to support the patient in making medical decisions.

Article Information

Accepted for Publication: June 16, 2019.

Corresponding Author: Jayanth Sridhar, MD, Bascom Palmer Eye Institute, Department of Ophthalmology, University of Miami Miller School of Medicine, 900 NW 17th St, Miami, FL 33136 (jsridhar@med.miami.edu).

Published Online: August 22, 2019. doi:10.1001/jamaophthalmol.2019.3116

Author Contributions: Ms Kloosterboer and Dr Sridhar had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Kloosterboer, Yannuzzi, Patel, Sridhar.

Acquisition, analysis, or interpretation of data: Kloosterboer, Yannuzzi, Kuriyan, Sridhar.

Drafting of the manuscript: Kloosterboer, Yannuzzi, Kuriyan, Sridhar.

Critical revision of the manuscript for important intellectual content: Yannuzzi, Patel, Sridhar.

Statistical analysis: Kloosterboer, Yannuzzi.

Supervision: Yannuzzi, Patel, Kuriyan, Sridhar.

Conflict of Interest Disclosures: Dr Kuriyan reported receiving personal fees from Alimera Sciences, Regeneron, Allergan, and Valeant and grants from Genentech outside the submitted work. Dr Sridhar reported receiving personal fees from Alcon, Alimera, and Thrombogenics outside the submitted work. No other disclosures were reported.

Funding/Support: The Bascom Palmer Eye Institute received funding from core grant P30EY014801 from the National Institutes of Health, grant W81XWH-13-1-0048 from the US Department of Defense, and an unrestricted grant from Research to Prevent Blindness.

Role of the Funder/Sponsor: The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1. Centers for Disease Control and Prevention. Common eye disorders: basics—VHI. https://www.cdc.gov/visionhealth/basics/ced/index.html. Accessed January 19, 2019.
2. Saaddine JB, Honeycutt AA, Narayan KMV, Zhang X, Klein R, Boyle JP. Projection of diabetic retinopathy and other major eye diseases among people with diabetes mellitus: United States, 2005-2050. Arch Ophthalmol. 2008;126(12):1740-1747. doi:10.1001/archopht.126.12.1740
3. Sinclair AJ, Bayer AJ, Girling AJ, Woodhouse KW. Older adults, diabetes mellitus and visual acuity: a community-based case-control study. Age Ageing. 2000;29(4):335-339. doi:10.1093/ageing/29.4.335
4. Pew Research Center. Health Online 2013. https://www.pewinternet.org/2013/01/15/health-online-2013/. Accessed January 19, 2019.
5. Paolino L, Genser L, Fritsch S, De’ Angelis N, Azoulay D, Lazzati A. The web-surfing bariatric patient: the role of the internet in the decision-making process. Obes Surg. 2015;25(4):738-743. doi:10.1007/s11695-015-1578-x
6. Lagan BM, Sinclair M, Kernohan WG. What is the impact of the Internet on decision-making in pregnancy? a global study. Birth. 2011;38(4):336-345. doi:10.1111/j.1523-536X.2011.00488.x
7. Kutner G, Jin P. The Health Literacy of America’s Adults: results from the 2003 National Assessment of Adult Literacy. 2003. https://nces.ed.gov/pubs2006/2006483.pdf. Accessed January 20, 2019.
8. Daraz L, Morrow AS, Ponce OJ, et al. Readability of online health information: a meta-narrative systematic review. Am J Med Qual. 2018;33(5):487-492. doi:10.1177/1062860617751639
9. Koo K, Shee K, Yap RL. Readability analysis of online health information about overactive bladder. Neurourol Urodyn. 2017;36(7):1782-1787. doi:10.1002/nau.23176
10. Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155(2):97-107. doi:10.7326/0003-4819-155-2-201107190-00005
11. Huang G, Fang CH, Agarwal N, Bhagat N, Eloy JA, Langer PD. Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol. 2015;133(4):449-454. doi:10.1001/jamaophthalmol.2014.6104
12. Edmunds MR, Barry RJ, Denniston AK. Readability assessment of online ophthalmic patient information. JAMA Ophthalmol. 2013;131(12):1610-1616. doi:10.1001/jamaophthalmol.2013.5521
13. Ivastinovic D, Wackernagel W, Wedrich A. Accuracy of freely available information about rhegmatogenous retinal detachment on the internet. JAMA Ophthalmol. 2019;137(1):113-114. doi:10.1001/jamaophthalmol.2018.4682
14. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor–Let the reader and viewer beware. JAMA. 1997;277(15):1244-1245. doi:10.1001/jama.1997.03540390074039
15. Readable. https://app.readable.com/text/?demo. Accessed January 29, 2019.
16. Cline RJW, Haynes KM. Consumer health information seeking on the Internet: the state of the art. Health Educ Res. 2001;16(6):671-692. doi:10.1093/her/16.6.671
17. Litsa T. Is it important for SEO to rank first in 2018? https://searchenginewatch.com/2018/08/17/is-it-important-for-seo-to-rank-first-in-2018/. Accessed January 29, 2019.
18. Chaffey D. Comparison of Google clickthrough rates by position. https://www.smartinsights.com/search-engine-optimisation-seo/seo-analytics/comparison-of-google-clickthrough-rates-by-position/. Accessed January 29, 2019.
19. EyeWiki. About. https://eyewiki.aao.org/EyeWiki%3AAbout. Accessed May 28, 2019.