eFigure. Perception of Acne Questionnaire: Pretest.
Figure 1. Knowledge score change from baseline. P = .21 between groups immediately postintervention; P = .30 between groups at 1-month postintervention.
Figure 2. Knowledge score during the study period. *P < .001 (paired t test) vs the preintervention baseline.
Koch PE, Ryder HF, Dziura J, Njike V, Antaya RJ. Educating Adolescents About Acne Vulgaris: A Comparison of Written Handouts With Audiovisual Computerized Presentations. Arch Dermatol. 2008;144(2):208–214. doi:10.1001/archdermatol.2007.35
Objective
To compare the efficacy of written handouts with that of audiovisual computerized presentations in educating adolescents about acne vulgaris.
Setting
A private dermatology office or 1 of 3 general pediatric clinics in New Haven.
Patients
One hundred one adolescent patients, aged 13 to 17 years.
Design
All participants completed a brief enrollment questionnaire to gauge baseline knowledge of acne vulgaris. Subjects were then randomized to either receive a written handout or watch an audiovisual computerized presentation. Immediately following the intervention, and again at 1 month, patients were asked to complete identical questionnaires to assess change in knowledge.
Main Outcome Measures
Change in knowledge about acne vulgaris, as indicated by performance on preintervention, postintervention, and 1-month follow-up questionnaires.
Results
Baseline questionnaires were completed by 21 patients in the pilot study and 80 patients in the revised study; 17 (81%) and 77 (96%) completed the respective studies. In both the pilot (P = .64) and revised (P = .63) studies, there was no significant difference between intervention groups in terms of baseline knowledge or gain in knowledge. Immediately postintervention, both groups showed significant improvement from baseline (P < .001 in the revised study and P < .01 in the pilot study). At the 1-month follow-up, patients in the pilot study randomized to receive the computerized presentation still showed significant gain in knowledge from baseline (P < .05), while those in the handout group did not. Meanwhile, both intervention groups in the revised study continued to show significant gain in knowledge from baseline at 1 month (P < .001).
Conclusions
Both written handouts and audiovisual computerized presentations about acne vulgaris confer significant and equivalent benefits in terms of short- and long-term knowledge gains among adolescent patients with acne.
Despite the prevalence of acne1-3 and its potential to significantly impair emotional health and well-being,4-10 substantial misunderstanding persists regarding its causes and treatment.11-13 Surveys of patients with acne in academic and community settings have revealed widespread misconceptions regarding acne's pathogenesis, natural course, and response to therapy. Yet patients do not remain misinformed for lack of interest. Community-based surveys indicate that patients receive most of their information about acne from television (74%), parents (61%), friends (47%), and magazines (39%); meanwhile, most patients surveyed believed the information they received from these sources was inadequate.14,15
While effective therapeutic options exist for the treatment of acne, treatment compliance with acne medications has been shown to be as low as 12.5%.16 Poor patient compliance has been identified as the main reason for acne treatment failure.17 Accurate diagnosis, appropriate therapy, and good compliance with directions for therapy are all important components in the treatment of disease. Previous studies13,17 have suggested that noncompliance with treatment is the result of the patient not understanding the nature of acne, not understanding the nature of the treatment, or having unrealistic expectations of treatment. Health education, especially knowledge of disease-therapy interactions, has been shown to increase compliance in adolescents who have other chronic diseases.18,19 In a disease such as acne, in which compliance with treatment is of paramount importance, patient education may play a critical therapeutic role.
Many health care providers struggle to find a way to educate patients about their diseases. Traditional methods of patient education have included physician-patient conversations and printed handouts and pamphlets. While these methods may be moderately effective, conversations in a busy clinic are often harried and printed handouts left unread.20 Audiovisual presentations are significantly more effective than traditional methods of patient education in improving patient knowledge, as measured by scoring on pretest and posttest questionnaires.21 In a technologically savvy cohort, such as teenaged patients, digitized computer-based information may be seen as “cooler” and more accessible than traditional vehicles of information. While studies in other areas of medicine have shown Internet-enabled multimedia interventions22 and “sound and slide shows”23 to be more effective than written presentation of information, to our knowledge, no such studies have been conducted on adolescents in dermatology.
We aimed to evaluate the effectiveness of 2 educational methods, both of which are applicable to everyday clinical practice. We hypothesized that subjects randomized to receive audiovisual computerized presentations about acne vulgaris would demonstrate a greater increase in knowledge about acne, as measured by scoring on postintervention and 1-month follow-up questionnaires, when compared with subjects receiving written handouts. Information from this study could influence the methods by which information is conveyed to an adolescent patient population, thus building on previous research into the specific educational methods most likely to maximize adolescents' acquisition and retention of material.
A clinical and questionnaire-based study was conducted with approval granted by the Human Investigation Committee at Yale University School of Medicine. The study involved 80 adolescents, aged 13 to 17 years, who presented to a private dermatology office or 1 of 3 general pediatric practices and clinics in New Haven. Participants received a brief questionnaire on enrollment to assess baseline knowledge about acne vulgaris (eFigure), and were then randomized via coin toss to either receive a written informational handout or watch an audiovisual computerized presentation. All enrolled subjects then immediately completed an identical questionnaire to assess the effectiveness of the intervention. At 1 month, the questionnaire was again administered via telephone interview to determine the degree of retention of the information.
The questionnaires and educational materials were designed specifically for this study and pilot tested on a group of 21 patients at the end of their visit to a private dermatology office.
Both the audiovisual and written presentations were designed to maximize the acquisition of material with strategies that have proved effective: writing at the sixth- to eighth-grade reading level, limiting the number of take-home messages, and focusing on the most prevalent misconceptions about acne vulgaris.23 The computer presentation and written handout presented the same information and focused specifically on issues shown to be misunderstood by adolescents in previous studies.11,13,17,23 In particular, our intervention addressed the causes of acne, factors that may exacerbate acne, the duration and proper use of acne treatments, and suggestions to increase compliance. The written handout was black-and-white text only, while the computerized presentation incorporated text and audiovisual aids.
In designing our educational materials, we adhered to principles of education theory and psychiatric theory of compliance.24 According to Ames et al,24 high-quality educational materials meet the following criteria: contain accurate, current, and appropriate information; adopt an appropriate learning philosophical point of view; are interesting and attractive to children; and are free of cultural, ethnic, age, race, disability, and sexual biases.24 The information presented in both the computerized and written formats was sufficient to answer all questions posed by the questionnaires. The computerized presentation was 6 minutes 26 seconds long; the pamphlet was read in less than 5 minutes by most participants. Changes to the educational materials following the pilot study included minor changes in wording and layout. Attempts were also made to ensure that the content of each question on the questionnaire was addressed in a similar manner by both the computerized and written interventions.
To allow comparison of our data with previously published data, we modeled our assessment questionnaires after those distributed by Rasmussen and Smith11 and Tan et al.12 In creating the questionnaires, we sought to adhere to standard areas of questioning in patient conceptions of acne. The baseline assessment included demographic information and information regarding subjects' current acne severity by self-report, sources of acne information, and desire for additional information about acne. The preintervention, postintervention, and 1-month follow-up questionnaires contained an identical set of questions to assess knowledge of acne and its treatments. Changes to the pilot study questionnaire included minor changes in wording and the deletion of 1 question we deemed unclear. Following the pilot study, additional questions were incorporated in an effort to address misconceptions we believed were pertinent to patient compliance and understanding of acne vulgaris—such as the notion that dirt causes blackheads and that frequent face washing correlates with improvement in acne. The revised questionnaires consisted of 18 questions (eFigure).
During the revised study, most patients were able to provide informed consent, complete the questionnaires, and read or view the written or audiovisual materials while waiting to be seen by their physician.
All data collected for this study were entered into a database (Microsoft Access; Microsoft Corporation, Redmond, Washington) and analyzed with SAS version 9.1.25 The primary outcome variable was "knowledge about acne," measured on a scale from 0 to 18, representing the number of questions answered correctly. Change from baseline within each group (audiovisual vs handout) was assessed with paired t tests; the difference between groups was analyzed with a 2-sample t test. In all analyses, P < .05 was considered statistically significant.
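The within-group and between-group comparisons described above (paired t tests on change from baseline; a 2-sample t test on change scores between groups) can be sketched as follows. The original analysis used SAS; this is an equivalent illustration in Python with SciPy, and the scores are synthetic stand-ins, not study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic scores on the 18-item questionnaire (number correct, 0-18);
# illustrative only -- these are NOT the study's data.
handout_pre = rng.integers(6, 13, size=45).astype(float)
handout_post = handout_pre + rng.integers(2, 7, size=45)   # simulated gain
video_pre = rng.integers(6, 13, size=35).astype(float)
video_post = video_pre + rng.integers(1, 6, size=35)       # simulated gain

# Within-group change from baseline: paired t test (same subjects pre vs post).
t_within, p_within = stats.ttest_rel(handout_post, handout_pre)

# Between-group comparison: 2-sample t test on the change scores.
handout_change = handout_post - handout_pre
video_change = video_post - video_pre
t_between, p_between = stats.ttest_ind(handout_change, video_change)

print(f"within-group (handout) P = {p_within:.3g}")
print(f"between-group P = {p_between:.3g}")
```

With consistent simulated gains, the paired test yields a very small P value (significant improvement from baseline), while the between-group test compares only the magnitude of improvement across arms, mirroring the study's two-level analysis.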
Twenty-one patients were recruited to the pilot study and then randomized into a handout group (n = 7) and a computerized presentation group (n = 14). Of the original 21 subjects, 4 were lost to follow-up after completing the postintervention questionnaire, leaving 17 (81%) to complete the study.
Of the 82 patients approached to participate in the revised study, 80 chose to participate and were randomized into a handout group (n = 45) and a computerized presentation group (n = 35). Of the original 80 subjects, 3 were lost to follow-up, resulting in a 96% completion rate.
The demographic data for the pilot study participants are summarized in Table 1. There was no significant difference between groups in acne severity by self-report, the degree to which patients were bothered by their acne, or knowledge of acne by self-report. The baseline knowledge scores, as determined by performance on the preintervention questionnaire, were similar between intervention groups. There was no significant difference between intervention groups (Table 1).
The postintervention change in knowledge scores, as determined by comparing the results of the pretest and posttest questionnaires, showed significant improvement from baseline for both groups (P < .01). There was no significant difference between intervention groups (mean [SD] change in knowledge score, 16.67% [12.19%] for the computerized presentation group vs 25.71% [18.23%] for the handout group; P = .19).
At the 1-month follow-up, significant improvement from baseline was noted in the computerized group (P < .05), but not in the handout group. Again, no significant difference was noted between groups (mean [SD] change in knowledge score, 15.15% [16.35%] for the computerized presentation group vs 13.33% [15.20%] for the handout group; P = .83).
The demographic data pertaining to participants in the revised study are summarized in Table 2. As in the pilot study, patients reported whether they had ever seen a physician for their acne and were asked to rate their preintervention knowledge about acne. They were also queried regarding current acne severity and the degree to which they were bothered by acne. None of the described measures differed significantly between the 2 intervention groups.
Compared with the pilot study, more patients in the revised study believed that additional information about acne would be helpful. Neither the differences in baseline knowledge nor the desire for more information about acne was significant between groups (Table 2).
The postintervention change in knowledge scores was determined by comparing the results of pretest and posttest questionnaires. Although there was no significant difference between intervention groups (mean [SD] change in knowledge score, 22.06% [18.05%] for the computerized presentation group vs 26.91% [15.93%] for the handout group; P = .21) (Figure 1), the within-group improvement was significant for both groups (P < .001, paired t test) (Figure 2).
At the 1-month follow-up, significant improvement from baseline was again noted within both intervention groups (P < .001, paired t test) (Figure 2). The mean (SD) score on the final questionnaire was improved by 17.14% (16.74%) in the computerized group and by 12.84% (19.27%) in the handout group. Again, no significant difference was noted between groups (P = .30) (Figure 1).
To our knowledge, this is the first study comparing written handouts with audiovisual computerized presentations as a means of educating adolescent patients about acne vulgaris. Previously collected data suggest that despite acne's prevalence,1 knowledge about acne pathogenesis and treatment remains poor.11-13 Meanwhile, studies2,8,21,22,26 in other areas of medicine have shown computerized health interventions to improve health status and serve as valuable supplements to one-on-one interaction between patients and clinicians. With the results of such studies in mind, we hypothesized that a population of adolescent patients with acne would find colorful computer-based information more accessible than traditional vehicles of information and that this would translate into superior knowledge gains as determined by preintervention and postintervention questionnaires.
The results of our study support the notion that computerized audiovisual presentations serve as effective teaching tools in the clinic and may relieve the burden on busy health care providers. Our findings also raise interesting questions regarding the potential role of testing, or quiz taking, in patient education. The data suggest, contrary to our expectations, that written handouts impart gains in acne knowledge equivalent to those of computerized audiovisual presentations. Analysis of these results sheds light on the limitations of our study and generates questions for future research.
While dermatologists still receive nearly 80% of all visits for acne, the number of acne visits to nondermatologists has increased more than 4-fold since 1980.2 Previous studies27,28 indicating poor acne knowledge among general practitioners suggest they may not be adequately equipped to meet the educational needs of this increasing patient population with acne. Our study interventions yielded significant improvement in knowledge scores in a cohort of patients, of whom most had previously seen a physician for acne. This gain in knowledge among patients with previous exposure to acne education underscores the need, on the part of clinicians in dermatology and general practice, for more consistent and effective means of educating patients.
The enthusiastic response of adolescents to our study is evidenced by the high enrollment and completion rates. The 97.6% enrollment rate may have been influenced by the fact that most patients were approached while waiting to be seen by their pediatrician or dermatologist and had little aside from magazines with which to occupy their time. A second, related, factor in the high enrollment rate may have been that patients were assured participation in the study was unlikely to add substantial time to their clinic visit. That these assurances were borne out in the execution of the study—despite the time-consuming process of obtaining informed consent and filling out questionnaires—suggests similar educational interventions could be adopted in clinical practice without extending patient visit times. The positive response of patients to our study is consistent with findings from previous studies14,15 indicating patients are unsatisfied with the information about acne they receive from friends and the lay press.
The main outcome measure in our study was change in knowledge about acne as determined by performance on the preintervention, postintervention, and 1-month follow-up questionnaires. We had postulated that the audiovisual computerized intervention would lead to greater improvement in scores when compared with that of the written handout. However, results from the pilot and revised studies did not fully support our hypothesis. Although there was significant improvement from baseline in both groups and in both studies, there was no significant difference between groups.
Previous reports in the literature have spoken to the efficacy of audiovisual media for patient education. One such report, a systematic review of randomized clinical trials conducted by Krishna et al,29 aimed to evaluate the utility of computerized patient education. Of 22 studies meeting the inclusion criteria, only 1 failed to show positive results for the interactive educational intervention. A subsequent randomized, controlled, clinical trial conducted by Krishna et al22 concluded that supplementing conventional asthma care with interactive multimedia education led to improved asthma knowledge and decreased morbidity and use of emergency services among 228 pediatric patients with asthma. The studies by Krishna et al, among others, have found these educational methods to be particularly effective in patient populations with chronic diseases, such as diabetes mellitus and asthma. One study by Sly19 compared 2 methods of allergy patient education. Asthmatic children presenting to a clinic in New Orleans, Louisiana, were randomized to 1 of 2 experimental groups: the first received a sound-slide show on the cause and control of the particular allergy experienced by the children, while the second received the same information via lecture. As in our acne study, the effectiveness of the 2 interventions was judged equivalent. However, an important distinction was perceived in that the slide show "[freed] the doctors for counseling on different aspects of the allergy program and more specific problems."19(p94)
Our study of patients with acne did not address whether the educational interventions affected patient visit time or whether they eased the burden on physicians. However, this possibility is supported by other reports in the literature. A randomized, controlled, clinical trial conducted by Marshall et al21 revealed that physicians spent less time with patients who had previously received audiovisual education materials (mean, 7.0 minutes) than with patients who had not received such information (mean, 9.5 minutes), despite the fact that physicians were blinded to patient grouping. Meanwhile, a recent study by Schaffer and Tian30 showed that providing patients with written and audio educational materials—with no further intervention by the health care provider—conferred a lasting beneficial effect on asthma medication adherence.
Contrary to expectations, our data suggest the written handout and audiovisual presentation conferred equivalent benefits in both the short and the long term. This was not the case in the study by Marshall et al21 previously described, which found that patient knowledge gain was greater among patients receiving audiovisual education than among those receiving a pamphlet or a lecture. One explanation for the efficacy of our handouts may be that patients receiving the written handout could control the pace at which they received information. Furthermore, patients received the handout immediately on completing a preintervention questionnaire; it is, therefore, possible that their reading of the material was more focused than would normally be the case. Familiarity with the testing material may also have led participants to exercise the option of rereading relevant sections of the handout.
The previous discussion illuminates one of the limitations of our study. In formulating our hypothesis, we postulated that written pamphlets were inferior to computerized presentations in that the former were likely to be left unread in daily practice. In contrast, we assumed that information conveyed via a colorful audiovisual medium would more likely hold the attention of an adolescent audience. This perceived shortcoming of the handout was effectively cancelled by the fact that participants in our study enrolled with the understanding that they would not only read the material but also be tested on its content. Therefore, the study design and process of obtaining informed consent may have influenced our results.
On the other hand, a benefit inherent to written handouts is that they can be brought home and read at a patient's leisure. Audiovisual materials are less portable (although this distinction is fast losing its significance as access to the Internet and home computing expands). Yet, the design of our study required that participants relinquish their written handouts before receiving the postintervention questionnaire. Therefore, one theoretical advantage of the written handout, its portability and the opportunity for a patient to reread it after discharge from the clinic, was negated by the artificial restrictions imposed by our study.
While we recognize the described limitations, we do not believe they negate the important findings of our study. If applied to clinical practice, it is likely that an audiovisual aid would serve to augment, rather than replace, written pamphlets. Our study did not contain a third arm in which participants received both a written handout and an audiovisual presentation, but it is reasonable to assume that the combination would be comparable to, if not more effective than, either intervention alone.
The possibility that our study participants paid special attention to the materials because they knew they would be tested on the content can likewise be viewed as either a limitation of the study or a springboard for further research. Other evaluations of audiovisual aids in patient education have successfully incorporated patient testing/feedback into their educational strategies.28,29 Testing may serve a dual purpose: that of increasing the attention paid to educational media and that of alerting health care providers and/or patients about potential gaps in knowledge.
Future studies could help to elucidate the effects of an interactive multimedia presentation on patients' perceptions of their clinic visits. Our study did not evaluate this aspect of the educational interventions, but anecdotally it was noted during the coin toss that many patients wanted to be randomized into the audiovisual group, despite the greater time commitment this would entail. Future research could examine the impact of an audiovisual presentation on patient satisfaction with the education offered and the office visit in general.
Our study sample was not large enough to ascertain whether demographic differences, such as age or sex, might influence the effectiveness of various educational tools. For example, are girls more adept at computer training than boys? Or vice versa? Are young children more responsive to one educational medium while older children are responsive to another? Future studies with larger sample sizes might unearth such differences.
Future research could also evaluate whether enhanced patient education translates into improved compliance with acne medications. While previous research has suggested noncompliance to be the result of a patient not understanding the nature of acne or the mechanism and natural time course of acne therapies,13,17 to our knowledge, no studies have definitively shown this to be true. A recent study by So et al31 evaluated the effects of enhanced patient education on compliance with treatment for hypertrophic burn scars. Their intervention, which involved a 5-page printed pamphlet and a 26-minute videotape, resulted in significant improvement in medication compliance and better scar outcome compared with patients receiving only a 1-page pamphlet and in-visit counseling. Previously mentioned studies of children with diseases such as asthma and diabetes mellitus also have shown education to translate into behavior change. Future research could address whether this holds true in the case of adolescent patients with acne.
The findings from our study raise intriguing questions about patient education in general and the education of adolescents in particular. The improvement in knowledge scores achieved by most participants, including those who had previously seen a physician for their acne, is consistent with previous research in suggesting there is room for improvement in acne education. Future studies could provide additional clarification regarding the specific combination of educational interventions that may be most effective and feasible in the setting of an outpatient clinic. In addition, future research could evaluate the effect that increased knowledge about acne might have on an adolescent population in terms of self-confidence, compliance with skin care regimen, and, most notably, improved clinical outcomes.
Correspondence: Richard J. Antaya, MD, Department of Dermatology, Yale University School of Medicine, PO Box 208059, New Haven, CT 06520-8059 (email@example.com).
Accepted for Publication: July 30, 2007.
Author Contributions:Study concept and design: Koch, Ryder, and Antaya. Acquisition of data: Koch and Antaya. Analysis and interpretation of data: Koch, Dziura, Njike, and Antaya. Drafting of the manuscript: Koch, Ryder, and Antaya. Critical revision of the manuscript for important intellectual content: Koch, Dziura, Njike, and Antaya. Statistical analysis: Dziura and Njike. Obtained funding: Antaya. Study supervision: Antaya.
Financial Disclosure: None reported.
Funding/Support: This study was supported in part by grant MO1-RR06022 from the Children's Clinical Research Center; the General Clinical Research Centers Program, Yale University Office of Student Research; the National Center for Research Resources, National Institutes of Health; and a clinical research fellowship from the Doris Duke Charitable Foundation.