Article
February 2002

A Comparison of a Lecture and Computer Program to Teach Fundamentals of the Draw-a-Person Test

Author Affiliations

From the Robert Wood Johnson Clinical Scholars Program, Seattle, Wash (Dr Carroll); and the Departments of Pediatrics, University of Washington, Seattle (Dr Carroll) and University of Pennsylvania, Philadelphia (Dr Schwartz).

Arch Pediatr Adolesc Med. 2002;156(2):137-140. doi:10.1001/archpedi.156.2.137
Abstract

Background  Although computer-assisted education has been used to augment education in many areas, there are few studies of programs designed to replace lectures in a medical curriculum.

Objective  To test whether a thoughtfully designed computer program can replace a standard lecture in a pediatrics curriculum while teaching the subject matter equally well.

Methods  A computer program was developed to teach the Draw-a-Person developmental test using the multimedia-authoring tool Director. One of us (A.E.C.) submitted the program to several objective evaluators during its creation and tested and modified it several times in response. Thirty-nine students taking the clinical pediatrics rotation were assigned by month to interact with the program or attend the lecture. All students then scored 3 drawings and assigned each a developmental age according to the Draw-a-Person test rules. Students assigned to the computer program also completed a questionnaire evaluating the program in several subjective areas. A t test for 2 samples assuming equal variance was used to analyze the test results.

Results  Students receiving the lecture (control group) scored the 3 drawings as 5.43 years (age range, 4.5-8 years), 9.08 years (age range, 7-12 years), and 3.5 years (age range, 2-5 years), respectively. Those using the computer program (study group) scored the 3 drawings as 5.91 years (age range, 5-7 years), 7.68 years (age range, 7-8 years), and 4.34 years (age range, 3-5 years), respectively. The correct answers for the ages were 6, 7.75, and 4.25 years, respectively. A t test for 2 samples assuming equal variance showed that students using the computer program performed better on all 3 drawings (P<.05, P<.02, and P<.002, respectively).

Conclusions  Students using the computer program were more accurate than students attending the lecture when scoring drawings and estimating a developmental age from them. These results support the conclusion that a thoughtfully designed computer program can replace a standard lecture in a pediatrics curriculum.

COMPUTERS have great potential for use in medical education. Because of fundamental shifts in health care, the medical school curriculum has moved from a primarily inpatient setting to an ambulatory one. As schools find it more difficult to gather students together regularly, they have had to change their methods of teaching. The increased number of clinical sites has also produced greater variation in education from one student to the next, emphasizing the need for a truly standardized central curriculum. Sending medical students out to varying settings has required that they receive instruction on a more individualized basis. Computer-assisted teaching may be one method of accomplishing this task without sacrificing educational standards. Before making the shift to computer-based education, however, educators must be satisfied that learning is not compromised.

While many articles in the literature discuss computer-assisted education, only a small proportion are actual evaluation trials.1 Even fewer evaluate computer programs designed to teach material and replace lectures in a pediatrics curriculum. We chose to study a standard lecture teaching the Draw-a-Person (DAP) developmental test. The DAP test calls for a child to draw a person and allows professionals to estimate the child's developmental age by examining the complexity of the drawing. The DAP test is easy to teach and implement and can be translated completely into a computer application. We hoped to show that the computer program could completely replace the standard lecture in the pediatrics curriculum without sacrificing learning.

Participants and methods

One of us (A.E.C.) created the computer application after planning and reviewing its content with the lecturer (M.W.S.) to maintain standardization between the lecture and the application. The lecture was regularly given one afternoon during the pediatrics clerkship rotation. It began with a clinical problem, such as "Is the child developmentally ready to go to school?" The DAP test was then introduced, the scoring system was explained with several examples, and several figures were scored. The half-hour lectures were not identical word for word, but their curriculum was well established. The computer program (Figure 1), written with the multimedia authoring tool Director 5.0 (Macromedia Inc, San Francisco, Calif; also available at: http://www.macromedia.com), included 4 sections: a discussion of the uses of the test, its strengths, its weaknesses, and the intent of the program itself; instructions on how to administer and score the test (Figure 2); a library of drawings for various age groups, with full explanations of how each drawing was scored (Figure 3); and a self-administered test, complete with answers and explanations (Figure 4). A final summary, invoked on exiting the program, reviewed the usefulness of the DAP test. Each section used an audio track linked with visual cues and text. The program was self-paced and allowed for as much review as each student believed was necessary. It was easy to use, could be run at any time, and could be completed in minutes. It ran on either a Windows- or Macintosh-based computer and required no other software. After field testing and modifications, the students were allowed to interact with it. Although students could have taken it home for use, we required those using it to do so in the hospital library.

Figure 1. 
View of the "Table of Contents" screen for the computer program of the Draw-a-Person test.

Figure 2. 
View of the instructions for scoring screen for the computer program of the Draw-a-Person test.

Figure 3. 
View of the library of drawings screen for the computer program of the Draw-a-Person test.

Figure 4. 
View of the self-assessment screen for the computer program of the Draw-a-Person test.

We made every effort to ensure that the content of the lecture and that of the program were equivalent. The author observed the lecture several times, and the lecturer reviewed every version of the computer program to evaluate its content.

On alternating months, over a total of 4 months, students enrolled in Pediatrics 200 (the third-year clerkship) attended the lecture (n = 24 students) or used the computer program for instruction (n = 15 students). By alternating student placement we hoped to account for any differences between the groups in educational level or composition. The DAP test was not taught in any other rotations, and none of the students, to our knowledge, had previously been exposed to it.

The lecturer made sure that the information covered in class was the same as that covered in the computer program. Those hearing the lecture did so as a group on the day it was normally scheduled. Those using the computer program had no lecture that day and were instructed to use the program alone, at any time they wished, over the course of 1 week.

The students in the computer program group were given a questionnaire to rate the educational value of the computer program and its ease of use on a scale of 1 (best) to 5 (worst). They were also asked to make comments and suggestions about the computer program.

As part of the final examination for the clerkship each month, the students were instructed to score 3 drawings and assign a developmental age to each. Both the lecturer and the author of the computer application knew the nature of the final examination, but neither the lecture nor the computer program taught to the test. The test drawings that were part of the final examination did not appear in either the lecture or the computer program. Each student's scores were averaged within their assigned group, and the means were compared with the correct answers. The results were analyzed by a t test for 2 samples assuming equal variance.
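The pooled-variance (equal-variance) t test named above is the standard two-sample comparison. As an illustrative sketch only, the following Python fragment shows how such a t statistic is computed; the function name and sample values are hypothetical, synthetic data rather than the study's actual scores.

```python
import math

def pooled_t_statistic(a, b):
    """t statistic for two independent samples, assuming equal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Pooled variance: combined sum of squared deviations over total df
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)
    # Standard error of the difference between the two means
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (ma - mb) / se

# Hypothetical estimated ages (years) for one drawing (correct answer, 6.0)
computer = [6.0, 5.5, 6.0, 6.5, 6.0, 5.5]  # synthetic "study group" values
lecture = [5.0, 4.5, 6.5, 5.5, 5.0, 6.0]   # synthetic "control group" values
t = pooled_t_statistic(computer, lecture)
```

The t statistic is then compared against the t distribution with n1 + n2 − 2 degrees of freedom to obtain the P values reported in the "Results" section.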

Results

The mean scores and ranges for each of the groups are given in Table 1. A t test for 2 samples assuming equal variances showed that computer-assisted students performed better on all 3 drawings (P<.05, P<.02, and P<.002, respectively).

Table 1. Score for Both Study Groups on the Draw-a-Person Developmental Test*

Subjective evaluations on a number scale allowed the students to rate how educational they found the computer program. The mean (SD) score (1 [best]-5 [worst]) was 1.89 (0.74). The students also rated how much they liked the computer program and how easy it was to use. The mean (SD) score (1 [best]-5 [worst]) was 1.43 (0.65).

Students also recorded on the same questionnaire how much time they spent running the computer program. No student felt the need to rerun the computer program. The computer program also took far less time (5-15 minutes) to run than the lecture (30 minutes).

Comment

This study showed that a thoughtfully designed and focused computer program could replace a standard lecture without a sacrifice in learning. Students using the computer program scored significantly better on a test of DAP skills than those who attended the standard lecture. The computer program was well received and enjoyed by those who reviewed it. For the teaching of this skill, the computer program was at least as effective as, if not more effective than, the standard lecture.

There are, of course, several limitations to this study. We examined a limited number of students, and they were all from one medical school class. We also did not collect extensive background information about them, but did make sure that their knowledge of the DAP test was similar. We also did not have students rate the lecture in the same way they did the computer program. There are also limitations to this type of study in general. Some have argued that a medium-medium study such as this is inherently problematic, as what is often being compared is content, not the delivery method.2,3 We went to great lengths to make sure that the content in both the lecture and the computer program was as consistent as possible to make the study one of media. We believe that such studies are still important, as many still are skeptical of the computer's ability to replace more traditional methods of education.

Computer-assisted education has been proposed as a viable alternative in education for many years.4 Programs were initially used to complement standard education, whether as a means of practicing skills learned elsewhere5 or as a means to review previously learned information.6 Computer programs have been studied to replace laboratory sessions,7 sometimes with a cost savings.8 Because of changes in health care and medical school curriculum, computer-assisted education is becoming more valuable than ever. Computer-assisted education can compensate for the increasingly different student clinical experiences by allowing information to be provided in a standardized fashion. Education can be formatted to meet individual student requirements, with students taking as much or as little time as needed to learn the concept presented. Individual computer-assisted education can also easily accommodate different student schedules.

Computers have been available for use in education for some time, but their application has not evolved as quickly as many had assumed it would. Some educators are skeptical of computer-assisted education replacing more traditional education.9 A great deal of medical education has been didactic, through lectures given rotation after rotation. Some educators believe that direct faculty involvement is necessary at every step to impart knowledge adequately. A lecturer can also influence students through personal contact as a role model or personal mentor. Many are afraid to abandon this time-honored system of medical teaching, especially to a method that is somewhat unsupervised. This study showed that a well-designed computer program could replace a lecture and teach the material just as well. The results showed that students using the program scored better in a test of their learned knowledge than the control group.

In the past, some uses of computers were not as well received as lectures10 and were found to be a cause of anxiety.11 Our findings indicated that our students accepted this computer program easily and found it easy to use.

Computers have the potential to transform education, yet they are being used nowhere near that potential.12 Future studies can investigate other computer programs, and other lectures, developing computer-assisted medical education into a fully incorporated and valued tool.

Accepted for publication October 11, 2001.

What This Study Adds

Computer-assisted education has been a possibility for some time. It has not enjoyed the widespread use that was initially expected. While many articles exist in the literature discussing computer-assisted education, a small proportion of them are actually trials of evaluation. Even fewer are evaluations of computer programs designed to teach material and replace lectures in a pediatrics curriculum.

This article presents a study showing that students learned more about the DAP developmental test from a thoughtfully designed and focused computer program than from a standard lecture. The computer program also offered more flexibility for scheduling and took less time than the lecture.

Corresponding author: Aaron E. Carroll, MD, Department of Pediatrics, Robert Wood Johnson Clinical Scholars Program, Suite H-220 Health Sciences Center, Box 357183, Seattle, WA 98195-7183 (e-mail: acarro@u.washington.edu).

References
1. Adler MD, Johnson KB. Quantifying the literature of computer-aided instruction in medical education. Acad Med. 2000;75:1025-1028.
2. Adler M. Computer-assisted instruction. Acad Emerg Med. 2000;7:1440-1441.
3. Friedman CP. The research we should be doing. Acad Med. 1994;69:455-457.
4. Murray TS, Cupples RW, Barber JH, Hannay DR, Scott DB. Computer-assisted learning in undergraduate medical teaching. Lancet. 1976;1:474-476.
5. Eisenberg H, Gordon S. Computer-assisted teaching program: a pulmonary patient management problem. Int J Clin Monit Comput. 1987;4:195-197.
6. Elves AW, Ahmed M, Abrams P. Computer-assisted learning; experience at the Bristol Urological Institute in the teaching of urology. Br J Urol. 1997;80(suppl 3):59-62.
7. Lamperti A, Sodicoff M. Computer-based neuroanatomy laboratory for medical students. Anat Rec. 1997;249:422-428.
8. Dewhurst DG, Hardcastle J, Hardcastle PT, Stuart E. Comparison of a computer simulation program and a traditional laboratory practical class for teaching the principles of intestinal absorption. Am J Physiol. 1994;267(pt 3):S95-S104.
9. Andrews PV, Schwarz J, Helme RD. Students can learn medicine with computers: evaluation of an interactive computer learning package in geriatric medicine. Med J Aust. 1992;157:693-695.
10. Richardson D. Student perceptions and learning outcomes of computer-assisted versus traditional instruction in physiology. Am J Physiol. 1997;273(pt 3):S55-S58.
11. Khadra MH, Guinea AI, Hill DA. The acceptance of computer assisted learning by medical students. Aust N Z J Surg. 1995;65:610-612.
12. Piemme TE. Computer-assisted learning and evaluation in medicine. JAMA. 1988;260:367-372.