1.
The Medical School Objectives Writing Group.  Learning objectives for medical student education—guidelines for medical schools.  Acad Med.1999;74:13-18.
2.
Maudsley RF. Content and context.  Acad Med.1999;74:143-146.
3.
Garrison P. Flying Without Wings. Blue Ridge Summit, Pa: TAB Books Inc; 1985:1-31, 102-106.
4.
Goodman W. The world of civil simulators.  Flight International Magazine.1978;18:435.
5.
Rolfe JM, Staples KJ. Flight Simulation. Cambridge, England: Cambridge University Press; 1986:232-249.
6.
Ressler EK, Armstrong JE, Forsythe GB. Military mission rehearsal. In: Tekian A, McGuire C, McGaghie WC, eds. Innovative Simulations for Assessing Professional Competence. Chicago, Ill: Dept of Medical Education, University of Illinois Medical Center; 1999:157-174.
7.
Keys B, Wolfe J. The role of management games and simulations in education and research.  J Manage.1990;16:307-336.
8.
Streufert S, Pogash R, Piasecki M. Simulation-based assessment of managerial competence: reliability and validity.  Person Psych.1988;41:537-557.
9.
Wachtel J. The future of nuclear power plant simulation in the United States. In: Walton DG, ed. Simulation for Nuclear Reactor Technology. Cambridge, England: Cambridge University Press; 1985:339-349.
10.
Office of Naval Research.  Visual Elements in Flight Simulation. Washington, DC: National Council of the National Academy of Science; January 1973.
11.
Dusterberry JC. Introduction to simulation systems.  Society of Photo-Optical Engineers.1975;59:141-142.
12.
Ericsson KA. The Road to Excellence. Mahwah, NJ: Lawrence Erlbaum Associates; 1996:1-50.
13.
Allard F, Starkes JL. Motor-skill experts in sports, dance and other domains. In: Ericsson KA, Smith J, eds. Toward a General Theory of Expertise. Cambridge, England: Cambridge University Press; 1991:126-152.
14.
Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance.  Psychol Rev.1993;100:363-406.
15.
Derossis AM, Fried GM, Abrahamowicz M.  et al.  Development of a model for training and evaluation of laparoscopic skills.  Am J Surg.1998;175:482-487.
16.
 The seven-step laparoscopic cholecystectomy surgical trainer. Available at: http://www.cine-med.com. Accessed July 13, 1999.
17.
 The Body Form trainer. Available at: http://www.limbsandthings.com/bodyform.htm. Accessed March 19, 1999.
18.
Melvin WS, Johnson JA, Ellison C. Laparoscopic skills enhancement.  Am J Surg.1996;172:377-379.
19.
Rosser JC, Rosser LE, Savalgi RS. Skill acquisition and assessment for laparoscopic surgery.  Arch Surg.1997;132:200-204.
20.
Windsor JA. Laparoscopic exploration of the common bile duct.  J R Coll Surg Edinb.1993;38:48-49.
21.
Derossis AM, Bothwell J, Sigman HH, Fried GM. The effect of practice on performance in a laparoscopic simulator.  Surg Endosc.1998;12:1117-1120.
22.
Gordon MS, Ewy GA, Felner JM.  et al.  Teaching bedside cardiologic examination skills using "Harvey," the cardiology patient simulator.  Med Clin North Am.1980;64:305-313.
23.
Gordon MS, Ewy GA, Felner JM.  et al.  A cardiology patient simulator for continuing education of family physicians.  J Fam Pract.1981;13:353-356.
24.
Ewy GA, Felner JM, Juul D.  et al.  Test of a cardiology patient simulator with students in fourth-year electives.  J Med Educ.1987;62:738-743.
25.
Woolliscroft JO, Calhoun JG, Tenhaken JD, Judge RD. Harvey: the impact of a cardiovascular teaching simulator on student skill acquisition.  Med Teach.1987;9:53-57.
26.
Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training: a nationwide survey.  Ann Intern Med.1993;119:47-54.
27.
Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees.  JAMA.1997;278:717-722. [published correction appears in JAMA. 1998;279:1444].
28.
St Clair EW, Oddone EZ, Waugh RA, Corey GR, Feussner JR. Assessing housestaff diagnostic skills using a cardiology patient simulator.  Ann Intern Med.1992;117:751-756.
29.
Oddone EZ, Waugh RA, Samsa G, Corey R, Feussner JR. Teaching cardiovascular examination skills.  Am J Med.1993;95:389-396.
30.
Jones JS, Hunt SJ, Carlson SA, Seamon JP. Assessing bedside cardiologic examination skills using "Harvey," a cardiology patient simulator.  Acad Emerg Med.1997;4:980-985.
31.
Takashina T, Shimizu M, Katayama H. A new cardiology patient simulator.  Cardiology.1997;88:408-413.
32.
Waugh RA, Mayer JW, Ewy GA.  et al.  Multimedia computer-assisted instruction in cardiology.  Arch Intern Med.1995;155:197-203.
33.
Petrusa ER, Issenberg SB, Mayer JW.  et al.  Multi-center implementation of a four-year multimedia computer curriculum in cardiology.  Acad Med.1999;74:123-129.
34.
Issenberg SB, McGaghie WC, Brown DD.  et al.  Development of multimedia computer-based measures of clinical skills in bedside cardiology. In: Proceedings of the 8th International Ottawa Conference on Medical Education and Assessment; July 12-15, 1998; Philadelphia, Pa.
35.
Issenberg SB, Petrusa ER, McGaghie WC.  et al.  Assessment of a computer-based system to teach bedside cardiology.  Acad Med.In press.
36.
Abrahamson S, Denson JS, Wolf RM. Effectiveness of a simulator in training anesthesiology residents.  J Med Educ.1969;44:515-519.
37.
Schwid H. A flight simulator for general anesthesia training.  Comput Biomed Res.1987;20:64-75.
38.
Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment.  Anesthesiology.1988;69:387-394.
39.
Kapur PA, Steadman RH. Patient simulator competency testing.  Anesth Analg.1998;86:1157-1159.
40.
Howard SK, Gaba DM, Fish KJ.  et al.  Anesthesia crisis resource management training.  Aviat Space Environ Med.1992;63:763-770.
41.
Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavioral ratings.  Anesthesiology.1998;89:8-18.
42.
Chopra V, Engbers FH, Geerts MJ, Filet JG, Bovill JG, Spierdijk J. The Leiden anesthesia simulator.  Br J Anaesth.1994;73:287-292.
43.
Schwid HA, O'Donnell D. The anesthesia simulator-recorder.  Anesthesiology.1990;72:191-197.
44.
Gaba DM, DeAnda A. The response of anesthesia trainees to simulated critical events.  Anesth Analg.1989;68:444-451.
45.
DeAnda A, Gaba DM. Unplanned incidents during comprehensive anesthesia simulation.  Anesth Analg.1990;71:77-82.
46.
Schwid HA, O'Donnell D. Anesthesiologists' management of simulated critical incidents.  Anesthesiology.1992;76:495-501.
47.
DeAnda A, Gaba DM. Role of experience in the response to simulated critical incidents.  Anesth Analg.1991;72:308-315.
48.
Lampotang S, Good ML, Westhorpe R, Hardcastle J, Carovano RG. Logistics of conducting a large number of individual sessions with a full-scale patient simulator at a scientific meeting.  J Clin Monit.1997;13:399-407.
49.
Gaba DM. Improving anesthesiologist's performance by simulating reality.  Anesthesiology.1992;76:491-494.
50.
Howell JD. The physician's role in a world of technology.  Acad Med.1999;74:244-247.
51.
Roldan CA, Shively BK, Crawford MH. Value of the cardiovascular physical examination for detecting valvular heart disease in asymptomatic subjects.  Am J Cardiol.1996;77:1327-1331.
52.
Danford DA, Nasir A, Gumbiner C. Cost assessment of the evaluation of heart murmurs in children.  Pediatrics.1993;91:365-368.
53.
Mangione S. The teaching of cardiac auscultation during internal medicine and family practice training.  Acad Med.1998;73(suppl 10):S10.
54.
Forrest F. High level simulators in medical education.  Hosp Med.1998;59:653-655.
55.
Medical School Objectives Writing Group, Association of American Medical Colleges.  Medical School Objectives Project: Medical Informatics Objectives. Available at: http://www.aamc.org/meded/msop/informat.htm. Accessed January 10, 1999.
56.
Norcini J. Computer-based testing will soon be a reality.  Perspectives.Summer 1999:3.
57.
Gregoratos G, Miller AB. Task Force 3: teaching.  J Am Coll Cardiol.1999;33:1120-1127.
Special Communication
September 1, 1999

Simulation Technology for Health Care Professional Skills Training and Assessment

Author Affiliations

Author Affiliations: Center for Research in Medical Education, University of Miami School of Medicine, Miami, Fla (Drs Issenberg and Mayer); Office of Medical Education, Northwestern University Medical School, Chicago, Ill (Dr McGaghie); University of Ottawa, Ottawa, Ontario (Dr Hart); Department of Medicine, Division of Cardiology, Emory University, Atlanta, Ga (Dr Felner); Duke University Medical Center, Durham, NC (Drs Petrusa and Waugh); Department of Medicine, Division of Cardiovascular Diseases, University of Iowa Hospital and Clinics, Iowa City (Dr Brown); Mayo Clinic, Jacksonville, Fla (Dr Safford); Division of Pediatric Cardiology, University of Florida College of Medicine, Gainesville (Dr Gessner); University of Mississippi Medical Center, Jackson (Dr Gordon); and Section of Cardiology, Arizona Health Sciences Center, Tucson (Dr Ewy).

JAMA. 1999;282(9):861-866. doi:10.1001/jama.282.9.861
Abstract

Changes in medical practice that limit instruction time and patient availability, the expanding options for diagnosis and management, and advances in technology are contributing to greater use of simulation technology in medical education. Four areas of high-technology simulations currently being used are laparoscopic techniques, which provide surgeons with an opportunity to enhance their motor skills without risk to patients; a cardiovascular disease simulator, which can be used to simulate cardiac conditions; multimedia computer systems, which include patient-centered, case-based programs that constitute a generalist curriculum in cardiology; and anesthesia simulators, which have controlled responses that vary according to numerous possible scenarios. Some benefits of simulation technology include improvements in certain surgical technical skills, in cardiovascular examination skills, and in acquisition and retention of knowledge compared with traditional lectures. These systems help to address the problem of poor skills training and proficiency and may provide a method for physicians to become self-directed lifelong learners.

To fulfill society's expectations that physicians "can and will attend equally to all aspects of health care," the Association of American Medical Colleges' Medical School Objectives Project Report I1 has stated that physicians must be altruistic, knowledgeable, skillful, and dutiful. Maudsley2 subsequently pointed out that prescribing such undergraduate outcomes is much easier than achieving them. He notes that approximately 25 major reports on medical education have been issued this century, most of which have "identified the same or similar problems, claimed that previous recommendations have gone relatively unheeded, argued that reform is essential and urgently needed, and prescribed strikingly similar corrections." Despite these comments, there have been significant changes in medical education over the past 2 decades, with technology a major factor in this change. Although its main impact on medicine and medical education appears to be in the area of information management, technology is playing an increasingly important role in skills training, which is related to the "skillful" attribute listed in the Medical School Objectives Project report.

Although the use of simulation technology, including multimedia computer programs, only now is gaining wider acceptance in medicine, such technology is well established in other disciplines. Examples include the use of flight simulators for pilots and astronauts,3-5 war games and training exercises for military personnel,6 management games for business executives,7,8 and technical operations for nuclear power plant personnel.9 Simulations are not identical to "real life" events. Instead, simulations place trainees in lifelike situations that provide immediate feedback about questions, decisions, and actions. Flight simulators closely approximate in-flight situations, and the airline industry has demonstrated that they improve pilot skills.10,11

Skills may be defined as "actions (and reactions) which an individual performs in a competent way in order to achieve a goal."12 One may have no skill, some skill, or complete mastery. Therefore, when teaching or testing a skill, the level of acceptable mastery must be defined based on the training level. In competitive fields such as athletics, music, and chess, results from tournaments can be analyzed to rank individuals' skills on an interval scale. According to Ericsson,12 the differences between individuals, from novice to champion, "are among the largest reproducible differences in performance observed for normal adults." He also suggests that similar large differences would be expected in other domains of expertise (including medicine), in which a long period of education followed by an apprenticeship is required. Medical practice also may have corollaries in sports, where there is evidence that elite athletes use their perceptual-motor expertise in various situations to predict what is coming next and select appropriate actions.13

The most important identifiable factor separating the elite performer from others is the amount of "deliberate practice." This includes practice undertaken over a long period of time to attain excellence as well as the amount of ongoing effort required to maintain it. Deliberate practice has been defined as the opportunity to tackle "a well-defined task with an appropriate difficulty level for the particular individual, informative feedback, and opportunities for repetition and corrections of errors."14

The problem in medical education is that the subjects necessary to "deliberately practice on" are human beings with all their diversity and variability. Changes in medical practice that have reduced physician teaching time and decreased the availability of patients as educational resources and the rapidly increasing options for disease diagnosis and management have created a need for new methods of instruction, knowledge acquisition, and assessment. Advances in high-fidelity simulations of human conditions constitute such new methods and could add immeasurably to our teaching armamentarium.

Unlike patients, simulators do not become embarrassed or stressed; have predictable behavior; are available at any time to fit curriculum needs; can be programmed to simulate selected findings, conditions, situations, and complications; allow standardized experience for all trainees; can be used repeatedly with fidelity and reproducibility; and can be used to train both for procedures and difficult management situations.

This article focuses on 4 areas in which high technology simulations are being used to teach and test a variety of skills in medicine and discusses the evidence of their effectiveness. The advantages of such simulations are outlined for both the initial training of important skills and as a potential means for lifelong development and reinforcement of these skills.

LAPAROSCOPIC SIMULATORS

Laparoscopic surgery is technically different from traditional open surgery, and many practicing surgeons did not learn the skills required for laparoscopic procedures during residency training. These skills include proficiency in ambidextrous maneuvers with new instruments and enhanced hand-eye coordination and depth perception.15 Several methods have been developed to assist physicians in acquiring laparoscopic skills. Early examples include training with other certified laparoscopic surgeons and courses involving practice with animal models. Laparoscopic surgical simulators provide an additional method for surgeons to enhance their motor skills in a standardized, controlled environment without risk to patients.

Examples of laparoscopic simulators include the Laparoscopic Surgical Trainer,16 Body Form laparoscopic trainer,17 and McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS) program.15 The latter consists of a box covered by an opaque membrane to simulate skin, 2 trocars, a laparoscope, and a video monitor placed in line with the operator. Several exercises have been developed, each emphasizing a specific skill essential for proper laparoscopic technique, including assessment of hand-eye coordination; cutting technique; placement of surgical clips, ligating loops, and mesh materials; and needle transfer, suture placement, and knot-tying skills.

Several reports have described the development and use of training exercises that simulate skills involved in laparoscopic surgery, including knot tying and suturing18,19 and the use of saphenous veins to simulate common bile duct exploration.20 Recent studies with newer simulators have included several skills for training and assessment of speed and precision. Derossis and colleagues15 evaluated surgical residents and practicing surgeons and found that level of training and frequency of skill repetition were significant predictors of overall skill proficiency on the simulator. In a follow-up study by the same group,21 surgical residents trained with a simulator showed significantly (P<.05) greater improvement on the simulator compared with controls on 4 of the tasks and for overall scores.

Surgical techniques that require repeated practice to master specific skills lend themselves to simulator use. Surgical program directors and those responsible for continuing medical education should consider use of simulator training along with current didactic sessions before allowing trainees to perform these invasive techniques on patients. The traditional reliance on an apprenticeship may no longer be the only method to remain technically proficient in an era in which the techniques and the instruments of surgery are continually changing.

HARVEY, THE CARDIOLOGY PATIENT SIMULATOR

The Harvey simulator is a life-sized mannequin that provides a comprehensive cardiology curriculum by realistically simulating 27 cardiac conditions. The physical findings programmed in the simulator for each disease include blood pressure; bilateral jugular venous, carotid, and peripheral arterial pulses; precordial impulses in 6 different areas; and auscultatory events in the 4 classic auscultatory areas that are synchronized with the pulses and vary with respiration.

A curriculum of cardiovascular diseases with learning goals, a teaching manual, test instruments, and self-assessment slide programs for each condition has been developed by a national consortium of physicians and educators.22 The slide programs include all the elements normally available with a patient, including history, blood chemistry results, and noninvasive and invasive laboratory data. In addition, appropriate medical and surgical therapy is presented, along with a summary of the pathology and epidemiology of each disease.

The Harvey simulator can be used in the education of a variety of trainees. The simulator may be used to teach the beginning student such basic techniques as taking blood pressure and recognizing a heart murmur. For the senior medical student, Harvey correlates heart sounds with respiration, provides normal and abnormal carotid and jugular venous pulsations, and correlates physical findings with historical and laboratory data. The simulator also has potential applications for the postgraduate training of primary care physicians. For example, at 2 annual conventions of the American Academy of Family Physicians, more than 1500 physicians participated in eight 4-hour teaching sessions that emphasized the importance of bedside diagnostic skills in evaluating patients with suspected cardiovascular disease.23 Despite the large audiences, each physician was able to participate in the evaluation of findings presented on Harvey through the use of individual stethophones for auscultation and closed circuit monitors for visualization of pulses. Responses from a course evaluation indicated that participants were nearly unanimous (1280 of 1333) in thinking that the simulator was a valuable teaching tool with which they would like to have further experience.

Harvey has been rigorously tested to establish its educational efficacy. In a multicenter study involving 208 senior medical students at 5 medical schools,24 fourth-year medical students who used the cardiology patient simulator (CPS) during their cardiology elective performed significantly better than the non-CPS–trained group, who learned in the traditional manner from patients. This was true not only on the CPS skills posttest (P<.001), but also on the patient skills posttest (P<.03). In addition, there was no statistically significant difference in the way patients perceived the professional behavior of CPS-trained and non-CPS–trained students. This finding addresses the concern that simulators may negatively affect physician behavior. In another study involving 203 second-year medical students at the University of Michigan, incorporation of Harvey into a required physical skills course significantly improved overall cardiac examination skills as measured by pretests and posttests of unknown findings on the simulator (P<.001).25 An additional observation was that the use of Harvey reduced the time faculty and students would have spent locating enough patients to examine a wide variety of cardiac problems.25

These data suggest that simulation technology is a reasonable addition to the medical curriculum, skills learned on a simulator are transferable to patients, and student behavior toward patients is not adversely affected by exposure to a simulator. Combining simulation technology with traditional patient-centered teaching has the potential to prepare medical students to provide medical care with increased confidence at the bedside. This is especially important at a time when the cardiac bedside examination reportedly is being taught less frequently and less effectively26 and some medical students and residents have difficulty identifying even common cardiac auscultatory findings.27

The CPS may also be used in the place of patients for testing bedside cardiovascular examination skills. In contradistinction to testing skills using patients, the CPS provides complete control over the task selected and its complexity. Patient findings can be presented uniformly and the process of skills testing can then be standardized. A multicenter study involving senior medical students called for testing individuals on a full range of 200 CPS simulations.24 Standardization allowed improved sampling of student performance so that any area (eg, vital signs, nonauscultatory events, and auscultation) could be assessed. Other educators have used Harvey to assess cardiovascular bedside skills of residents.28-30 Recently a more portable cardiology patient simulator, Simulator K (M64 New Cardiology Patient Simulator AVP Training System, Kyoto Kagaku Co, Kyoto, Japan), was developed by Japanese cardiologists who recognize the potential use of simulation to facilitate bedside skills testing.31

UMEDIC MULTIMEDIA COMPUTER SYSTEM

The UMedic multimedia computer system has been in development for the last 14 years and contains multimedia features that include computer and video graphics and real-time digitized video and audio.32 Ten patient-centered, case-based programs constitute a comprehensive generalist curriculum in cardiology. The program structure includes history; bedside findings; diagnosis; laboratory data, including electrocardiograms, radiographs, real-time echocardiograms and angiograms; treatment, including videos of surgery; and summary discussions. Learners can choose to study all of the above sections or only the bedside evaluation, which does not include the laboratory data and treatment sections. The system can be used alone or linked to Harvey and also can be used by an instructor in an auditorium setting.

Throughout each patient-centered, self-learning program, a physician instructor provides demonstrations of bedside findings using Harvey, narrative explanations, and feedback on important points in video segments. Multiple-choice questions are presented during the program to focus on key teaching points, to encourage problem solving, and to enhance interactive learning. An administrative program tracks 27 categories of learners and records their performance and time spent completing the tests.

A recent multicenter study demonstrated that this system can be integrated into a 4-year medical school curriculum.33 A total of 1586 students at 6 medical schools completed 6131 programs and favorably rated the educational value of the system compared with other learning materials. The study resulted in a recommended 4-year curriculum plan for the UMedic system. Valid pretests and posttests were created to measure outcomes in bedside skills34 and were used in an additional multicenter cohort study involving senior medical students (n=208) at 5 institutions that compared the UMedic system with traditional methods for teaching bedside skills in cardiology.35 In the intervention group, UMedic modules replaced instruction in bedside skills that occurred during teaching rounds and individual patient workups. There was a statistically significant improvement in the pretest to posttest scores of the UMedic trained students compared with students who had not used the UMedic modules (P<.001).

ANESTHESIA SIMULATOR

Simulators have been used in anesthesiology training for 30 years, beginning with the SIM 1 system.36 Advances in technology over the next 20 years led to the development of more sophisticated simulation systems, including a computer-only program that provided a highly sophisticated model of human physiology and pharmacology to determine precise responses to drugs and interventions37 and a hands-on simulator that realistically re-created cognitive tasks of anesthetic administration, patient monitoring, and intervention.38

Current anesthesia simulators such as the Eagle (CEA Electronics Inc, Binghamton, NY) and METI (Medical Education Technologies, Sarasota, Fla) use whole-body mannequins that provide more than 40 realistic and sophisticated findings in 7 anatomic areas. In addition, the simulation systems usually include actual hemodynamic monitoring systems, anesthesia machines, and supplies typically used in an operating room. Computer programs control the responses of the simulator, which vary according to 50 possible preprogrammed scenarios. The simulator may be housed in a clinical skills laboratory or, more commonly, placed in an actual or mock-up operating suite. It can be controlled by an experienced trainer from the bedside or from a remote control booth. While these simulators were primarily developed by anesthesiologists to be used for training and testing those in the field, application of these devices has broadened quickly. Simulators have been used to train and assess medical students during anesthesiology and surgery clerkships; residents in surgery, radiology, obstetrics, and emergency medicine; critical care nurses; and respiratory technicians.39

An additional feature of these simulators is the ability to program scenarios that allow individuals or teams to train for unforeseen or catastrophic events, eg, cardiac arrest, traumatic injury, or surgical procedures that require the cooperation of individuals with varying expertise and training such as surgeons, anesthesiologists, and critical care nurses. One method that has been used to train and assess multiprofessional performance is called Anesthesia Crisis Resource Management.40 This system can assess technical performance, such as placement of instruments or administration of medications, as well as behavioral performance, ie, the appropriate use of sound crisis management behaviors including leadership, communication, and distribution of workload to other members of the team.41 The training sessions can be videotaped and reviewed so that items requiring remediation can be addressed and corrected.

Reports of anesthesia simulators initially focused on development and user acceptance. However, the high cost and requirements for accompanying equipment, space, and personnel have resulted in research to justify the installation of such devices. Reports on the use of simulators by residents and practicing anesthesiologists demonstrate that the simulators are judged to be highly realistic,38,42,43 increased training is associated with an increased correction of problems,44 and unplanned incidents during simulation were usually a result of human error.45,46 Other studies of anesthesiology residents and practicing anesthesiologists showed that training on a simulator can improve the acquisition and retention of knowledge in comparison with traditional lectures.42,47 Although an advanced level of training was associated with fewer unplanned errors and management flaws, mistakes still occurred at a surprising rate.42,45,47

The METI simulator also has been used at an annual scientific meeting for continuing medical education of anesthesiologists.48 If these simulators are accepted as a valid measure for performance evaluation, they may become an important component of certifying competence.41 The advantage of these systems compared with the current method for testing and certification, including written and oral examinations, is that they allow the examinee to demonstrate clinical skills in a controlled clinical environment while still exhibiting cognitive and language skills.41 As Gaba49 has pointed out, "no industry in which human lives depend on the skilled performance of responsible operators has waited for unequivocal proof of the benefit of simulation before embracing it."

COMMENT

There is a long-standing debate regarding the potentially dehumanizing effects of using technology in the practice and teaching of medicine. As Howell50 cogently points out, such concerns are not new, but have been voiced since the turn of the last century. He questions whether worrying about machines detracting from the caring aspects of medicine and technologies encroaching on physicians' abilities to use judgment is justifiable. Simulation training avoids using patients for skills practice and ensures that trainees have had some practice before treating humans. If simulators enhance the clinician's diagnostic skills, then these devices have the potential to reduce the number of diagnostic tests (some invasive) a physician orders for a given patient. Recent studies51,52 support the view that "a well-performed cardiovascular examination, using physical maneuvers and bedside aids . . . remains the most cost-effective tool for the initial evaluation of these patients."53

Several practical questions should be addressed when assessing the value of simulators. These include the cost of purchasing the simulator, the amount of use once purchased, the time saved by teachers, the cost of maintenance, the frequency and cost of technical and software upgrading, and the cost of training faculty to use the system and integrate the simulator into the curriculum. Costs of simulation systems range from less than $5000 for most laparoscopic simulators16 to $75,000 for Harvey (D. A. Lawson, written communication, June 25, 1999), to well over $100,000 (range, $125,000 to $200,000 with price depending on the number of features and accessories) for highly sophisticated anesthesia simulators.54 Thus, it is important that the perceived benefits of this type of training be evaluated and proven if resources are to be allocated to purchase these systems.
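As a rough illustration of how these purchase prices compare on an annual basis, the following sketch amortizes each system over an assumed service life. Only the purchase prices come from the figures above; the 10-year straight-line amortization is a hypothetical assumption for illustration, not a figure from the article.

```python
# Illustrative per-year capital cost of the simulator classes cited above.
# Prices are from the article; the service life is an assumption only.

SERVICE_LIFE_YEARS = 10  # assumed, not stated in the article

simulators = {
    "laparoscopic trainer": 5_000,          # "less than $5000"
    "Harvey (CPS)": 75_000,
    "anesthesia simulator (low end)": 125_000,
    "anesthesia simulator (high end)": 200_000,
}

for name, price in simulators.items():
    per_year = price / SERVICE_LIFE_YEARS
    print(f"{name}: ${per_year:,.0f} per year")
```

Under these assumptions, Harvey amortizes to $7,500 per year, roughly half to a third of the annualized cost of a full anesthesia simulator.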

A recent survey of 5 medical schools that use Harvey in their curriculum and a review of their technical records resulted in the following data for the 1998-1999 academic year (J. W. Mayer, MD, University of Miami School of Medicine, Miami, Fla, written communication, June 18, 1999). The average class size was 150 students. Harvey was used during the second, third, and fourth years: in the second and third years to teach the entire class and during the senior year in a cardiology elective (20%-80% of the class). When all 3 class years were in session, the simulator was used an average of 22 hours per week to teach medical students. The majority of training (17 h/wk) was carried out in small groups in a self-learning mode while the remaining time (5 h/wk) involved instructors teaching with Harvey. At other times nurses, physician assistants, residents, and fellows used the device. The simulator was also used at several postgraduate conferences, which reflected its potential role as a component of lifelong professional development.
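The weekly usage figures above can be checked and extended with simple arithmetic. In the sketch below, only the weekly hours (17 self-learning plus 5 instructor-led, averaging 22 total) come from the survey; the 30-week teaching year used to project an annual total is an illustrative assumption.

```python
# Breakdown of weekly Harvey usage from the five-school survey, plus an
# illustrative annual projection. The weekly hours are from the article;
# the length of the teaching year is an assumption for illustration.

self_learning_hours_per_week = 17   # small-group self-learning mode
instructor_led_hours_per_week = 5   # instructors teaching with Harvey

total_per_week = self_learning_hours_per_week + instructor_led_hours_per_week
assert total_per_week == 22  # matches the reported weekly average

TEACHING_WEEKS_PER_YEAR = 30  # assumed, not stated in the survey
annual_hours = total_per_week * TEACHING_WEEKS_PER_YEAR
print(f"~{annual_hours} student-contact hours per year")
```

Even under a conservative teaching-year assumption, the device accumulates several hundred student-contact hours annually before counting the additional use by nurses, physician assistants, residents, and fellows noted in the survey.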

Most of the hands-on, self-learning hours on the simulator represent what in the past was possible only through teacher-student time at the bedside and can be interpreted as time saved by the instructor. For both faculty teaching and self-learning, time also is saved by not having to identify patients with a variety of bedside findings and locate appropriate examples of diagnostic studies (eg, electrocardiograms) that are presented in the software programs.

The only maintenance is cleaning the simulator's skin with alcohol every 1 to 2 weeks. The only "required" upgrades have been updates of the teaching slide programs every 3 to 4 years (at approximately the cost of a textbook, $200) so that the CPS curriculum stays current with diagnostic and therapeutic advances. Schools that obtained the simulator when tape cassettes generated the heart sounds had an optional digital-audio upgrade 5 years ago at less than 10% of the purchase price. There was no cost for training faculty other than 1 hour of orientation, and the average time spent to integrate the simulator into the curriculum was 3 hours. These data were obtained from medical schools that have active Harvey programs and may not reflect usage at all institutions that purchased the simulator.

As with many new technologies, the high price of simulators reflects the cost of initial research and development. Advances in technology should enable improvements and refinements to these devices, and as their use continues to increase, the price for many of these systems should decline. For example, according to its technical engineer (D. A. Lawson, oral communication, July 6, 1999), advances in digital sound technology and solid-state circuitry have reduced the current cost of Harvey to approximately one half of what it was more than a decade ago.

The key element in the successful use of simulators is that they become integrated throughout the entire curriculum so that deliberate practice to acquire expertise over time is possible. By analogy, the number of years someone plays a sport or practices a profession bears limited relation to how well they perform. What does correlate with quality of performance is the amount of ongoing deliberate practice that includes "informative feedback and opportunities for repetition and correction of errors."12

Unfortunately, most medical students and practitioners have little regular access to professional feedback with opportunities for repetition and correction of errors. The regular use of simulators incorporated into structured continuing medical education programs as well as in self-assessment and self-directed remediation programs offers great promise for lifelong professional development. Several organizations have recognized the role of simulation technology in continuing education and recently have implemented guidelines or programs to foster its development. For example, the Association of American Medical Colleges' Medical School Objectives Project, in its Medical Informatics Objectives, states ". . . the successful medical school graduate should be able to . . . effectively utilize various computer-based instructional (and self-assessment) tools, including electronic tutorials and patient simulations."55 The American Board of Internal Medicine shares this view, as reflected in its decision to form a Physical Examination Self-Evaluation Process committee.56 The committee is now developing a multimedia computer-based, self-assessment program focusing on physical examination and physical diagnosis skills.56 The American College of Cardiology Task Force on Teaching endorses innovations in teaching methods and evaluation techniques, including the use of Harvey and interactive computer software.57

New technology and the changing medical education environment are likely to ensure that the use of simulators will continue to increase. Simulation techniques are moving rapidly from the game and military fields into medical education, skills training, and the daily practice of medicine. The task for medical educators will be to embrace and harness this potential and use it to enhance the self-directed acquisition of skills throughout the lifelong medical education continuum.

References
1.
The Medical School Objectives Writing Group.  Learning objectives for medical student education—guidelines for medical schools.  Acad Med.1999;74:13-18.
2.
Maudsley RF. Content and context.  Acad Med.1999;74:143-146.
3.
Garrison P. Flying Without Wings. Blue Ridge Summit, Pa: TAB Books Inc; 1985:1-31, 102-106.
4.
Goodman W. The world of civil simulators.  Flight International Magazine.1978;18:435.
5.
Rolfe JM, Staples KJ. Flight Simulation. Cambridge, England: Cambridge University Press; 1986:232-249.
6.
Ressler EK, Armstrong JE, Forsythe GB. Military mission rehearsal. In: Tekian A, McGuire C, McGaghie WC, eds. Innovative Simulations for Assessing Professional Competence. Chicago, Ill: Dept of Medical Education, University of Illinois Medical Center; 1999:157-174.
7.
Keys B, Wolfe J. The role of management games and simulations in education and research.  J Manage.1990;16:307-336.
8.
Streufert S, Pogash R, Piasecki M. Simulation-based assessment of managerial competence: reliability and validity.  Person Psych.1988;41:537-557.
9.
Wachtel J. The future of nuclear power plant simulation in the United States. In: Walton DG, ed. Simulation for Nuclear Reactor Technology. Cambridge, England: Cambridge University Press; 1985:339-349.
10.
Office of Naval Research.  Visual Elements in Flight Simulation. Washington, DC: National Council of the National Academy of Science; January 1973.
11.
Dusterberry JC. Introduction to simulation systems.  Society of Photo-Optical Engineers.1975;59:141-142.
12.
Ericsson KA. The Road to Excellence. Mahwah, NJ: Lawrence Erlbaum Associates; 1996:1-50.
13.
Allard F, Starkes JL. Motor-skill experts in sports, dance and other domains. In: Ericsson KA, Smith J, eds. Toward a General Theory of Expertise. Cambridge, England: Cambridge University Press; 1991:126-152.
14.
Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance.  Psychol Rev.1993;100:363-406.
15.
Derossis AM, Fried GM, Abrahamowicz M.  et al.  Development of a model for training and evaluation of laparoscopic skills.  Am J Surg.1998;175:482-487.
16.
 The seven-step laparoscopic cholecystectomy surgical trainer. Available at: http://www.cine-med.com. Accessed July 13, 1999.
17.
 The Body Form trainer. Available at: http://www.limbsandthings.com/bodyform.htm. Accessed March 19, 1999.
18.
Melvin WS, Johnson JA, Ellison C. Laparoscopic skills enhancement.  Am J Surg.1996;172:377-379.
19.
Rosser JC, Rosser LE, Savalgi RS. Skill acquisition and assessment for laparoscopic surgery.  Arch Surg.1997;132:200-204.
20.
Windsor JA. Laparoscopic exploration of the common bile duct.  J R Coll Surg Edinb.1993;38:48-49.
21.
Derossis AM, Bothwell J, Sigman HH, Fried GM. The effect of practice on performance in a laparoscopic simulator.  Surg Endosc.1998;12:1117-1120.
22.
Gordon MS, Ewy GA, Felner JM.  et al.  Teaching bedside cardiologic examination skills using "Harvey," the cardiology patient simulator.  Med Clin North Am.1980;64:305-313.
23.
Gordon MS, Ewy GA, Felner JM.  et al.  A cardiology patient simulator for continuing education of family physicians.  J Fam Pract.1981;13:353-356.
24.
Ewy GA, Felner JM, Juul D.  et al.  Test of a cardiology patient simulator with students in fourth-year electives.  J Med Educ.1987;62:738-743.
25.
Woolliscroft JO, Calhoun JG, Tenhaken JD, Judge RD. Harvey: the impact of a cardiovascular teaching simulator on student skill acquisition.  Med Teach.1987;9:53-57.
26.
Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training: a nationwide survey.  Ann Intern Med.1993;119:47-54.
27.
Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees.  JAMA.1997;278:717-722. [published correction appears in JAMA. 1998;279:1444].
28.
St Clair EW, Oddone EZ, Waugh RA, Corey GR, Feussner JR. Assessing housestaff diagnostic skills using a cardiology patient simulator.  Ann Intern Med.1992;117:751-756.
29.
Oddone EZ, Waugh RA, Samsa G, Corey R, Feussner JR. Teaching cardiovascular examination skills.  Am J Med.1993;95:389-396.
30.
Jones JS, Hunt SJ, Carlson SA, Seamon JP. Assessing bedside cardiologic examination skills using "Harvey," a cardiology patient simulator.  Acad Emerg Med.1997;4:980-985.
31.
Takashina T, Shimizu M, Katayama H. A new cardiology patient simulator.  Cardiology.1997;88:408-413.
32.
Waugh RA, Mayer JW, Ewy GA.  et al.  Multimedia computer-assisted instruction in cardiology.  Arch Intern Med.1995;155:197-203.
33.
Petrusa ER, Issenberg SB, Mayer JW.  et al.  Multi-center implementation of a four-year multimedia computer curriculum in cardiology.  Acad Med.1999;74:123-129.
34.
Issenberg SB, McGaghie WC, Brown DD.  et al.  Development of multimedia computer-based measures of clinical skills in bedside cardiology. In: Proceedings of the 8th International Ottawa Conference on Medical Education and Assessment; July 12-15, 1998; Philadelphia, Pa.
35.
Issenberg SB, Petrusa ER, McGaghie WC.  et al.  Assessment of a computer-based system to teach bedside cardiology.  Acad Med.In press.
36.
Abrahamson S, Denson JS, Wolf RM. Effectiveness of a simulator in training anesthesiology residents.  J Med Educ.1969;44:515-519.
37.
Schwid H. A flight simulator for general anesthesia training.  Comput Biomed Res.1987;20:64-75.
38.
Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment.  Anesthesiology.1988;69:387-394.
39.
Kapur PA, Steadman RH. Patient simulator competency testing.  Anesth Analg.1998;86:1157-1159.
40.
Howard SK, Gaba DM, Fish KJ.  et al.  Anesthesia crisis resource management training.  Aviat Space Environ Med.1992;63:763-770.
41.
Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavorial ratings.  Anesthesiology.1998;89:8-18.
42.
Chopra V, Engbers FH, Geerts MJ, Filet JG, Bovill JG, Spierdijk J. The Leiden anesthesia simulator.  Br J Anaesth.1994;73:287-292.
43.
Schwid HA, O'Donnell D. The anesthesia simulator-recorder.  Anesthesiology.1990;72:191-197.
44.
Gaba DM, DeAnda A. The response of anesthesia trainees to simulated critical events.  Anesth Analg.1989;68:444-451.
45.
DeAnda A, Gaba DM. Unplanned incidents during comprehensive anesthesia simulation.  Anesth Analg.1990;71:77-82.
46.
Schwid HA, O'Donnell D. Anesthesiologists' management of simulated critical incidents.  Anesthesiology.1992;76:495-501.
47.
DeAnda A, Gaba DM. Role of experience in the response to simulated critical incidents.  Anesth Analg.1991;72:308-315.
48.
Lampotang S, Good ML, Westhorpe R, Hardcastle J, Carovano RG. Logistics of conducting a large number of individual sessions with a full-scale patient simulator at a scientific meeting.  J Clin Monit.1997;13:399-407.
49.
Gaba DM. Improving anesthesiologist's performance by simulating reality.  Anesthesiology.1992;76:491-494.
50.
Howell JD. The physician's role in a world of technology.  Acad Med.1999;74:244-247.
51.
Roldan CA, Shively BK, Crawford MH. Value of the cardiovascular physical examination for detecting valvular heart disease in asymptomatic subjects.  Am J Cardiol.1996;77:1327-1331.
52.
Danford DA, Nasir A, Gumbiner C. Cost assessment of the evaluation of heart murmurs in children.  Pediatrics.1993;91:365-368.
53.
Mangione S. The teaching of cardiac auscultation during internal medicine and family practice training.  Acad Med.1998;73(suppl 10):S10.
54.
Forrest F. High level simulators in medical education.  Hosp Med.1998;59:653-655.
55.
Medical School Objectives Writing Group, Association of American Medical Colleges.  Medical School Objectives Project: Medical Informatics Objectives. Available at: http://www.aamc.org/meded/msop/informat.htm. Accessed January 10, 1999.
56.
Norcini J. Computer-based testing will soon be a reality.  Perspectives.Summer 1999:3.
57.
Gregoratos G, Miller AB. Task Force 3: teaching.  J Am Coll Cardiol.1999;33:1120-1127.