Figure 1. Screen capture of home screen (©1998, West Portal Software Corp, San Francisco, Calif; reprinted with permission).

Figure 2. Screen capture with "Bump" selected (©1998, West Portal Software Corp, San Francisco, Calif; reprinted with permission).

Table 1. Study Design*

Table 2. Number of Lesions Triaged Correctly and Incorrectly With and Without Using the Decision Support Software
Study
February 2000

Decision Support Software to Help Primary Care Physicians Triage Skin Cancer: A Pilot Study

Author Affiliations

From the Division of Behavioral Sciences, School of Dentistry (Drs Gerbert and Bronstone), and Department of Dermatology, School of Medicine (Drs Maurer and Berger), University of California, San Francisco; and West Portal Software Corp, San Francisco (Mr Hofmann).

Arch Dermatol. 2000;136(2):187-192. doi:10.1001/archderm.136.2.187
Abstract

Objective  To determine whether decision support software can help primary care physicians proficiently triage lesions suggestive of basal cell and squamous cell carcinoma.

Design/Measures  Physicians selected triage options for 15 digitized images of skin lesions, with and without use of the decision support software.

Participants/Settings  Twenty primary care physicians practicing in a health maintenance organization or a city health clinic.

Intervention  Decision support software designed to help physicians arrive at a triage recommendation consisted of a clinical information form, a decision tree, and support features (teaching points, example images, and diagrams).

Results  Without using the decision support software, physicians chose the wrong triage decision 36.7% of the time; using the decision support software, they chose the wrong response only 13.3% of the time. Not using the decision support software, they failed to correctly perform a biopsy on or refer patients with cancerous lesions 22.1% of the time; using the software, they failed to correctly perform a biopsy on or refer patients with cancerous lesions only 3.6% of the time. Physicians scored an average of 3 points (of a possible 15 points) higher when they used the software (signed rank, 101.0; P<.001). They scored an average of 1 point higher on the 7 cancerous lesions when they used the software (signed rank, 65.5; P<.001).

Conclusions  Use of decision support software could improve primary care physicians' triage decisions for lesions suggestive of nonmelanoma skin cancer, and potentially reduce morbidity and health care costs. We are designing a larger study to evaluate the accuracy and utility of the software with patients seen in clinical practice.

In managed care organizations, primary care physicians are largely responsible for triaging dermatologic problems, including detecting and triaging skin cancer at an early stage. Ample data1-4 suggest that primary care physicians perform inadequately at diagnosing skin lesions, including cancerous ones. Interventions to improve primary care physicians' skills in this area have produced mixed results; one multicomponent intervention5 significantly improved primary care physicians' ability to correctly diagnose and provide evaluation plans for skin lesions, whereas several other interventions6,7 have failed to improve their skills. Even with exposure to an effective educational intervention, primary care physicians are likely to need ongoing support over time to maintain or enhance their learning.

Triaging skin lesions is a complex and difficult task. To make a clinical decision, primary care physicians must attend to a variety of important features of the skin lesion (including size, shape, color, texture, location, and growth pattern) and patient history (including age, skin type, hair color, family history of skin cancer, and sun exposure),8,9 sort these data, perform probability calculations, and follow decision-making rules to come to a triage decision. Computer programs with well-designed graphical user interfaces could more effectively facilitate clinical decision making and learning than traditional diagnostic aids (eg, flowcharts) by enabling the user to move through multiple complex decision trees. We conducted a pilot study to determine whether decision support software can improve the ability of primary care physicians to triage digitized images of skin lesions. Although others10 have developed decision rules for triaging lesions suggestive of malignant melanoma, decision rules for triaging nonmelanoma skin cancer have not yet been developed and validated. At this early stage in the development of the decision support software, we focused on triaging lesions suggestive of nonmelanoma skin cancer.

Participants and methods
Participants

Participants were recruited from a health maintenance organization and a city health clinic in the San Francisco Bay Area (California). During staff meetings, primary care physicians were invited to participate in this study. Physicians who were interested contacted an investigator to enroll.

Dermatology algorithm development

The initial algorithms were developed by 2 dermatologists (T.M. and T.B.) and were reviewed by an expert panel of 3 dermatologists and 2 primary care physicians. Based on the panel's feedback, the algorithms were revised and used as a basis to develop the first prototype of the software. A primary care physician pretested the software by using it to triage digitized images of skin lesions. The algorithms were then further revised based on this pretesting.

Decision support software

The decision support software, developed by West Portal Software Corp, San Francisco, consists of several interrelated components: a clinical information form, a decision tree, and support features (teaching points, example images, and diagrams). The initial screen is a clinical information form designed for physicians to input information that either is provided by patients or is easily observable by physicians. The former includes lesion growth pattern, number of similar lesions present, patient's age when the lesion appeared, patient's skin type, history of trauma to the lesion area, and history of lesion bleeding. The latter includes lesion size (in millimeters) and location (eg, head, neck, trunk, extremity, palm, sole, and periungual). The clinical information form does not include the morphologic characteristics of lesions, which physicians must assess accurately to triage skin lesions correctly; those judgments are made within the decision tree itself.
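
To make the structure of this form concrete, here is a minimal sketch of how its fields might be represented. The field names, types, and encodings are our illustrative assumptions; the article does not publish the software's internal schema, and the `duration_months` field is inferred from the duration example discussed in the next subsection.

```python
# Hypothetical representation of the clinical information form; field names
# and encodings are assumptions, not the vendor's published schema.
from dataclasses import dataclass
from enum import Enum

class Location(Enum):
    HEAD = "head"
    NECK = "neck"
    TRUNK = "trunk"
    EXTREMITY = "extremity"
    PALM = "palm"
    SOLE = "sole"
    PERIUNGUAL = "periungual"

@dataclass
class ClinicalInformation:
    # Patient-provided items
    growth_pattern: str      # e.g., "stable" or "enlarging"
    similar_lesions: int     # number of similar lesions present
    age_at_onset: int        # patient's age when the lesion appeared
    skin_type: int           # skin type category
    trauma_to_area: bool     # history of trauma to the lesion area
    bleeding: bool           # history of lesion bleeding
    duration_months: float   # assumed field; used by the duration example below
    # Physician-observed items
    size_mm: float           # lesion size in millimeters
    location: Location       # e.g., Location.TRUNK

# Example instance with made-up values.
form = ClinicalInformation(growth_pattern="enlarging", similar_lesions=0,
                           age_at_onset=62, skin_type=2, trauma_to_area=False,
                           bleeding=True, duration_months=6,
                           size_mm=8.0, location=Location.TRUNK)
```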

After completing the clinical information form, the user proceeds to the decision tree. Figure 1 shows the home screen where the user enters the decision tree. With each decision, the dynamic graphical user interface expands the selected branch and collapses those not selected. If the user chooses "Bump," he or she views the screen depicted in Figure 2.

The algorithms for the decision tree incorporate the patient clinical information that was input on the initial screen. For example, if the physician inputs into the clinical information form that a lesion has been present for 6 months, certain decision tree nodes (eg, lesion duration <2 months) are visually eliminated (ie, are shown in red). Nodes that are not eliminated remain green. Physicians then choose which nodes to follow by making clinical judgments regarding various lesion characteristics. They still must judge, for example, whether a lesion is warty and rough vs smooth, pigmented vs nonpigmented, and whether it contains a central core or punctum. Each complete path through the branches ends in a triage recommendation. If physicians follow the wrong path by making an error in judging a lesion characteristic, the software may produce an incorrect triage recommendation. To reduce these judgment errors, physicians can access support features at various branch points: teaching points are text-based explanations of particular lesion characteristics, example images are digitized images of lesions that depict a particular lesion characteristic, and diagrams are sketches that depict a particular lesion characteristic.
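
A minimal sketch of these node-elimination mechanics follows, under stated assumptions: each node carries a test against the form data, and nodes failing the test are flagged as eliminated (shown red in the interface) rather than removed. The node labels, eligibility tests, and recommendation are illustrative stand-ins; the validated algorithms themselves are not reproduced in the article.

```python
# Sketch of a decision tree whose branches are colored (green vs red) based on
# the clinical information form; contents are illustrative, not the actual
# published algorithms.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Node:
    label: str
    eligible: Callable[[dict], bool] = lambda form: True  # False -> ruled out by form data
    children: list["Node"] = field(default_factory=list)
    recommendation: Optional[str] = None  # present on leaf nodes only

def render_branch(node: Node, form: dict, depth: int = 0) -> None:
    """Print the subtree, marking eliminated branches the way the UI shows them in red."""
    status = "green" if node.eligible(form) else "red (eliminated)"
    line = "  " * depth + f"{node.label} [{status}]"
    if node.recommendation:
        line += f" -> {node.recommendation}"
    print(line)
    for child in node.children:
        render_branch(child, form, depth + 1)

# Illustrative fragment mirroring the 6-month duration example in the text.
tree = Node("Bump", children=[
    Node("Duration <2 months", eligible=lambda f: f["duration_months"] < 2),
    Node("Duration >=2 months", eligible=lambda f: f["duration_months"] >= 2,
         children=[Node("Warty and rough", recommendation="perform a biopsy")]),
])

render_branch(tree, {"duration_months": 6})
```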

Test stimuli

Test stimuli consisted of 30 digitized images of patients' skin lesions presented in the clinic to 1 of the 2 dermatologist investigators (T.B. or T.M.). Only digitized images judged by these investigators to be high-resolution representations of the actual lesions were shown; all digitized images of lesions were compressed JPEG images with a minimum resolution of 96 dpi. Lesions were viewed on a standard high-resolution multisync monitor with settings of 800 × 600 pixels and 24-bit color. Fourteen lesions were cancerous, 3 were precancerous, and 13 were benign. A range of lesion types was included: actinic keratosis; basal cell carcinoma; dermatofibroma; fibrous papule (angiofibroma); infectious (bacterial) ulcer; nevus; psoriasis; seborrheic keratosis; squamous cell carcinoma, including Bowen disease; and venous stasis ulcer. Before the study, participants were not informed of the types of skin lesions they would be viewing. To control for the effect of multiple exposures to the same lesions,4 we first divided the 30 lesions into categories by type (eg, basal cell carcinoma and seborrheic keratosis). Next, we randomly assigned lesions within each category to 1 of 2 sets. This resulted in 2 sets of 15 test lesions each, with equal numbers of each lesion type in each set. To control for any difference in the degree of difficulty between the 2 sets of lesions, participants were randomly assigned to a test order (Table 1).
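
The stratified randomization described above amounts to the following procedure, sketched here with a short hypothetical lesion list standing in for the 30 study images: lesions are grouped by type, and each type is split at random across the 2 sets so that both sets contain equal numbers of each lesion type.

```python
# Sketch of stratified random assignment of lesions to 2 test sets; the lesion
# list is illustrative, not the study's actual images.
import random
from collections import defaultdict

lesions = [
    ("BCC-1", "basal cell carcinoma"), ("BCC-2", "basal cell carcinoma"),
    ("SK-1", "seborrheic keratosis"), ("SK-2", "seborrheic keratosis"),
    ("AK-1", "actinic keratosis"), ("AK-2", "actinic keratosis"),
]

by_type = defaultdict(list)
for lesion_id, lesion_type in lesions:
    by_type[lesion_type].append(lesion_id)

set_1, set_2 = [], []
for ids in by_type.values():
    random.shuffle(ids)
    half = len(ids) // 2     # each type contributes equally to both sets
    set_1.extend(ids[:half])
    set_2.extend(ids[half:])

print("Set 1:", sorted(set_1))
print("Set 2:", sorted(set_2))
```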

Procedures

Testing was conducted in an office at each of the 2 sites. All participants gave written informed consent and completed a brief background survey, tests of their triage abilities without (test 1) and with (test 2) the decision support software, and an exit survey. For tests 1 and 2, participants viewed digitized images of skin lesions and marked their triage decisions on an answer sheet that indicated the size and location of each lesion. The 7 triage options were (1) reassure and observe, (2) treat with liquid nitrogen, (3) treat for infection, (4) treat with medication, (5) treat for arterial venous insufficiency, (6) perform a biopsy, and (7) refer. During test 1, a "dummy" patient (portrayed throughout by the same investigator [A.B.]) answered only those questions asked by physicians that were included in the decision support software's clinical information form. The dummy patient referred to a printout of the clinical information form for each test lesion to ensure that she answered questions consistently. During test 2, participants received a brief (≈10-minute) orientation on how to use the decision support software and were instructed to follow the software's decision-making rules to the best of their ability. To reduce testing time, we preloaded the software with the clinical information for each of the 15 digitized test lesions, which simply saved participants from having to gather this information from the dummy patient and input it into the clinical information form. All study procedures were approved by the Committee on Human Research at the University of California, San Francisco.

Data analysis

Two investigators (T.B. and T.M.) based their determinations of correct triage decisions for each test lesion on their actual treatment plans for patients with these lesions. There was 100% agreement between the investigators regarding the appropriate treatment plan for each lesion. For lesions that underwent biopsy in practice, pathological test results were available. When a physician recommended a biopsy for a test lesion that the investigators had determined in practice to be clearly benign or to be an obvious actinic keratosis, this was considered an unnecessary biopsy. Similarly, "biopsy" or "refer" would have been the correct triage decisions for lesions on which the investigators performed a biopsy in practice but that were shown on pathological examination to be benign. To determine whether use of the decision support software improved physician performance, change scores for each physician were calculated. The total change score (theoretical range, −15 to 15) was the total score correct on test 2 (using the software) minus the total score correct on test 1 (not using the software). The cancer change score (theoretical range, −7 to 7) was the total cancer score correct on test 2 (using the software) minus the total cancer score correct on test 1 (not using the software). A Wilcoxon signed rank test was conducted on the change scores.
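
For readers who want to reproduce this analysis, the scoring and test can be sketched as follows. The per-physician scores are fabricated placeholders (the study data are not published at this level of detail), and scipy.stats.wilcoxon is one standard implementation of the Wilcoxon signed rank test applied to paired scores.

```python
# Change-score calculation and Wilcoxon signed rank test on made-up data.
import numpy as np
from scipy.stats import wilcoxon

# Number of correct triage decisions (of 15) per physician; values are fabricated.
test1 = np.array([9, 8, 10, 7, 9, 11, 8, 9, 10, 8,
                  9, 7, 10, 9, 8, 11, 9, 8, 10, 14])       # without software
test2 = np.array([12, 11, 13, 11, 12, 14, 11, 12, 13, 11,
                  12, 10, 13, 12, 11, 14, 12, 11, 13, 13])  # with software

change = test2 - test1   # total change score; theoretical range -15 to 15
print("mean total change score:", change.mean())

stat, p = wilcoxon(test2, test1)   # paired signed rank test on the changes
print(f"signed rank statistic = {stat}, P = {p:.4g}")
```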

Results
Participants

Participants were 20 primary care physicians practicing in a large health maintenance organization (n = 15) or a city health clinic (n = 5); 55% were women (n = 11), and 90% (n = 18) were white. Average age was 45 years. Seventy percent (n = 14) of participants had completed a dermatology rotation during their residency and 60% (n = 12) during medical school. Thirty percent (n = 6) had no continuing education hours in dermatology in the past 5 years, 65% (n = 13) had 1 to 5 hours, and 5% (n = 1) had 6 to 20 hours. Ten percent (n = 2) of participants were "extremely" or "very" confident of their skin cancer triage abilities, 60% (n = 12) were "somewhat" confident, and 30% (n = 6) were "a little" or "not at all" confident. Only 1 physician performed skin biopsies; the remainder indicated that they referred patients with lesions suggestive of skin cancer to dermatologists. Forty-five percent (n = 9) of participants were "extremely" or "very" comfortable using computers, 35% (n = 7) were "somewhat" comfortable, and 20% (n = 4) were "a little" or "not at all" comfortable.

Overall impact of using decision support software on triage decisions

Physicians made a greater number of correct decisions using the software than not using the software (Table 2). Not using the decision support software, physicians chose the wrong triage decision for 110 (36.7%) of 300 lesions. Using the decision support software, they chose the wrong response for only 40 (13.3%) of 300 lesions. Without using the decision support software, physicians failed to perform a biopsy on or refer patients with cancerous lesions for 31 (22.1%) of 140 lesions. Using the software, they failed to perform a biopsy on or refer patients with cancerous lesions for only 5 (3.6%) of 140 lesions. Use of decision support software also led to more correct triage decisions for noncancerous lesions, decreasing the number of recommended biopsies and referrals for benign and precancerous lesions. Not using the decision support software, physicians decided to perform a biopsy on or refer patients with benign or precancerous lesions for 55 (34.4%) of 160 lesions. Using the software, they inappropriately recommended a biopsy or referral for a benign or precancerous lesion for only 26 (16.3%) of 160 lesions.
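
As a quick arithmetic check, these rates follow directly from the raw counts: 20 physicians each triaged 15 lesions (300 decisions per condition), of which 140 decisions involve the 7 cancerous lesions per set and 160 the 8 benign or precancerous ones.

```python
# Recomputing the reported percentages from the counts in Table 2.
wrong_without, wrong_with = 110, 40          # incorrect triage, of 300 decisions
missed_without, missed_with = 31, 5          # cancers not biopsied/referred, of 140
overcall_without, overcall_with = 55, 26     # unnecessary biopsy/referral, of 160

print(f"wrong triage:       {wrong_without/300:.1%} -> {wrong_with/300:.1%}")
print(f"missed cancers:     {missed_without/140:.1%} -> {missed_with/140:.1%}")
print(f"overcalled lesions: {overcall_without/160:.1%} -> {overcall_with/160:.1%}")
```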

Impact of using decision support software on physician performance

Physicians scored an average of 3 points higher (of a possible 15 points) when they used the software (mean change score, 3.15; signed rank, 101.0; P<.001). Nineteen of 20 participants improved their triage performance using the software, gaining as much as 7 points. The only participant who performed better without the software had the highest score on test 1. Physicians' total cancer score was an average of 1 point higher with the software (of a possible 7 points) (mean cancer change score, 1.1; signed rank, 65.5; P<.001). Fifteen of 20 participants improved their cancer triage performance using the software, gaining as much as 3 points.

Physician response to using decision support software

Ninety percent (n = 18) of participants found the software either "extremely" or "very" easy to use and 10% (n = 2) found it "somewhat" easy to use; none found it "a little" or "not at all" easy to use. When asked if a full-featured version of the software would help in their practice with actual patients, 75% (n = 15) of participants thought that such software would be "extremely" or "very" helpful, 20% (n = 4) thought it would be "somewhat" helpful, and 5% (n = 1) thought it would be "a little" helpful; none rated the software as "not at all" helpful.

Comment

These preliminary findings suggest that using decision support software could improve primary care physicians' decision making regarding skin lesion triage. Physicians' overall triage error rate using the decision support software (13.3%) was similar to the triage error rate found for dermatologists in a recent study (15%).5 Nineteen of 20 physicians made more correct triage decisions using the software than on their own, and physicians using the software averaged 3 more correct triage decisions than when they triaged on their own, representing a 20% improvement rate. Regarding cancerous lesions, primary care physicians' error rate using the decision support software (3.6%) was similar to that achieved by dermatologists in a previous study (0%).5 In addition, unnecessary referrals and biopsies were reduced by half when physicians used the decision support software. These findings are especially encouraging given that 18 participants (90%) rated themselves between "somewhat" and "not at all" confident in their skin cancer triage skills. They also suggest that an expanded version of the software, including a teaching component to provide physicians with feedback about their decisions, might further improve physicians' triage decisions and enhance their confidence in this area.

Although computer-based decision-making aids have been developed for a variety of medical problems, many of these systems are never implemented in health care settings.11 Barriers at the implementation stage may be reduced when systems are designed to take into account the needs of health care professionals and their work practices.11 Because patient appointments are often scheduled in 10- to 15-minute slots, primary care physicians need to make triage decisions quickly. For successful implementation in primary care settings, a computer-based decision support tool must be easy to use and require minimal time to provide support. The decision support software developed in this study has a dynamic graphical interface; operates on Windows-compatible personal computers; and, in actual practice, would require approximately 4 to 5 minutes of use to arrive at a triage decision. Although more than half of participants rated themselves as "somewhat," "a little," or "not at all" comfortable using computers, 18 (90%) rated the software as "very" or "extremely" easy to use. This high degree of comfort with the software was achieved with only a 10-minute orientation.

This study had several limitations. Although it is possible that physicians' scores using the software were superior partly because of their previous test experience triaging lesions on their own, there is no evidence that physicians who have a limited exposure to test skin lesions and who receive no feedback on their triage decisions can improve their abilities. Because a small convenience sample was used for testing, the findings cannot be generalized to the population of primary care physicians. The sample, however, varied in exposure to dermatology education, self-reported confidence in triaging skin lesions, comfort using computers, and type of practice setting. Because this study used digitized images of skin lesions rather than lesions on actual patients, we do not know how effective the software would be in helping physicians triage lesions on their patients. Data4 suggest that primary care physicians triaging on their own are no better at diagnosing skin lesions on actual patients than on computer images. Because the decision support software requires physicians to make certain morphologic distinctions (eg, lesion surface rough vs smooth) that may be difficult to discern without being able to feel the lesion, physicians using the software with actual patients may arrive at more correct triage decisions than when using the software with digitized images. We are designing a larger study to evaluate the accuracy and utility of using the software with patients seen in clinical practice.

Innovative and validated methods are needed to improve primary care physicians' ability to triage for skin cancer. The efficacy of teledermatology—dermatologists consulting on cases presented via videoconferencing or electronically transmitted digitized images—is now being explored as a means to make consultation by dermatologists more accessible and cost-effective.12,13 The use of teledermatology may complement our decision support software, which is intended to help primary care physicians make better decisions about when to consult with dermatologists. The decision support software developed in this study is intended to ultimately serve as a decision-making aid and a teaching tool. Physicians will learn through experience by using the software and internalizing its decision-making rules and by completing a tutorial, which we are now designing and testing. Ultimately, the software could be expanded to triage for all types of skin lesions—including malignant melanoma—to potentially prevent disfiguring surgery and save lives.

Accepted for publication May 17, 1999.

This study was supported by grant 1 R43 CA75906-01 from the Small Business Innovation Research Program, Washington, DC.

We thank Jeffrey Schneider, MD, and Sang-ick Chang, MD, for assistance with participant recruitment; Nona Caspers, MFA, for editorial assistance; and Jennifer Fechner for proofreading.

Reprints: Barbara Gerbert, PhD, Division of Behavioral Sciences, School of Dentistry, University of California, San Francisco, 601 Montgomery, Suite 810, San Francisco, CA 94111 (e-mail: gerbert@itsa.ucsf.edu).

References
1. Cassileth BR, Clark W Jr, Lusk EJ, Frederick BE, Thompson CJ, Walsh WP. How well do physicians recognize melanoma and other problem lesions? J Am Acad Dermatol. 1986;14:555-560.
2. Paine SL, Cockburn J, Noy SM, Marks R. Early detection of skin cancer: knowledge, perceptions and practices of general practitioners in Victoria. Med J Aust. 1994;161:188-189, 192-195.
3. Federman D, Hogan D, Taylor JR, Caralis P, Kirsner RS. A comparison of diagnosis, evaluation, and treatment of patients with dermatologic disorders. J Am Acad Dermatol. 1995;32:726-729.
4. Gerbert B, Maurer T, Berger T, et al. Primary care physicians as gatekeepers in managed care: primary care physicians' and dermatologists' skills at secondary prevention of skin cancer. Arch Dermatol. 1996;132:1030-1038.
5. Gerbert B, Bronstone A, Wolff M, et al. Improving primary care residents' proficiency in the diagnosis of skin cancer. J Gen Intern Med. 1998;13:91-97.
6. Robinson J, McGaghie W. Skin cancer detection in a clinical practice examination with standardized patients. J Am Acad Dermatol. 1996;34:709-711.
7. Dolan N, Ng J, Martin G, Robinson J, Rademaker A. Effectiveness of a skin cancer control educational intervention for internal medicine housestaff and attending physicians. J Gen Intern Med. 1997;12:531-536.
8. Gloster HM, Brodland DG. The epidemiology of skin cancer. Dermatol Surg. 1996;22:217-226.
9. Sober AJ, Burstein JM. Precursors to skin cancer. Cancer. 1995;75:645-650.
10. Weinstock MA, Goldstein MG, Dube CE, Rhodes AR, Sober AJ. Basic skin cancer triage for melanoma detection. J Am Acad Dermatol. 1996;34:1063-1066.
11. Lodder H, Bakker AR. Project management. In: van Bemmel J, Musen M, eds. Handbook of Medical Informatics. Heidelberg, Germany: Springer-Verlag; 1997:527-534.
12. Harrison P, Kirby B, Dickinson Y, Schofield R. Teledermatology: high technology or not? J Telemed Telecare. 1998;4:31-32.
13. Lyon C, Harrison P. Digital imaging and teledermatology: educational and diagnostic applications of a portable digital imaging system for the trainee dermatologist. Clin Exp Dermatol. 1997;22:163-165.