Figure 1. Patient preferences by age.
Table 1. Equipment and Technical Details
Table 2. Demographic Characteristics of 131 Enrolled Patients
Table 3. Scoring of Agreement
Table 4. Agreement Between Video and In-Person Examinations ("Gold Standard") by Diagnostic Category
Study
April 1998

Teledermatology and In-Person Examinations: A Comparison of Patient and Physician Perceptions and Diagnostic Agreement

Author Affiliations

From the Departments of Dermatology (Drs Lowitt, Kauffman, and Burnett), Epidemiology and Preventive Medicine (Dr Kessler), Pathology (Dr Kauffman), Medicine (Dr Hooper), and Radiology (Dr Siegel), University of Maryland School of Medicine, and the Baltimore Veterans Affairs Medical Center (Drs Lowitt, Kauffman, Hooper, Siegel, and Burnett), Baltimore.

Arch Dermatol. 1998;134(4):471-476. doi:10.1001/archderm.134.4.471
Abstract

Objective  To compare physician and patient impressions and interphysician diagnostic agreement between live teledermatology and in-person examinations.

Design  Paired video and in-person examinations with different dermatologists.

Setting  An urban Veterans Affairs dermatology clinic.

Patients  One hundred thirty-nine patients.

Main Outcome Measures  Satisfaction questionnaires and interphysician diagnostic agreement.

Results  Patient and physician satisfaction was high. Agreement between video and in-person diagnoses was 80%.

Conclusions  Physicians and patients were satisfied with teledermatology examinations. Diagnostic agreement between in-person and video dermatologists was high.

WE REPORT the results of a teledermatology evaluation project conducted within the Baltimore Veterans Affairs Medical Center in Baltimore, Md, in which dermatology patients were examined by 2 dermatologists, first by live-interactive video and immediately thereafter by in-person examination. The trial was designed (1) to compare live 2-way interactive video examinations with traditional in-person examinations with respect to patient and provider satisfaction, and (2) to evaluate the degree of diagnostic agreement between in-person and video examinations.

PATIENTS AND METHODS
PATIENT SELECTION

Consecutive dermatology clinic outpatients were enrolled prior to review of chart materials. Patients who refused were asked to state their reasons. Patients participated with institutional review board–approved informed consent. Control subjects underwent only in-person examination. Excluded patients were those transported by stretcher and those who declined to participate. The first 6 patients served as "trial" subjects and were not included in the analysis. Based on α=.05 and β=.20 (power of 0.8), with a 15% difference in the proportion of satisfaction ratings judged worth detecting, the required sample size was at least 99 participants.
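For readers who wish to reproduce the power calculation, the sketch below applies the standard two-proportion sample-size formula. The paper reports only the 15% difference and the minimum of 99 participants; the baseline satisfaction proportions in the example are our assumptions, chosen for illustration.

```python
# Two-proportion sample-size sketch. The 15% difference comes from the study;
# the baseline proportion p1 is an assumption for illustration only.
from math import ceil
from statistics import NormalDist

def n_per_group(p1: float, p2: float, alpha: float = 0.05, beta: float = 0.20) -> int:
    """Approximate n per group to detect p1 vs p2 (two-sided alpha, power 1 - beta)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = .05
    z_b = NormalDist().inv_cdf(1 - beta)       # 0.84 for 80% power
    var_sum = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * var_sum / (p1 - p2) ** 2)

# Hypothetical baseline: 90% satisfaction in person vs 75% by video (a 15% difference).
print(n_per_group(0.90, 0.75))  # ~97 per group under these assumed proportions
```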

PHYSICIANS

Four dermatologists (2 board-certified and 2 third-year residents) participated. Each physician underwent a 45-minute training session and observed several patient interviews before performing examinations alone. Each physician rotated in the role of video or in-person physician. Physicians were asked to conduct the video interview and physical examination as much as possible in the manner of a direct, in-person examination.

NURSE-ESCORTS

During video examinations, the patient was accompanied by a nurse-escort. Responsibilities of the nurse-escort included orienting the patient, facilitating patient-physician communication, displaying chart information, and positioning the patient. The nurse-escort also positioned and focused the close-up camera and/or dermatoscope, as requested.

STUDY PROCEDURE

After each patient was brought to the telemedicine room, the dermatologist appeared on the patient's video monitor and introduced himself or herself, elicited the medical history, examined the chart, and performed the physical examination. The physician indicated that a treatment plan would be discussed during the in-person examination immediately to follow. The patient, physician, and nurse-escort then completed written questionnaires. The patient was then escorted to an adjacent room for an in-person examination with a second dermatologist that served as the clinic visit of record. The patient and physician completed questionnaires following this examination.

STUDY FORMS

Six data collection forms were developed for each patient: demographic information; nurse-escort questionnaire; patient postvideo and post–in-person questionnaires; and physician postvideo and post–in-person questionnaires.

PATIENT, PHYSICIAN, AND NURSE-ESCORT PERCEPTIONS
Questionnaire Design

Respondents were asked to either strongly agree, agree, disagree, or strongly disagree with statements about video and in-person examinations (modified Likert scale). Patient questionnaire statements included "I was satisfied with this examination," "I felt comfortable talking with the doctor," "I would like to be examined by this doctor again," "I could see the doctor well," "I could hear the doctor well," "The doctor was looking me in the eye," and "The video camera bothered me or made me nervous." Patients were asked to make hypothetical choices between (1) a video examination with a dermatologist or an in-person examination with a primary care provider, and (2) a video examination with a dermatologist conducted close to home, or an in-person examination with a dermatologist 2 hours away.

Physician questionnaire statements included the following: "The patient appeared satisfied," "I could see the patient well," "I could examine the skin well," "I was able to visualize all necessary areas," "I could hear the patient well," "I could establish a good rapport with the patient," "I was happy with focus and visual resolution," "I was distracted by the technology," "There were problems with the video controls," and "Examination of the patient's chart was easy."

Data Analysis: Patient and Physician Perceptions

Computerized data entry screens were developed using SPSS statistical software (SPSS Inc, Chicago, Ill) for the personal computer. Analyses were performed using t tests for parametric data and χ2 tests for 2 × 2 tables.
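The original analyses were run in SPSS; as a rough modern equivalent, a χ2 test on a single 2 × 2 table might look like the Python/SciPy sketch below. The counts are hypothetical, not study data.

```python
# Hypothetical 2 x 2 table: rows = examination type (video, in person),
# columns = response (satisfied, not satisfied). Not study data.
from scipy.stats import chi2_contingency

table = [[99, 3],
         [101, 1]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
```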

DIAGNOSTIC AGREEMENT
Questionnaire Design

Physicians were asked to list up to 4 dermatologic diagnoses for each visit, to note the anatomical location of each diagnosis, and to indicate their degree of confidence (by Likert scale) in each diagnosis.

Data Analysis

The diagnoses from each video examination were compared with those of the corresponding in-person examination by a panel of 3 board-certified dermatologists. The panel considered each diagnostic observation (by anatomical site) to determine whether there was agreement between the video and in-person physicians. Observations made by one physician, but not noted by the other, were tallied separately as "extra diagnoses."

Overall agreement was calculated by dividing the number of agreements by the sum of agreements and disagreements. Percentage agreement data were also stratified for bandwidth, type of examination (new or follow-up), and by the degree of the video physician's diagnostic confidence. Two-by-two tables were generated for each diagnostic category, and the sensitivity, specificity, and positive predictive value were determined for each.
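The calculations described above are simple enough to spell out. In the sketch below, only the overall agreement figure (104 of 130, reported in the Results) comes from the study; the per-category 2 × 2 counts are placeholders, with the in-person diagnosis treated as the gold standard.

```python
# Overall agreement and per-category 2 x 2 metrics, as described above.
def agreement(n_agree: int, n_disagree: int) -> float:
    """Overall agreement = agreements / (agreements + disagreements)."""
    return n_agree / (n_agree + n_disagree)

def two_by_two_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Video examination scored against the in-person 'gold standard'."""
    return {
        "sensitivity": tp / (tp + fn),  # in-person positives the video physician caught
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
    }

print(agreement(104, 26))                 # 0.80, the study's reported figure
print(two_by_two_metrics(20, 5, 4, 101))  # placeholder per-category counts
```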

Following the completion of the study, histopathological reports were obtained for all biopsy specimens of skin lesions taken from the study group at or after the time of the teledermatology examination. Results were compared with the clinical diagnoses made by the 2 examining physicians.

TECHNICAL ASPECTS

Video examinations were conducted using a live, interactive system in which the patient and physician could see each other and converse in real time. Physicians controlled the direction, focus, and zoom of the camera in the patient's room. Video visits were transmitted over a dedicated T1 line, a data line that conveys information at a rate of up to 1.544 megabits per second (54 times the speed of a 28,800-baud modem), between a patient examination room and the physician station elsewhere within the Baltimore Veterans Affairs Medical Center. Examinations were conducted using either full T1 capability or ¼ T1 (384 kilobits per second, one fourth the bandwidth of full T1) (Table 1).
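These bandwidth figures can be checked directly; the snippet below verifies the modem comparison and the quarter-T1 rate.

```python
# Sanity check of the bandwidth arithmetic above (values in bits per second).
T1 = 1_544_000        # full T1: 1.544 megabits per second
MODEM = 28_800        # 28,800-baud modem
QUARTER_T1 = T1 // 4  # 386,000 bps, conventionally quoted as 384 kbps

print(T1 / MODEM)     # ~53.6, i.e., roughly 54 times a 28.8k modem
print(QUARTER_T1)
```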

RESULTS
PATIENT CHARACTERISTICS

During a 2-month period, 139 patients participated (Table 2). The first 6 trial patients were not included in the analyses, and 2 patients withdrew because of time constraints. One hundred two study patients underwent sequential video and in-person examinations, and 29 control subjects had in-person examinations only. Seven patients refused to participate. The reasons given included a dislike of machines (4), lack of time (2), and frustration with the skin problem (1). No obvious common characteristic explained their lack of participation.

PHYSICIAN CHARACTERISTICS

Each physician performed between 16% and 35% of in-person examinations. Video examinations were less well distributed; two thirds were performed by 1 dermatologist. No differences in physician satisfaction, confidence, diagnostic categories, or frequency of agreement were detected among the 4 physicians.

BANDWIDTH

Sixty-four percent of video examinations were conducted at T1 and 36% were conducted at ¼ T1. Because bandwidth adjustments required an off-site technician, the T1 examinations were completed first (patients 1-91) and the remainder of the examinations were performed at ¼ T1.

PARTICIPANT PERCEPTIONS
Patient Perceptions

Patient acceptance of both video and in-person examinations was extremely high. Responses were positive in 97% to 100% of video examinations and 99% to 100% of in-person examinations. Strongly agree predominated for in-person responses, while agree predominated for video examinations (P=.001), suggesting that although patients were positive about the video experience, they preferred in-person examination. No differences in responses to satisfaction questions were detected between T1 and ¼ T1 bandwidth video examinations. No differences were detected between patients and controls for any in-person examination satisfaction queries.

While most patients would prefer a video examination by a dermatologist to an in-person visit with a nondermatologist, and most would prefer to see the dermatologist by video close to home rather than to travel to see the same physician in person, a substantial minority of patients still preferred the more traditional examination formats. Younger patients were generally more willing to accept the new technology than were their older counterparts (Figure 1). No differences in responses were noted when stratified for level of education, type of examination (new vs follow-up), and distance of residence from the hospital.

Additional written comments were provided by 64 patients. Two thirds of the comments were positive. Positive comments included "Just as good as in person," "Make all VA [Veterans Affairs] hospitals have one," and "Saves time and trouble." Negative comments included "Don't like machines," "Prefer face-to-face," "Prefer hands-on," and "Keep the old conventional way. I am old fashioned and conventional."

Physician Perceptions

Physicians were highly satisfied with the interpersonal aspects of both in-person and video visits; however, in-person examinations were preferred (P<.001). Good rapport with patients could be established in 98% of in-person examinations, 95% of T1 video examinations, and 90% of ¼ T1 bandwidth examinations. No significant differences in satisfaction were detected among the 4 dermatologists.

For all in-person examinations and 81% of video examinations, physicians were satisfied with their ability to examine the skin well and visualize all necessary anatomical areas. Physicians were able to examine the skin well in 93% of T1 examinations, but in only 60% of ¼ T1 bandwidth examinations (P<.001). Physicians were "happy with focus and visual resolution" in 89% of encounters at T1, but they were satisfied in only 41% of those performed at ¼ T1 bandwidth (P=.001). Examination of the chart was equally easy at video (75%) and in-person (78%) encounters.

Physicians expressed greater confidence in their in-person diagnoses (98%) than their video diagnoses (85%), but confidence did not significantly differ between T1 (86%) and ¼ T1 (82%) bandwidth, or between new patient and follow-up examinations.

The most frequent problems noted by physicians performing video examinations were (1) on-screen icons partially obstructed the view of the patient, (2) certain anatomical locations (lower legs, feet, genitals, and scalp) were particularly difficult to examine, (3) inability to touch the skin was a limitation for some diagnoses (eg, actinic keratoses), and (4) fine focus was not satisfactory in some cases.

Nurse-Escort Observations

Nurse-escorts could see and hear the physician well in 99% and 100% of the video and in-person examinations, respectively. Nurse-escorts observed that patients appeared to communicate effectively with the physician during all examinations. Problems with the video controls were noted during 9% of examinations. The greatest difficulty for the nurse-escort was manipulation and focus of the handheld, flexible camera.

DIAGNOSTIC AGREEMENT
Physician Agreement: Video vs In-Person Examinations

During the study, 318 diagnostic observations were generated, of which 260 were evaluable for agreement (130 paired diagnoses in which both examiners were clearly evaluating the same lesion) and 58 were extra diagnoses (made by one examiner but not noted by the other). Examples of scoring for agreement are presented in Table 3. Of the 130 paired observations, agreement between video and in-person examinations was achieved 104 times (80%). The most common disagreements were about benign tumors, 23%; scaling eruptions, 23%; scaly vs acneiform conditions, 19%; and premalignant or malignant tumors, 19% (Table 4). Of the 58 extra diagnoses, 25 were noted during in-person examinations and 33 were from video examinations. On 5 occasions, a malignant or premalignant tumor was listed by one physician but not the other. The only extra diagnosis of malignancy was one squamous cell carcinoma diagnosed by a video physician. No significant differences in agreement were detected between diagnoses from a new patient examination (77%) and follow-up examinations (81%) or between T1 bandwidth examinations (84%) and ¼ T1 bandwidth examinations (78%). Video physicians who chose agree or strongly agree for the confidence inquiry achieved agreement with the in-person physician in 89% of cases. When confidence was lower (disagree or strongly disagree), the level of agreement was 33%.

Correlation of Agreement Data With Histopathological Findings

Review of histopathological records for all study subjects yielded 11 biopsy specimens of skin lesions during the study. In 7 cases (63%), the report confirmed the diagnosis agreed on by both observers. In 1 case, the report settled a disagreement in favor of the video diagnosis. On 3 occasions, the report revealed a diagnosis different from that proposed by both observers.

COMMENT
PATIENT, PROVIDER, AND NURSE-ESCORT IMPRESSIONS

Patients were highly satisfied with teledermatology examinations, which confirms previous reports in other specialties.1-4 The lack of influence of bandwidth on patient satisfaction suggests that the quality of transmission provided at ¼ T1 bandwidth was sufficient for good physician and patient rapport. Age proved to be an independent predictor of patient acceptance; older patients were less inclined to embrace the new technology. Physicians were satisfied with their ability to establish patient rapport at both T1 and ¼ T1 bandwidths, but were less satisfied with the quality of images viewed at ¼ T1 bandwidth. The flexible neck camera was essential for access to hard-to-see areas and for close-ups. The hand-held dermatoscope was not favored by the physicians because the image it provided was too greatly magnified. The document camera was not satisfactory for clearly transferring chart information; the flexible camera proved superior for this purpose. Physician frustration with technical difficulties deserves special mention. The problems with establishing and maintaining video and audio connections that plague new telemedicine systems highlight the importance of strong technical support.

DIAGNOSTIC AGREEMENT

The level of diagnostic agreement between in-person and video dermatologists was high (80%). Although physicians were less confident in their diagnoses in ¼ T1 bandwidth examinations, no significant difference in agreement was detected between examinations at T1 and at ¼ T1 bandwidths (84% and 78%, respectively). Confirming a previous report,5 low diagnostic confidence among physicians correlated with low levels of agreement. Teledermatologists should therefore consider follow-up in-person examination when the diagnostic confidence during video examination is low.

If it is assumed that the in-person examination represents the "gold standard" with which the video examination is compared, then sensitivity, specificity, and positive predictive value may be calculated (Table 4). The lower positive predictive value in the acneiform and malignant tumor categories indicates that video physicians tended to overdiagnose these conditions. Higher sensitivities were found for benign and premalignant tumors than for the dermatitis and papulosquamous categories, suggesting that tumors are somewhat easier to diagnose by video. The high values for specificity are not unexpected because of the relatively small numbers in each diagnostic category.

The use of the in-person examination as the gold standard is a debatable choice. Because 2 in-person physicians do not always agree, histopathological analysis might be seen as a more appropriate gold standard. This viewpoint is supported by the 3 occasions in which neither the in-person nor the video diagnosis matched the ultimate histopathological diagnosis. Reliance on biopsy-rendered diagnoses is not possible in all cases, however, because most dermatologic diagnoses do not require biopsy confirmation.

Extra diagnoses were not considered in the analyses as true diagnostic disagreements because they are less a measure of diagnostic agreement than of diagnostic detection.6 A recent study7 of diagnostic detection among dermatologists suggests that significant variability in detection exists, even for such directed tasks as total body nevus counts.

Reports of agreement among dermatologists are few and tend to address narrow diagnostic categories. Interdermatologist agreement varies significantly: from 88% to 91% for actinic keratoses and 0% to 85% for malignancies,8 59% for a variety of still images,5 and 61% to 88% for comparisons of still and in-person diagnoses in the same patients.9 Interphysician agreement between in-person and live 2-way interactive teledermatology examination was 77% in 1 study10 of 60 patients and 78% complete agreement/14% partial agreement in another study11 of 60 patients. Agreement in skin cancer screening yielded κ=0.78 for actinic keratoses and κ=0.38 for squamous cell carcinomas,12 κ=0.44 to 0.76 for the extent of cutaneous photodamage,13 and κ=0.01 and 0.78 for criteria for atopic dermatitis.14
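The κ statistics cited above adjust raw percentage agreement for the agreement expected by chance alone; for a 2 × 2 table of two raters' yes/no calls, Cohen's κ reduces to a few lines. The counts in the sketch are illustrative, not drawn from any of the cited studies.

```python
# Cohen's kappa for a 2 x 2 table of two raters' yes/no calls (illustrative counts).
def cohen_kappa(a: int, b: int, c: int, d: int) -> float:
    """a = both yes, b = only rater 1 yes, c = only rater 2 yes, d = both no."""
    n = a + b + c + d
    p_observed = (a + d) / n               # raw percentage agreement
    p_yes = ((a + b) / n) * ((a + c) / n)  # chance agreement on "yes"
    p_no = ((c + d) / n) * ((b + d) / n)   # chance agreement on "no"
    return (p_observed - (p_yes + p_no)) / (1 - (p_yes + p_no))

print(round(cohen_kappa(40, 5, 5, 50), 2))  # 0.80 with these made-up counts
```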

Putting agreement into perspective is challenging because the degree of diagnostic agreement between 2 dermatologists examining a patient in the same room has not been well established.6,11 Because interphysician variability will account for a proportion of disagreements in telemedicine encounters, a central question is how much of the disagreement may be attributed to the technology itself.

We acknowledge several shortcomings of this pilot study. Satisfaction data from this population of adult male veterans are unlikely to represent those of the general population. The number of participating physicians was small, and the imbalance of video examiners may have altered the satisfaction and the agreement data by overrepresenting the opinions of a single individual. Because the 4 dermatologists work together on a regular basis, their agreement might exceed that of dermatologists from different clinical centers. The study's artificial scenario, which created the illusion of distance even though the patient and physician were actually in the same building, may have affected the results. We acknowledge that diagnostic agreement is a valuable indicator, but not a direct measurement of diagnostic accuracy.

Future teledermatology studies should extend beyond satisfaction and agreement to health outcomes and cost-effectiveness.15 Studies using more limited bandwidths should determine the minimum acceptable bandwidth for each application. The transfer of still images alone, without direct verbal or video communication between patient and physician, offers a more economical alternative for transfer of information; further research in agreement, accuracy, and satisfaction of this and other new modalities is needed. We are enthusiastic about the opportunities that telemedicine may provide; however, we urge those who implement telemedicine systems to be respectful of the magnitude of change inherent in altering the traditional relationship between the physician and patient.

Article Information

Accepted for publication November 14, 1997.

This study was supported in part by the Baltimore Research and Education Foundation Inc, a nonprofit research component of the Baltimore Veterans Affairs Medical Center, and the Department of Dermatology, University of Maryland School of Medicine, Baltimore. Videoconferencing equipment and technical support were provided by VSI Enterprise, Norcross, Ga.

Presented in part at the American Medical Informatics Association Spring Congress, Kansas City, Mo, June 6, 1996; and at the 58th Annual Meeting of the Society for Investigative Dermatology, Washington, DC, April 23-27, 1997.

We thank Kathy Yeager, RN, Cynthia Schafer, and Karen Beasley, MD, for their organization and persistence as study coordinators; Steven Caplan, MD, and Jerry Cooley, MD, for participating as physicians in the video and in-person examinations; Oscar Torres, RN, Lynette Cartledge, PCA, and Keena Dezurn, PCA, for participation as nurse-escorts; Matt Hammaker for assistance in procurement and installation of equipment; Mitchell Rosen, PhD, and Saul Lowitt, PhD, for advice regarding biostatistical analysis; and Arnold Kreger, PhD, for critical review of the manuscript.

Corresponding author: Mark H. Lowitt, MD, Department of Dermatology, University of Maryland School of Medicine, 405 W Redwood St, 6th Floor, Baltimore, MD 21201 (e-mail: mlowitt@umaryland.edu).

References
1. Allen A, Hayes J, Sadasivan R, Williamson SK, Wittman C. A pilot study of the physician acceptance of tele-oncology. J Telemed Telecare. 1995;1:34-37.
2. Jones DH, Crichton C, Macdonald AP, et al. Teledermatology in the highlands of Scotland. J Telemed Telecare. 1996;2(suppl):7-8.
3. Allen A, Hayes J. Patient satisfaction with telemedicine in a rural clinic. Am J Public Health. 1994;84:1693.
4. Mitchell BR, Mitchell JC, Disney APS. User adoption issues in renal telemedicine. J Telemed Telecare. 1996;2:81-86.
5. Kvedar JC, Edwards RA, Menn ER, et al. The substitution of digital images for dermatologic physical examination. Arch Dermatol. 1997;133:161-167.
6. Perednia DA, Gaines JA, Rossum AC. Variability in physician assessment of lesions in cutaneous images and its implications for skin screening and computer-assisted diagnosis. Arch Dermatol. 1992;128:357-364.
7. Byles JE, Hennrikus D, Sanson-Fisher R, Hersey P. Reliability of naevus counts in identifying individuals at high risk of malignant melanoma. Br J Dermatol. 1994;130:51-56.
8. Whited JD, Horner RD, Hall RP, Simel DL. The influence of history on interobserver agreement for diagnosing actinic keratoses and malignant skin lesions. J Am Acad Dermatol. 1995;33:603-607.
9. Zelickson BD, Homan L. Teledermatology in the nursing home. Arch Dermatol. 1997;133:171-174.
10. Phillips CM, Burke WA, Shechter A, et al. Reliability of dermatology teleconsultations with the use of teleconferencing technology. J Am Acad Dermatol. 1997;37:398-402.
11. Lesher JL, Davis LS, Gourdin FW, English D, Thompson WO. Telemedicine evaluation of cutaneous diseases: a blinded comparative study. J Am Acad Dermatol. 1998;38:27-31.
12. Leffell DJ, Chen YT, Berwick M, Bolognia JL. Interobserver agreement in a community skin cancer screening setting. J Am Acad Dermatol. 1993;28:1003-1005.
13. Larnier C, Ortonne JP, Venot A, et al. Evaluation of cutaneous photodamage using a photographic scale. Br J Dermatol. 1994;130:167-173.
14. Williams HC, Burney PGJ, Strachan D, Hay RJ. The U.K. Working Party's diagnostic criteria for atopic dermatitis, II: observer variation of clinical diagnosis and signs of atopic dermatitis. Not Available.
15. Field MJ, ed. Telemedicine: A Guide to Assessing Telecommunications in Health Care. Washington, DC: National Academy Press; 1996:1-15.