Figure 1. Demonstration by study staff on how to take a set of 3 images: front (A), right (B), and left (C).

Figure 2. Flow diagram of subject progress through the randomized controlled trial. Intervention refers to the e-visit group and control refers to the traditional office visit group. BCBS indicates Blue Cross Blue Shield of Massachusetts.

Figure 3. Comparison of mean (SD) total inflammatory lesion count, control vs intervention groups. No significant between-group differences were seen in visits 1, 4, and 5. Asterisks indicate significant difference seen in visits 2 and 3.

Table 1. Patient Demographics

Table 2. Comparison of Assessment Measures in the Office Visit vs E-Visit Groups

Study
April 2010

A Randomized Trial to Evaluate the Efficacy of Online Follow-up Visits in the Management of Acne

Author Affiliations: Center for Connected Health, Boston, Massachusetts; Department of Dermatology, Massachusetts General Hospital, Boston; and Harvard Medical School, Boston.

Arch Dermatol. 2010;146(4):406-411. doi:10.1001/archdermatol.2010.29
Abstract

Objective  To evaluate whether delivering acne follow-up care via an asynchronous, remote online visit (e-visit) platform produces equivalent clinical outcomes to office care.

Design  A prospective, randomized controlled study.

Setting  Two teaching hospitals in Boston between September 2005 and May 2007.

Participants  A total of 151 patients with mild to moderate facial acne.

Interventions  Subjects were asked to carry out 4 follow-up visits using either an e-visit platform or conventional office care. At 6-week intervals, subjects in the e-visit group were prompted to send images of their skin and an update, via a secure Web site, to their dermatologist. Dermatologists responded with advice and electronic prescriptions.

Main Outcome Measures  The primary outcome measure was change in total inflammatory lesion count between the first and last visit. The major secondary outcomes were subject and dermatologist satisfaction with care and length of time to complete visits.

Results  The mean age of subjects was 28 years; most were female (78%), white (65%), and college educated (69%). One hundred twenty-one of the initial 151 subjects completed the study. The decrease in total inflammatory lesion count was similar in the e-visit and office visit groups (6.67 and 9.39, respectively) (P = .49). Both subjects and dermatologists reported comparable satisfaction with care regardless of visit type (P = .06 and P = .16, respectively). Compared with office visits, e-visits were time saving for subjects and time neutral for dermatologists (4 minutes, 8 seconds vs 4 minutes, 42 seconds) (P = .57).

Conclusion  Delivering follow-up care to acne patients via an e-visit platform produced clinical outcomes equivalent to those of conventional office visits.

Trial Registration  clinicaltrials.gov Identifier: NCT00417456

Ensuring timely access to high-quality care is currently a challenge for the stressed US health care system. Many specialties, including internal medicine, psychiatry, and dermatology, are struggling to accommodate a growing demand for appointments owing to a critical shortage of health care providers (hereinafter, “providers”).1-4 Each specialty faces individual challenges: internal medicine, a growing tide of patients with diabetes and obesity; dermatology, a rise in skin cancer and a geographically maldistributed workforce leading to lack of access for many patients living in rural areas.2 Long wait times for urgent issues are prevalent in both rural and urban settings and are viewed as unacceptable by many clinicians.3,4

One potential solution to these issues may be the adoption of innovative, technology-enabled models of care delivery. Most Americans have access to the Internet, mobile phones, and digital cameras, all of which can be used to support remote, asynchronous communication between patient and physician.5 Such models of care delivery have been proposed as more efficient ways of delivering high-quality care to patients. However, despite evidence of high levels of satisfaction6-10 and reduced wait times,11 there is currently a lack of evidence regarding the clinical outcomes achieved by alternative methods of care delivery.

Dermatology has proven to be a useful test bed for innovative techniques that deliver care outside of clinic walls and streamline provider workflow. We conducted a randomized clinical trial to evaluate whether patients with acne receiving follow-up care using an asynchronous, online care (e-visit) platform experienced equivalent clinical outcomes to those receiving conventional office care.

Methods
Eligibility criteria

Subjects were required to be 12 years or older, be diagnosed as having mild to moderate acne by a dermatologist, have access to a computer and Internet connection, have Blue Cross and Blue Shield of Massachusetts health insurance (the only insurer integrated with the online platform), and be willing to conduct a series of 4 office or online visits at 6-week intervals. Subjects with severe acne or those taking isotretinoin were excluded.

Setting

Subjects were recruited in Boston, Massachusetts, using advertisements on local Web sites and at health care facilities. All care was provided by 5 dermatologists employed by Massachusetts General Hospital or Brigham and Women's Hospital. Recruitment commenced in September 2005 and closed in May 2007. The study was reviewed and approved by the institutional review board of the Massachusetts General Hospital. This trial was registered on clinicaltrials.gov (NCT00417456).

Interventions

Eligible subjects were invited to attend an initial office visit with 1 of the 5 participating dermatologists. Following confirmation of diagnosis and patient consent, a member of the research team took 3 baseline facial photographs of the subject (front and both sides) (Figure 1). Subjects were then informed of their assignment to either the intervention (e-visit) or control (office visit) arm of the study. Intervention subjects were provided with a digital camera and trained to take images using a standardized validated protocol.6

Control subjects received 4 follow-up office visits with their dermatologist at 6-week intervals. At each visit, study staff obtained a set of 3 images of the patient. Subjects continued to pay their existing copayment at each of these visits. Intervention subjects received 4 follow-up e-visits with their dermatologist at 6-week intervals. To complete an e-visit, subjects were required to (1) capture 3 facial images and upload them to the secure site; (2) complete a structured set of disease-specific questions using the secure online Web site (free text sections were also included); and (3) provide copayment via the Web site on submission of the e-visit information.

A member of the research team reviewed the submitted visit information prior to sending it to the dermatologist. Subjects were prompted to retake images and/or redo the visit if information was missing or images were of poor quality. Of the 54 intervention subjects, 39 had to resubmit 1 or more e-visits owing to either poor photo quality (n = 22) or a technical error, such as failing to attach photos to the visit (n = 17). Most errors occurred on the subject's first e-visit. Dermatologists responded to e-visit subjects within 3 business days. The e-visit platform allowed physicians to modify treatments, clarify the history, and attach electronic prescriptions.

Primary and secondary outcomes

The primary end point of this study was the change in total inflammatory lesion count (TILC)12,13 between the first and final study visits. Secondary end points included (1) changes in acne severity according to 3 other assessment measures: frontal inflammatory lesion count (FILC), the Burke and Cunliffe Leeds technique (Leeds),14 and forced choice; (2) subject and dermatologist satisfaction with care; and (3) subject and dermatologist time required to complete a visit.

Two raters were trained to carry out all acne assessment measures on the digital images. Raters were blinded to the treatment assignation of the subject and the visit number. The same rater was assigned all images for a particular subject.

Subject and provider satisfaction were assessed by surveys administered at the final visit. Some questions had been previously validated,8 whereas others were specifically designed for this study.

Time taken to complete visits was assessed by examining a random selection of 30 of each type of visit. At office visits a member of the research team used a stopwatch to record the length of time between the subject arriving at and leaving the clinic. The length of contact time with the physician was also recorded. E-visit timing was assessed by dermatologists themselves using a stopwatch.

Sample size

This was an equivalency trial evaluating any difference in the change in lesion counts between first and last visits across the 2 arms of the study. A difference of 10 lesions or more was considered clinically significant. To have 80% power to detect such a difference, assuming standard deviation (SD) of 20, a type 1 error level of .05, and a dropout rate of 25%, a total sample size of 151 subjects was required.
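
For illustration, the underlying two-sample power calculation can be reproduced as follows. This is only a sketch of the stated assumptions (difference of 10 lesions, SD of 20, two-sided α = .05, 80% power); the use of Python's statsmodels below is illustrative, as the article does not name the software used for the power analysis.

# Illustrative sketch only: two-sample t test power calculation using the
# assumptions stated above. The statsmodels package is an assumption; the
# article does not specify the software used for this calculation.
from statsmodels.stats.power import TTestIndPower

difference = 10.0              # clinically significant difference in lesion count
sd = 20.0                      # assumed standard deviation
effect_size = difference / sd  # standardized effect size (Cohen d) = 0.5

n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(round(n_per_group))      # approximately 64 completing subjects per group

# The article reports that, together with an allowance for a 25% dropout
# rate, the target enrollment was set at 151 subjects.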

Randomization

Following initial assessment and consent, subjects were assigned to the control or intervention arm at a 1:1 ratio on the basis of random number generation. Owing to the nature of the intervention, study staff, dermatologists, and subjects were not blinded to group assignation over the course of the study.
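
As a minimal sketch of this allocation step (the article specifies only that random number generation was used, so the exact procedure shown below is an assumption for illustration):

# Assumed illustration of 1:1 allocation by random number generation;
# the article does not describe the exact randomization procedure used.
import random

rng = random.Random()  # in practice the allocation sequence would be generated and held centrally

def assign_arm() -> str:
    # a random number below 0.5 assigns the intervention (e-visit) arm;
    # otherwise the subject is assigned to the control (office visit) arm
    return "intervention" if rng.random() < 0.5 else "control"

assignments = [assign_arm() for _ in range(151)]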

Statistical analysis

Continuous outcomes between groups were compared using the t test (for normally distributed outcomes) and the Wilcoxon rank-sum test (for nonnormally distributed outcomes and rank measures). Differences in proportions between groups were compared by using χ2 tests or Fisher exact tests when appropriate. All calculations were performed with SAS software, version 9.1 (SAS Institute Inc, Cary, North Carolina). A 2-sided P value of .05 was considered statistically significant. Average values are represented as means (SDs).
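
Purely as an illustration of the test families described above, the sketch below expresses the same comparisons with Python's scipy.stats on made-up placeholder data; the study itself used SAS 9.1, and none of the numbers below are study data.

# Illustrative sketch of the analyses described above, run on hypothetical
# placeholder data with scipy.stats (the study used SAS, version 9.1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
change_control = rng.normal(9.4, 20, 60)       # hypothetical lesion-count changes
change_intervention = rng.normal(6.7, 20, 61)  # hypothetical lesion-count changes

# Normally distributed continuous outcome: two-sample t test
t_stat, p_t = stats.ttest_ind(change_control, change_intervention)

# Nonnormally distributed or rank outcome: Wilcoxon rank-sum test
w_stat, p_w = stats.ranksums(change_control, change_intervention)

# Proportions (e.g., satisfied vs not satisfied, by group): chi-square test,
# or Fisher exact test when expected cell counts are small
table = np.array([[59, 1], [50, 5]])  # hypothetical 2 x 2 counts
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
odds_ratio, p_fisher = stats.fisher_exact(table)

print(f"t test P = {p_t:.2f}; rank-sum P = {p_w:.2f}; "
      f"chi-square P = {p_chi2:.2f}; Fisher P = {p_fisher:.2f}")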

Results
Patient demographics

A total of 151 subjects were enrolled, of whom 121 completed the study (80%) (Figure 2). Baseline demographic information is summarized in Table 1. The difference in dropout rate between the intervention and control groups was significant (P = .03), but there were no significant differences in demographics or baseline self-reported acne severity between subjects who completed and those who did not complete the trial.

Clinical outcomes

Figure 3 charts the mean lesion counts at each of the 5 visits. We conducted a per-protocol analysis using data from all 121 subjects who completed the study. The improvements (mean reduction in lesions) in TILC from visit 1 to visit 5 in the control and intervention groups were 9.4 and 6.7, respectively, a nonsignificant difference between groups of 2.72 lesions (95% confidence interval [CI], −5.54 to 10.99) (P = .49). A last value carried forward analysis, including those subjects who did not complete the study, also showed no significant difference in improvement between the control and intervention groups (8.62 and 4.99, respectively), a nonsignificant difference of 3.64 lesions (95% CI, −3.46 to 10.74) (P = .31). Furthermore, no significant differences in acne improvement were seen between the groups on 2 alternative acne assessment measures (Table 2) or the forced choice examination (the proportion of subjects whose last-visit image was deemed the best was 55% in both the control and intervention groups; P = .98). In the subset of 74 subjects who self-reported acne severity at baseline and the final visit, 65% of intervention subjects rated their acne as less severe at the end of the study compared with 44% of control subjects (P = .22).

Subject satisfaction

There were no significant differences between control and intervention groups in subject satisfaction with overall care (98% vs 91%) (P = .054) or belief that acne had improved (88% vs 91%) (P = .64). Control subjects were more likely to agree that the visit took too much time out of their day than were intervention subjects (34% vs 4%) (P < .001). Most subjects in the e-visit group agreed that their dermatologist could assess their acne just as well using an e-visit as in person (76%) and that they could express their concerns and questions about acne as well by e-visit as by office visit (83%). The proportion of subjects who would consider using e-visits again to deal with acne was even greater (91%).

Dermatologist satisfaction

On a scale of 1 to 10, there were no significant differences in dermatologists' satisfaction with the overall care they provided for control vs intervention subjects (9.39 vs 9.04) (P = .16) or with the subjects' acne improvement (8.92 vs 8.34) (P = .06). Dermatologists were more likely to report wishing that they could have managed their office visit subjects via e-visits than the reverse (P < .001). Most dermatologists managing subjects via e-visits (68%) reported that e-visits took less time to complete than office visits.

Timing

Subjects attending an office visit spent an average of 22 minutes (range, 15-35 minutes) in the physician's office, of which only 4 minutes, 8 seconds was spent with the dermatologist (range, 1 minute, 20 seconds to 7 minutes, 15 seconds). In addition, almost half of this group (45%) spent between 30 and 60 minutes traveling to the office. In contrast, 91% of e-visit subjects were able to complete their e-visits in less than 20 minutes. Dermatologists took comparable lengths of time to complete e-visits and office visits (4 minutes, 42 seconds and 4 minutes, 8 seconds, respectively) (P = .57). This time estimate reflects the length of time the physician spent on the e-visit Web site or with the patient, not the total time spent documenting the visit in an electronic medical record.

Subjects completed their e-visits at varying times throughout the day. In contrast to office visits, only 40% of subjects chose to complete their e-visit during the workday (8 AM-5 PM). Dermatologists were equally likely to complete their portion of the e-visit during the workday (50%) or after hours (50%).

Comment

In this trial, delivering follow-up care to subjects with mild to moderate acne via office and online visits produced equivalent clinical outcomes by several different metrics. There were no significant differences between groups in subject or physician satisfaction with care. E-visits were time saving for subjects and time neutral for dermatologists (P = .56).

These findings suggest that dermatologists obtain sufficient information from digital images and survey responses to make appropriate management decisions in the treatment of acne. In addition, this model of care delivery was popular with both physicians and patients, likely owing to the convenience and/or time savings associated with e-visits. Familiarity with online banking, travel, and shopping sites may promote patient interest in receiving the same level of convenience and 24/7 access to services in the health care industry, a field traditionally slow to respond to consumer preferences. Sixty percent of subjects completed e-visits outside of working hours, half of these between 6 PM and midnight. E-visits could generally be completed in less than 20 minutes, unlike clinic visits, which entailed time spent traveling, parking, and waiting for the physician. It appears that convenience may be at least as important to patients as other aspects of care, given that 91% of subjects would choose to receive acne care via e-visits in the future despite only 76% of e-visit subjects agreeing that their dermatologist could assess their skin as well as in person.

This study differs from the existing telemedicine literature in several key ways. Previous studies have focused on using technology as a tool for diagnosis, whereas we report on clinical outcomes. Only 1 study to our knowledge has addressed clinical outcomes in patients managed with store-and-forward teledermatology compared with conventional clinic care.15 However, the authors used a rating system of improved, no change, or worse to evaluate outcomes. Although it is reassuring that a similar proportion of patients in each group improved, this measure lacks the sensitivity to demonstrate equivalency, which we believe is an important prerequisite for physician adoption.

Previous studies have used technology to facilitate communication between specialist and referring physician, whereas our study allows a specialist to communicate directly with the patient, removing the need for an in-person visit or the involvement of another physician. It is a true visit replacement rather than merely an adjunct to conventional care. Finally, teledermatology has previously been viewed as a one-off consultation, whereas we demonstrate that these tools can successfully be used to deliver follow-up care over a period of several months.

There were several limitations to this study. Our study subjects were primarily white, educated women, so our results may not be generalizable to the wider population of patients with acne who may be less comfortable taking a more active role in their own care or in using technology. Our primary outcome measure, TILC, did not include comedones or cysts because neither class of lesion can be fully appreciated from a digital image. However, any undercounting should be similar across study arms, so we do not believe that the TILC outcome measure affects the validity of our findings.

The study experienced a dropout rate of 20%, with two-thirds of dropouts coming from the intervention group. However, our initial power calculation allowed for a 25% dropout rate, so we do not believe the 20% rate negatively impacted our findings. In addition, our intention-to-treat analysis, using last value carried forward as a conservative estimate of improvement, did not reveal any significant differences in the improvement seen between study groups, which suggests that the differential dropout rate across study arms did not affect the validity of our findings. However, it is worth noting that even motivated patients may require support and encouragement to adopt new methods of follow-up and that this should be anticipated in resource planning for introduction of such services into clinical practice.

Finally, we did not document therapies prescribed to patients in each group. We assumed that dermatologist variability was accounted for by randomization, which ensured that all physicians had both e-visit and office patients under their care.

This model of care could be expanded to other dermatologic and nondermatologic conditions. We hypothesize that a set of key characteristics defines conditions amenable to online management. Conditions should be nonurgent and chronic and should have physiologic parameters amenable to remote monitoring. They should also be prevalent in populations with high levels of technology adoption. Some examples include chronic skin diseases such as eczema or psoriasis, or general medical conditions such as type 2 diabetes mellitus or hypertension, for which glucose and blood pressure readings would replace digital images. Extending the applications of the e-visit platform would require further research to ensure that our finding of equivalent quality of care is maintained across other clinical conditions. Physicians have been slow to adopt information technology tools16 owing to concerns around reimbursement, data overload, and clinical outcomes; therefore, thoughtful integration within existing workflow, including electronic medical record integration, would be necessary to ensure participation. Health care reforms such as a shift from visit-based to outcome-based reimbursement might promote adoption of this type of care delivery platform.

In conclusion, we have demonstrated equivalent clinical outcomes in e-visit and conventional visit management of patients with acne. Further work should examine broader applications of e-visit technology both within the specialty of dermatology and beyond.

Correspondence: Alice J. Watson, MBChB, MRCP, MPH, 25 New Chardon St, Ste 400D, Boston, MA 02114 (ajwatson@partners.org).

Accepted for Publication: October 12, 2009.

Author Contributions: Drs Watson and Bergman contributed equally to this article. Drs Watson and Bergman had full access to all of the data in this study and take responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: Bergman, Williams, and Kvedar. Acquisition of data: Bergman and Williams. Analysis and interpretation of data: Watson, Bergman, Williams, and Kvedar. Drafting of the manuscript: Watson and Bergman. Critical revision of the manuscript for important intellectual content: Watson, Bergman, Williams, and Kvedar. Statistical analysis: Watson and Bergman. Obtained funding: Kvedar. Administrative, technical, and material support: Williams. Study supervision: Watson and Kvedar.

Financial Disclosure: None reported.

Funding/Support: This study was supported in part by a grant from the Information Systems Research Council at Partners Healthcare, Boston, Massachusetts.

Role of the Sponsor: The sponsor had no role in the design or conduct of the study; in the collection, analysis, or interpretation of data; or in the preparation, review, or approval of the manuscript.

Previous Presentations: Preliminary results from this study were presented at the Society for Investigative Dermatology Annual Meeting; May 9-12, 2007; Los Angeles, California; at the American Telemedicine Association Annual Meeting; April 6-8, 2008; Seattle, Washington; and at the American Academy of Dermatology Annual Meeting; March 6-10, 2009; San Francisco, California.

Additional Contributions: Regina Nieves, Abby Cange, BSc, Brian Hammond, BA, Elkan Halpern, PhD, and Abrar Qureshi, MD, PhD, assisted with this project. The cooperation and support of the physicians and staff of the dermatology departments at Massachusetts General Hospital and Brigham and Women's Hospital made this project possible.

References
1.
Massachusetts Medical Society. 2008 Physician Workforce Study: Executive Summary. http://www.massmed.org/AM/Template.cfm?CONTENTID=23165&Template=/CM/ContentDisplay.cfm
2.
Resneck J Jr. Too few or too many dermatologists? difficulties in assessing optimal workforce size. Arch Dermatol. 2001;137(10):1295-1301.
3.
Suneja T, Smith ED, Chen GJ, Zipperstein KJ, Fleischer AB Jr, Feldman SR. Waiting times to see a dermatologist are perceived as too long by dermatologists: implications for the dermatology workforce. Arch Dermatol. 2001;137(10):1303-1307.
4.
Tsang MW, Resneck JS Jr. Even patients with changing moles face long dermatology appointment wait-times: a study of simulated patient calls to dermatologists. J Am Acad Dermatol. 2006;55(1):54-58.
5.
WorldStats. Internet Usage Statistics for the Americas. http://www.internetworldstats.com/stats2.htm. Accessed September 15, 2008.
6.
Qureshi AA, Brandling-Bennett HA, Giberti S, McClure D, Halpern EF, Kvedar JC. Evaluation of digital skin images submitted by patients who received practical training or an online tutorial. J Telemed Telecare. 2006;12(2):79-82.
7.
Qureshi AA, Brandling-Bennett HA, Wittenberg E, Chen SC, Sober AJ, Kvedar JC. Willingness-to-pay stated preferences for telemedicine versus in-person visits in patients with a history of psoriasis or melanoma. Telemed J E Health. 2006;12(6):639-643.
8.
Eminovic N, Witkamp L, de Keizer NF, Wyatt JC. Patient perceptions about a novel form of patient-assisted teledermatology. Arch Dermatol. 2006;142(5):648-649.
9.
Weinstock MA, Nguyen FQ, Risica PM. Patient and referring provider satisfaction with teledermatology. J Am Acad Dermatol. 2002;47(1):68-72.
10.
Whited JD, Hall RP, Foy ME, et al. Patient and clinician satisfaction with a store-and-forward teledermatology consult system. Telemed J E Health. 2004;10(4):422-431.
11.
Whited JD. Teledermatology research review. Int J Dermatol. 2006;45(3):220-229.
12.
Lucky AW, Barber BL, Girman CJ, Williams J, Ratterman J, Waldstreicher J. A multirater validation study to assess the reliability of acne lesion counting. J Am Acad Dermatol. 1996;35(4):559-565.
13.
Bergman H, Tsai KY, Seo SJ, Kvedar JC, Watson AJ. Remote assessment of acne: the use of acne grading tools to evaluate digital skin images. Telemed J E Health. 2009;15(5):426-430.
14.
Burke BM, Cunliffe WJ. The assessment of acne vulgaris: the Leeds technique. Br J Dermatol. 1984;111(1):83-92.
15.
Pak H, Triplett CA, Lindquist JH, Grambow SC, Whited JD. Store-and-forward teledermatology results in similar clinical outcomes to conventional clinic-based care. J Telemed Telecare. 2007;13(1):26-30.
16.
Grant RW, Campbell EG, Gruen RL, Ferris TG, Blumenthal D. Prevalence of basic information technology use by U.S. physicians. J Gen Intern Med. 2006;21(11):1150-1155.