Lam C, Liu SW, Townsend HB, Femia AN, Qureshi AA, Vleugels RA. Development and Pilot Testing of the Cutaneous Lupus Screening Tool. JAMA Dermatol. 2016;152(1):60–66. doi:10.1001/jamadermatol.2015.3088
Copyright 2016 American Medical Association. All Rights Reserved. Applicable FARS/DFARS Restrictions Apply to Government Use.
Importance
Patients with cutaneous lupus erythematosus (CLE) experience significant morbidity and poor quality of life. In the absence of a dermatologist’s examination, no reliable tool exists to confirm whether a patient has CLE for use in epidemiologic studies.
Objective
To determine whether the Cutaneous Lupus Screening (CLUSE) tool can detect cases of CLE by measuring its performance in individuals with dermatologist-diagnosed CLE compared with individuals without CLE.
Design, Setting, and Participants
The CLUSE tool is a novel, self-administered questionnaire with 15 closed-ended questions derived from the Delphi method. It includes features of disease validation for CLE as well as its most common phenotypes. This pilot study was administered during a 1-year period (July 1, 2011, to June 30, 2012) in outpatient dermatology clinics at an academic medical center. Data analysis was performed July 1, 2012, to November 30, 2013. Participants were individuals 18 years or older who had a definitive diagnosis of CLE or any other non-CLE dermatologic condition as established by a board-certified dermatologist. Eligible patients were recruited consecutively, and no individual approached declined to participate.
Main Outcomes and Measures
Sensitivity and specificity of the individual questions from the CLUSE tool in predicting CLE, comparisons between summary scores for the dichotomous questions between the CLE cases and non-CLE controls, and 9 scoring algorithms that assign a diagnosis of CLE and its subtypes depending on an individual’s response to each question.
Results
A total of 133 patients were given the CLUSE tool; 16 participants were excluded. Responses from 117 individuals were collected for analysis and included 24 CLE cases and 93 non-CLE cases. In the 117 questionnaires analyzed, mean (SD) and median (interquartile range) CLUSE scores differed in the CLE (5.6 [2.1] and 5.5 [3-10], respectively) vs non-CLE (0.96 [1.6] and 0 [0-7], respectively) groups (all P < .001). Of the 9 algorithms, algorithm 9, used for diagnosing CLE regardless of subtype, demonstrated the highest sensitivity (87.5%) and high specificity (96.8%).
Conclusions and Relevance
A combination of questions and representative photographs can ascertain cases of CLE with high sensitivity and specificity. The CLUSE tool is a brief, self-administered questionnaire with low respondent burden used for the identification of CLE. In the future, this questionnaire will be administered to large, established patient databases to gather epidemiologic data on this disease.
Lupus erythematosus (LE) is an autoimmune disorder with heterogeneous clinical manifestations ranging from fulminant internal organ involvement in systemic LE (SLE) to skin-limited disease in cutaneous LE (CLE). Patients with CLE experience significant morbidity and poor quality of life.1-3 In addition, the socioeconomic effect of LE is considerable, owing to the frequent early age of presentation, chronic nature of the disease, and cost of therapy.4,5
A paucity of epidemiologic data regarding CLE exists in the literature. A major reason for this limited information is the lack of well-developed outcome measures that can be used to identify disease in a large population. Several population-based studies regarding the incidence and clinical characteristics of CLE have been published6-9; however, limitations include their retrospective design and selective patient populations. In addition, owing to differing methodologies of data collection and analysis, as well as variation in inclusion criteria, the studies cannot be compared. For instance, some studies used International Classification of Diseases diagnostic coding,7,8 whereas others searched for key words from patient databases6 or retrieved data from mailed questionnaires.9
Similar cutaneous findings can be observed both in CLE alone and in SLE with cutaneous involvement and have been broadly classified into LE-specific and LE-nonspecific lesions depending on characteristic or noncharacteristic histopathologic features, respectively.10,11 The LE-specific lesions are further subdivided into 3 categories: chronic CLE, subacute CLE (SCLE), and acute CLE (ACLE).12,13 Chronic CLE is the largest subset, and discoid CLE (DLE) lesions are the most common.13 Other, less common, chronic CLE lesions include chilblain LE, lupus panniculitis or profundus, mucosal LE, lupus tumidus, hyperkeratotic LE, and lichenoid LE.11,14
The standard for the diagnosis of CLE is a physical examination by a board-certified dermatologist using a combination of clinical and histopathologic features. Thus, in the absence of a dermatologist’s examination, no reliable tool exists that confirms whether a patient has CLE rather than other dermatologic disorders for use in epidemiologic studies. In addition, proper determination of a patient’s CLE subtype at the time of diagnosis may affect management considerations since certain forms, such as ACLE, are most often associated with systemic manifestations, whereas others, particularly DLE, may result in permanent scarring and disfigurement.12 However, complete subtyping of each patient with CLE by a dermatologist with sufficient specificity for inclusion in large epidemiologic studies would be impractical and costly.
In response to these needs, we developed and pilot tested the Cutaneous Lupus Screening (CLUSE) tool in an outpatient dermatology setting at an academic medical center. CLUSE is a self-administered questionnaire with 15 closed-ended questions and includes features of disease validation for CLE as well as its various phenotypes. The goal of this pilot study was to provide data from use of the CLUSE tool, measuring performance in individuals with dermatologist-diagnosed CLE compared with individuals without CLE.
Methods
The 15 questions used in the CLUSE tool (Figure) were derived from expert opinion and the Delphi method and required yes or no answers. Four faculty experts in dermatology and rheumatology (including H.B.T., A.A.Q., and R.A.V.) participated, and the questionnaire underwent 3 iterations before the CLUSE tool was finalized. The initial pool of 45 questions was reduced to the final 15 closed-ended questions over these iterations to make the CLUSE tool straightforward and succinct. The first 3 questions (Q1, Q2, and Q3) were aimed at identifying which health care professional diagnosed CLE in that patient. The next 6 questions depict the 3 main subtypes of CLE (Q4 for ACLE; Q5, Q6, and Q7 for DLE; and Q8 and Q9 for SCLE) in representative color photographs. Question 10 refers to a symptom generally associated with SLE (ie, diffuse hair loss),15,16 whereas Q11 describes photosensitivity, which can be present in all of the CLE subsets. The last 4 questions (Q12, Q13, Q14, and Q15) pertain to other cutaneous conditions that may mimic CLE, such as eczema, psoriasis, urticaria, and rosacea, respectively. Other forms of CLE were excluded from the questionnaire because they are less commonly seen. The tool was designed to be a brief, 1-page questionnaire with succinct and declarative questions about CLE. Adding photographs of the less common CLE subtypes would have increased the time spent and hindered the ease of administration of the CLUSE tool.
We evaluated the questionnaire in 2 ways. First, we examined the sensitivity and specificity of the dichotomous questions in predicting CLE. In addition, the dichotomous questions Q1 and Q3 through Q11 were each assigned a score of 1 for a response of yes and 0 for a response of no, yielding a summary score ranging from 0 to 10. Questions 12 through 15 aimed to capture individuals who may have a skin condition mimicking CLE.
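The summary score described above is simple to express in code. The sketch below (Python) is our own illustration, not part of the published CLUSE tool; the dictionary keys ("Q1", "Q3", ..., "Q11") and the function name are assumptions made for clarity.

```python
# Illustrative sketch only: the CLUSE tool is a paper questionnaire, and the
# question keys and function name below are our own assumptions, not part of
# the published instrument.

SCORED_ITEMS = ["Q1"] + [f"Q{i}" for i in range(3, 12)]  # Q1 and Q3-Q11: 10 items

def cluse_summary_score(responses: dict) -> int:
    """Sum 1 point per scored dichotomous item answered yes (True); range 0-10."""
    return sum(1 for q in SCORED_ITEMS if responses.get(q, False))
```

Items Q12 through Q15 are deliberately excluded from the sum, since they flag potential CLE mimickers rather than features of CLE itself.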
Second, 9 scoring algorithms were developed that assign a diagnosis of CLE and its subtypes depending on the individual’s response to each question (Table 1). Each algorithm was evaluated individually and independently from the others. These algorithms were developed on the basis of multiple a priori hypotheses elaborated through the Delphi method. For example, algorithm 1 assigned a diagnosis of DLE if the patient answered yes to Q3 and Q6 or Q7. This algorithm was based on the a priori hypothesis that a patient who received a CLE diagnosis from a dermatologist and had characteristic discoid lesions (Q6 or Q7) likely represented a true diagnosis of DLE. Algorithm 2 assigned a diagnosis of DLE if the patient answered yes to having characteristic scarring alopecia caused by DLE (Q5) and similar discoid lesions on another body site (Q6 or Q7). This algorithm postulated that the classic discoid lesions were sufficiently recognizable in morphology and distribution that patients would be able to identify them, even in the absence of a formal diagnosis of CLE by a health care professional.
Similar to algorithm 1, algorithm 3 assigned a diagnosis of SCLE if the patient had received a CLE diagnosis from a dermatologist (Q3) and had either a characteristic annular (Q8) or papulosquamous (Q9) eruption. Algorithm 4 ascribed a diagnosis of SCLE if the patient received the CLE diagnosis from a nondermatologist physician (Q1) and had either a characteristic annular (Q8) or papulosquamous (Q9) eruption that worsened on sun exposure (Q11). By adding photosensitivity to the questions, this algorithm underscored the strong association between SCLE and photosensitivity9 and hypothesized that the diagnosis of SCLE, which may be more challenging by way of its broad differential diagnosis, was made more specific for nondermatologist physicians.
Algorithm 5 assigned a diagnosis of ACLE if the patient received the diagnosis from a nondermatologist physician (Q1), had the characteristic malar rash (Q4), and experienced either diffuse hair loss (Q10) or photosensitivity (Q11), both of which are features of SLE. In a similar fashion to algorithm 4, algorithm 5 hypothesized that the diagnosis of ACLE, which may be more challenging by way of its broad differential diagnosis, was made more specific for nondermatologist physicians by adding other recognized features of SLE. Algorithm 6 attributed a diagnosis of ACLE if a dermatologist made the diagnosis and the patient had the characteristic malar rash that worsened on sun exposure. This algorithm was based on the most likely signs and symptoms of a patient with ACLE presenting to a dermatologist’s office.
The final 3 algorithms pertain to CLE as a whole, regardless of subtype, and were derived from the hypotheses stated above. Algorithm 7 involved a dermatologist-made diagnosis of CLE and patient-reported photosensitivity. Algorithm 8 consisted of a CLE diagnosis by a nondermatologist physician and any characteristic cutaneous lesion (yes to any of Q5 through Q9) that was photoexacerbated. Algorithm 9 was a diagnosis of CLE by a dermatologist and any characteristic cutaneous lesion (yes to any of Q5 through Q9). We also performed unsupervised analysis evaluating algorithms that had high sensitivity and specificity for dermatologist-diagnosed CLE.
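Because each of the 9 algorithms is a Boolean combination of yes/no responses, each can be written as a short predicate. The Python sketch below illustrates three of them (algorithms 1, 7, and 9) as described above; the question keys and function names are hypothetical and not taken from the published tool.

```python
# Illustrative sketch only: question keys and function names are our own
# assumptions. Each algorithm is a Boolean predicate over yes (True) / no
# (False) responses, per the descriptions in the text.

def _all_yes(r, *qs):
    """True if the respondent answered yes to every listed question."""
    return all(r.get(q, False) for q in qs)

def _any_yes(r, *qs):
    """True if the respondent answered yes to at least one listed question."""
    return any(r.get(q, False) for q in qs)

def algorithm_1(r):
    """DLE: dermatologist-made CLE diagnosis (Q3) plus a discoid lesion (Q6 or Q7)."""
    return _all_yes(r, "Q3") and _any_yes(r, "Q6", "Q7")

def algorithm_7(r):
    """CLE: dermatologist-made diagnosis (Q3) plus photosensitivity (Q11)."""
    return _all_yes(r, "Q3", "Q11")

def algorithm_9(r):
    """CLE regardless of subtype: Q3 plus any characteristic lesion (Q5-Q9)."""
    return _all_yes(r, "Q3") and _any_yes(r, "Q5", "Q6", "Q7", "Q8", "Q9")
```

The remaining algorithms follow the same pattern, differing only in which questions they combine.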
Institutional review board approval was obtained from Partners Healthcare System. Participants provided oral informed consent; they did not receive financial compensation. Eligible participants were 18 years or older and had a definitive diagnosis of CLE or any other non-CLE dermatologic condition as established by a board-certified dermatologist. Individuals younger than 18 years and those in whom the diagnosis was equivocal—CLE or otherwise—were excluded from this study. Study staff (C.L. and R.A.V.) reviewed the dermatologic medical records of potential patients and recruited eligible patients consecutively from dermatology and connective tissue disease outpatient dermatology clinics at Brigham and Women’s Hospital. This recruitment was independent of the distribution of sex, age, or ethnic group and was conducted by study staff during a 1-year period (July 1, 2011, to June 30, 2012). Study staff asked participants to answer all questions without assistance. For any non–English-speaking patient, an interpreter was available to clarify the questionnaire but not to provide answers. Data regarding sex, age, race, and ethnicity were also obtained from all participants as well as definitive disease diagnoses for all non-CLE participants from medical record review.
Data analysis was conducted from July 1, 2012, to November 30, 2013. Eligible participants who self-administered the questionnaire and answered all questions relevant to the algorithms detailed above were included in this study conducted in an adult population attending dermatology clinics. Individuals with missing data were excluded from the analysis. Demographics of the patient population were summarized for CLE and non-CLE cases. The significance of differences in these 2 groups was assessed with unpaired, 2-tailed t tests for continuous variables and χ2 tests for nominal variables.
We measured the performance of individual questions and algorithms by comparing each with the standard: a dermatologist’s diagnosis for the presence or absence of CLE and/or the appropriate subtype. Sensitivity, specificity, positive predictive value, and negative predictive value were calculated for individual questions and algorithms. We compared the performance of the scoring algorithms and individual questions in distinguishing CLE and its 3 main subtypes from non-CLE. A combination of the questions with the best discrimination across subtypes was then further analyzed for sensitivity and specificity in diagnosing CLE regardless of subtype.
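The four performance measures are all derived from the 2×2 table comparing an algorithm's output with the reference standard (the dermatologist's diagnosis). As a hypothetical illustration (the function name and interface below are our own):

```python
# Illustrative sketch only: metrics are derived from the 2x2 table comparing
# algorithm output (predictions) with the reference standard, a
# dermatologist's diagnosis (truths). Inputs are parallel boolean sequences.

def diagnostic_metrics(predictions, truths):
    tp = sum(1 for p, t in zip(predictions, truths) if p and t)
    tn = sum(1 for p, t in zip(predictions, truths) if not p and not t)
    fp = sum(1 for p, t in zip(predictions, truths) if p and not t)
    fn = sum(1 for p, t in zip(predictions, truths) if not p and t)
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,  # TP / all true cases
        "specificity": tn / (tn + fp) if tn + fp else None,  # TN / all true non-cases
        "ppv": tp / (tp + fp) if tp + fp else None,          # TP / all positive calls
        "npv": tn / (tn + fn) if tn + fn else None,          # TN / all negative calls
    }
```

A denominator of zero (eg, no true cases of a subtype in the sample) leaves the corresponding measure undefined, which is why sensitivities could not be calculated for the ACLE algorithms in this study.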
Results
A total of 133 patients were given the CLUSE tool by study staff. Among these, 16 participants were excluded from analysis for (1) having missing data on their questionnaires (n = 12); (2) having tumid LE, a less common variant of chronic CLE not covered by the photographs of the questionnaire (n = 3); and (3) having a diagnosis that changed over the course of the study (n = 1). Thus, responses from 117 individuals were collected for analysis and included 24 CLE cases and 93 non-CLE cases (Table 2). There was a strong preponderance of women in both the CLE (22 [91.7%]) and non-CLE (66 [71.0%]) groups. In addition, most participants were white, particularly in the non-CLE (78 [83.9%]) group, whereas the CLE group included only 14 (58.3%) white participants, with 10 (41.7%) participants in the CLE group being Hispanic, Asian, or black. The age at onset of CLE ranged from 18 to 60 years (median, 30 years). The duration of CLE ranged from 6 months to 35 years (median, 11 years). These data were missing for 2 individuals with CLE. The mean (SD) and median (interquartile range) CLUSE scores were significantly different in the CLE (5.6 [2.1] and 5.5 [3-10], respectively) and non-CLE (1.0 [1.6] and 0 [0-7], respectively) groups (P < .001 for each comparison) (Table 2).
There were no ACLE cases in this pilot study. The number of DLE and SCLE cases was similar (11 patients [45.8%] and 9 [37.5%], respectively), and 4 patients (16.7%) had both DLE and SCLE lesions. Ten of the 24 patients with CLE (41.7%) had underlying SLE.
Of the 93 patients in the non-CLE group, a variety of dermatologic conditions were represented (Table 3). The most common condition was psoriasis, which included all subtypes (33 patients [35.5%]).
Sixteen of 24 participants (66.7%) with CLE answered yes to Q1 (whether the diagnosis of CLE was made by a nondermatologist); of these respondents, 7 (29.2%) had DLE only, 6 (25.0%) had SCLE only, and 3 (12.5%) had both SCLE and DLE. Twenty-two of 24 individuals (91.7%) answered yes to Q3 (whether the diagnosis was made by a dermatologist); the 2 participants (8.3%) who responded no were given the CLUSE tool at their initial consultation with a dermatologist and had been long-time patients of a rheumatologist. Two participants (8.3%) (1 with DLE and 1 with SCLE) answered yes to Q2 (whether the diagnosis was made by a nurse practitioner or physician assistant), but both of these also replied yes to Q1 and Q3. Concerning Q10 (whether the patient experienced diffuse hair loss), 12 CLE participants (50.0%) answered yes, 6 of whom had underlying SLE. The 3 DLE-only patients (12.5%) had scalp involvement and had also answered yes to Q5 (photograph depicting discoid lupus of the scalp). For Q11 (whether the patient experienced photosensitivity), 19 individuals (79.2%) answered yes, including 6 of 11 with DLE (54.5%), all 9 of those with SCLE (47.3%), and all 4 respondents with both SCLE and DLE (21.0%). The 5 patients (20.8%) who replied no to Q11 had a diagnosis of DLE, and 2 of these individuals (40.0%) had underlying SLE.
Although there were no cases of ACLE, a total of 10 individuals (41.7%) with CLE had underlying SLE. Thirteen of the 24 individuals (54.2%) with CLE reported having a malar rash (yes to Q4). Of these 13 individuals, 9 had SCLE (69.2%) (7 had a rash on their face associated with CLE, eczema, or rosacea), 2 (15.4%) had DLE involving the face, and 2 (15.4%) had SCLE and DLE also affecting the face. Six of the 13 individuals (46.2%) who reported having a malar rash had underlying SLE and photosensitivity (yes to Q11). Of 11 DLE-only cases, 10 participants (90.9%) answered yes to Q5, Q6, or Q7 (all photographs relating to DLE skin lesions). Eight of 9 individuals (88.9%) with only SCLE replied yes to Q8 or Q9 (both classic photographic representations of SCLE). Finally, all of the 4 respondents with DLE and SCLE answered yes to at least 1 of Q5 through Q9.
Table 4 presents the sensitivity, specificity, positive predictive value, and negative predictive value for each scoring algorithm and each scored question of the CLUSE tool. Because no individuals with ACLE were recruited during the study period, sensitivities and positive predictive values were not calculated for algorithms 5 and 6. Of the 9 algorithms, algorithm 9 (if yes to Q3 and any of Q5 through Q9), used for diagnosing CLE regardless of subtype, demonstrated the highest sensitivity (87.5%). In addition, algorithm 9 had a high specificity (96.8%). Algorithm 7 (if yes to Q3 and Q11) also showed a reasonable sensitivity (70.8%) and high specificity (97.8%) for diagnosing CLE. In terms of CLE subtype, algorithm 3 (if yes to Q3 and Q8 or Q9) revealed the highest sensitivity (76.9%) as well as a high specificity (97.1%) for diagnosing SCLE. The lowest sensitivity (33.3%) was observed with algorithm 2 (if yes to Q5 and Q6 or Q7), which was used for diagnosis of DLE. All algorithms had high specificities, with the lowest being algorithm 6 (88.9%).
Regarding individual questions, Q3 (whether the diagnosis of CLE was made by a dermatologist) had the highest sensitivity (91.7%) and high specificity (93.5%). The question with the second-highest sensitivity (79.2%) was Q11 (whether the patient experienced photosensitivity). Combining these questions produced algorithm 7, with a sensitivity of 70.8% and specificity of 97.8%.
Discussion
Cutaneous lupus erythematosus is a chronic inflammatory disorder of the skin that remains poorly understood and difficult to study. There is a need for tools, such as the CLUSE questionnaire, to confirm cases of CLE for use in epidemiologic studies. We have shown in this study that a combination of questions and representative photographs can ascertain cases of CLE with high sensitivity and specificity—we obtained high specificities for all 9 algorithms developed. The CLUSE questionnaire was designed as a 1-page document to be brief and easy to administer, to pose no discomfort or risk to the patient, and to function visually. We plan to administer this questionnaire to individuals in large, established patient databases to gather epidemiologic data on this disease.
Our study had several limitations. To fully appreciate the quality of the color clinical photographs, the questionnaire should be printed on glossy photographic paper, which is more costly than regular printing paper. However, given the cost of other screening modalities, such as a complete examination by a dermatologist, the cost of the CLUSE tool appears reasonable. When developing this tool, we elaborated algorithms for the 3 main subtypes of CLE as well as for CLE as a whole. A unifying and all-encompassing questionnaire was challenging to produce given the many clinical presentations and subtypes of CLE. In addition, although dermatologic textbooks describe classic morphologies for these cutaneous findings, patients may have overlapping conditions or atypical features and thus not fit precisely into a single diagnostic category. Also, owing to the space limitations of a 1-page questionnaire, there is limited variability in the clinical photographs included. Therefore, although the CLUSE tool used 3 distinct algorithms for CLE, regardless of subtype, and included 6 questions with 14 color images, it was not designed to detect atypical manifestations of CLE. Another limitation is that potential lupus mimickers, such as polymorphous light eruption and photoallergic or phototoxic eruptions, were not captured in this small pilot study. Therefore, further investigation is warranted to determine whether these photosensitive conditions could potentially be misclassified in a larger cohort study. Finally, in this pilot study, we administered the CLUSE questionnaire to patients in dermatology outpatient clinics in a hospital setting, hence limiting the generalizability of the results. Our plan is to pilot test this tool in other outpatient clinics, including rheumatology clinics.
The question with the highest sensitivity was Q3 (whether the diagnosis of CLE was made by a dermatologist). When this question was combined with any of the 5 representative photographs depicting DLE or SCLE (algorithm 9), the highest sensitivity of all algorithms (87.5%) was obtained. The algorithms with the lowest sensitivities pertained to DLE, which was unexpected given that this subtype tends to have more frequent classic cutaneous findings and distribution relative to other subtypes of CLE. In fact, algorithm 2 used only clinical photographs and scored the lowest sensitivity of all algorithms (33.3%). All algorithms demonstrated high specificity (≥89%), which would potentially allow the CLUSE tool to be used to capture individuals with CLE from large patient cohorts. The difference in total CLUSE scores between the CLE and non-CLE cases was also statistically significant (P < .001 for each comparison). When analyzing the responses among respondents in the non-CLE group who answered yes to Q12 through Q15 (whether the patient had eczema, psoriasis, hives, or rosacea, respectively), most of these participants did not also answer yes to Q4 through Q9 (the clinical photographs representing ACLE, SCLE, and DLE). This finding is important because the above dermatoses may mimic the various subtypes of CLE clinically.
Conclusions
The CLUSE tool is a brief, self-administered questionnaire with low respondent burden used for the identification of CLE. It is meant as a screening tool, not as a substitute for a thorough examination by a dermatologist. The novel features of the CLUSE tool include high-quality representative clinical images and inclusion of the main subphenotypes of CLE. This pilot study reveals reasonably high sensitivity and specificity for CLE. The high specificity of the CLUSE tool indicates that it may be advantageous in capturing individuals with CLE from a larger cohort to use for epidemiologic studies.
Corresponding Author: Christina Lam, MD, Department of Dermatology, Boston University School of Medicine, 609 Albany St, J-105, Boston, MA 02118 (firstname.lastname@example.org).
Accepted for Publication: July 23, 2015.
Published Online: October 28, 2015. doi:10.1001/jamadermatol.2015.3088.
Author Contributions: Drs Lam and Vleugels had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Lam, Qureshi, Vleugels.
Acquisition, analysis, or interpretation of data: Lam, Liu, Townsend, Femia, Vleugels.
Drafting of the manuscript: Lam, Townsend.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Lam, Liu.
Obtained funding: Vleugels.
Administrative, technical, or material support: Lam, Townsend, Femia.
Study supervision: Qureshi, Vleugels.
Conflict of Interest Disclosures: Dr Lam reported receiving grant support from Biogen IDEC for a study unrelated to the present report. Dr Qureshi reported serving as a consultant to and receiving speaking fees and honoraria from AbbVie, Amgen, the Centers for Disease Control and Prevention, Janssen, Merck, Novartis, and Pfizer. No other disclosures were reported.
Funding/Support: This work was supported by a Medical Dermatology Career Development Award from the Dermatology Foundation (Dr Vleugels).
Role of the Funder/Sponsor: The Dermatology Foundation had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Additional Contributions: Cara J. Joyce, PhD, Department of Biostatistics and Bioinformatics, DePaul University, provided statistical support; she received no financial compensation. We thank the patients for granting permission to publish this information.