Key Points
Question
Is electronic directly observed therapy (DOT) noninferior to in-person DOT in supporting medication adherence for tuberculosis treatment?
Findings
In this randomized, 2-period crossover noninferiority trial of 216 patients with tuberculosis, the modified intention-to-treat analysis estimated that staff observed complete ingestion of 87.2% of medication doses with in-person DOT vs 89.8% with electronic DOT. The percentage difference between DOT methods was −2.6%, and its upper 95% confidence limit fell below the prespecified noninferiority margin of 10%.
Meaning
These findings suggest that electronic DOT was noninferior to in-person DOT when employed by a tuberculosis program that has historically implemented in-person DOT successfully.
Importance
Electronic directly observed therapy (DOT) is used increasingly as an alternative to in-person DOT for monitoring tuberculosis treatment. Evidence supporting its efficacy is limited.
Objective
To determine whether electronic DOT can attain a level of treatment observation as favorable as in-person DOT.
Design, Setting, and Participants
This was a 2-period crossover, noninferiority trial with initial randomization to electronic or in-person DOT at the time outpatient tuberculosis treatment began. The trial enrolled 216 participants with physician-suspected or bacteriologically confirmed tuberculosis from July 2017 to October 2019 in 4 clinics operated by the New York City Health Department. Data analysis was conducted between March 2020 and April 2021.
Interventions
Participants were asked to complete 20 medication doses using 1 DOT method, then switched methods for another 20 doses. With in-person therapy, participants chose clinic or community-based DOT; with electronic DOT, participants chose live video-conferencing or recorded videos.
Main Outcomes and Measures
Difference between the percentage of medication doses participants were observed to completely ingest with in-person DOT and with electronic DOT. Noninferiority was demonstrated if the upper 95% confidence limit of the difference was 10% or less. We estimated the percentage of completed doses using a logistic mixed effects model, run in 4 modes: modified intention-to-treat, per-protocol, per-protocol with 85% or more of doses conforming to the randomization assignment, and empirical. Confidence intervals were estimated by bootstrapping (with 1000 replicates).
Results
There were 173 participants in each crossover period (median age, 40 years [range, 16-86 years]; 140 [66%] men; 80 [37%] Asian and Pacific Islander, 43 [20%] Black, and 71 [33%] Hispanic individuals) evaluated with the model in the modified intention-to-treat analytic mode. The percentage of completed doses with in-person DOT was 87.2% (95% CI, 84.6%-89.9%) vs 89.8% (95% CI, 87.5%-92.1%) with electronic DOT. The percentage difference was −2.6% (95% CI, −4.8% to −0.3%), consistent with a conclusion of noninferiority. The 3 other analytic modes yielded equivalent conclusions, with percentage differences ranging from −4.9% to −1.9%.
Conclusions and Relevance
In this trial, the percentage of completed doses under electronic DOT was noninferior to that under in-person DOT. This trial provides evidence supporting the efficacy of this digital adherence technology, and for the inclusion of electronic DOT in the standard of care.
Trial Registration
ClinicalTrials.gov Identifier: NCT03266003
In the US, directly observed therapy (DOT) is a key component of tuberculosis (TB) control. During DOT, TB program staff observe patients ingest medication in locations convenient to patients.1-11 This approach is costly and poses logistical challenges for TB programs and patients.12,13 In response, programs have sought to capitalize on advances in communication technology to develop alternatives to in-person DOT. Use of one such approach, electronic DOT, has steadily increased in recent years.14 Electronic DOT employs personal electronic devices, particularly smartphones with video capabilities, to remotely observe patients ingesting their medications.
The US Centers for Disease Control and Prevention (CDC) and World Health Organization’s (WHO) Global Task Force on Digital Health for TB promote the use of digital technologies to address challenges in TB prevention, care, and control in a patient-centered manner.15,16 Robust evidence across different population groups to support these recommendations is limited. A recent review of digital technology to enhance TB control identified 19 studies that reported electronic DOT was feasible, acceptable, and associated with good treatment adherence; most of these studies were observational.17
Three randomized trials have reported higher levels of treatment observation,18,19 comparable treatment completion rates,20 lower program-incurred costs,18,20 lower patient-incurred costs,19 and greater satisfaction19,20 among patients who used electronic DOT compared with patients randomized to in-person DOT. These results are encouraging. However, 2 of the 3 trials used only clinic-based in-person DOT as a comparator.19,20 Multiple studies have demonstrated that when patients are directed to undergo DOT in clinical facilities, the studied cohorts often have lower treatment completion rates,21-24 less treatment success,24 increased mortality,24 less satisfaction with their care,21 and higher out-of-pocket costs21 compared with patients who undergo DOT in a more patient-centered manner, such as within their homes or in community-based settings.24,25 The Story et al18 trial randomized participants to clinic-, home-, or community-based in-person DOT or to electronic DOT using patient-recorded videos asynchronously viewed by treatment observers. This trial included a large percentage of patients with social features often associated with poor adherence. While it demonstrated that electronic DOT was advantageous for a difficult-to-reach population, overall adherence was low in both groups.18
We sought to determine whether electronic DOT could achieve as high a level of treatment observation in a large multiclinic TB control program as could be achieved with in-person DOT conducted at patient-preferred locations. Implemented under pragmatic conditions with a diverse patient population in a large urban TB program, our study was designed to expand the knowledge base on digital adherence technologies.
Objective and Study Design
Our primary objective was to assess the difference in the percentage of medication doses that staff observed participants completely ingest with electronic vs in-person DOT. We used a randomized, 2-arm, 2-period crossover, noninferiority design. This trial was conducted in 4 clinics operated by the New York City (NYC) Department of Health and Mental Hygiene, Bureau of Tuberculosis Control (BTBC). The trial protocol (Supplement 1) was approved by institutional review boards at the NYC Department of Health and Mental Hygiene and Columbia University. All participants provided written informed consent. The study followed the Consolidated Standards of Reporting Trials (CONSORT) reporting guideline.
Enrollment began July 2017 and ended October 2019. We included persons aged 12 years and older with a suspected, laboratory-confirmed, or clinical diagnosis of TB disease who were prescribed oral anti-TB medication,26 had a residence location accessible to staff, and had no plans to move for 9 months. We recruited English- and non-English-speaking participants using bilingual staff, contracted interpreters, and translated data collection forms. To assess whether participants were similar across analytic groups and representative of the NYC BTBC patient population, participants' self-reported race and ethnicity data were retrieved from their Department of Health and Mental Hygiene electronic medical records. Persons were excluded if they were prescribed injectable TB medications or the supervising physician advised use of in-person DOT. We also excluded persons with a cognitive or physical disability that prevented their use of electronic DOT and who lacked a caretaker to assist them.
Participants were randomized 1:1 to start outpatient treatment with either in-person or electronic DOT using a computer-generated random list for each clinic. This list was used to create numbered and sealed opaque randomization packets for each clinic. Following each participant enrollment, staff opened packets sequentially to make group assignments. Those randomized to group 1 underwent in-person DOT during crossover period 1, followed by electronic DOT during period 2. For group 2 participants, the order of DOT methods was reversed.
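The algorithm used to generate the per-clinic lists is not described in this section; as a minimal illustrative sketch (permuted blocks, the block size, and the seed are assumptions, not the trial's documented procedure), one way to produce a computer-generated 1:1 assignment list for a single clinic in Python is:

```python
import random

def clinic_randomization_list(n_packets, block_size=4, seed=20170701):
    """Illustrative 1:1 assignment list for one clinic.

    Permuted blocks, the block size, and the seed are assumptions for
    illustration; the trial protocol (Supplement 1) governs the actual method.
    """
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_packets:
        block = ["group 1: in-person DOT first"] * (block_size // 2) + \
                ["group 2: electronic DOT first"] * (block_size // 2)
        rng.shuffle(block)          # randomize order within each block
        assignments.extend(block)
    # Each entry corresponds to one numbered, sealed, opaque packet,
    # opened sequentially as participants are enrolled at that clinic.
    return list(enumerate(assignments[:n_packets], start=1))

print(clinic_randomization_list(8))
```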
Study Measurements and Procedures
For this study, all nonholiday, weekday doses scheduled in advance for DOT were designated scheduled and observable, and an outcome was documented for each dose. The NYC BTBC routinely provided DOT on weekdays only. Each crossover period comprised 20 scheduled and observable doses. If a treating clinician held all medications, the scheduled and observable criteria could not be met, and these doses were excluded from analysis. Similarly, doses were excluded if a participant was admitted to a medical or correctional facility.
If a problem arose during a DOT session, the type of problem encountered (eg, technical-, patient-, or program-related) and reasons for nonobservation were documented. At the conclusion of the 2 crossover periods, participants chose their preferred DOT method for their remaining treatment.
Participants undergoing in-person DOT could choose to meet with health department staff at the TB clinic (clinic-based DOT) or at a mutually agreed-upon location in the community (community-based DOT). While undergoing electronic DOT, participants could choose live videoconferencing (Skype for Business), which allowed TB program staff to interact with participants in real time, or recorded (asynchronous) videos made with a software application that automatically uploaded time-stamped videos to a secure cloud-based server (SureAdhere Mobile Technology, Inc) for TB program staff to review the following workday. To ensure participants' competency with the electronic DOT software applications, a standardized teach-back training method was used.
Consistent with BTBC practice, participants used personal smartphones or other video-capable devices (eg, tablets) to engage in electronic DOT. Participants who did not possess a device were loaned a smartphone by the BTBC at no charge. Those using personal devices were provided a $10 gift card each month to reimburse data usage costs. Additionally, all participants were provided $50 for completing the study's enrollment visit and another $50 if they completed an opinion questionnaire following the 2 crossover periods.
Participant care was coordinated according to BTBC case management policies.26 Treatment was prescribed and provided at no cost to the patient according to New York State law.
We computed a sample size under a parallel design, then modified the computation to account for pooled variances27 and reduced the sample size to account for the effect of the crossover design.28 We estimated that 256 participants were required to determine, with 90% power and a 1-sided significance level of 2.5%, whether electronic DOT is noninferior to in-person DOT, using a prespecified noninferiority margin of 10%. This margin was based on the presumption that electronic DOT would be of interest to programs because of logistical and cost advantages, even if adherence was slightly worse than when DOT is conducted in person.
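The calculation is described only at a high level; the sketch below shows the standard parallel-design starting point for a noninferiority comparison of two proportions. The assumed completion proportion (0.85) is a placeholder, and the pooled-variance and crossover adjustments described above are not reproduced, so the result is not expected to recover the published total of 256.

```python
from scipy.stats import norm

def parallel_noninferiority_n_per_group(p, margin, alpha=0.025, power=0.90):
    """Per-group sample size for noninferiority of two proportions under a
    parallel design with equal allocation, assuming both DOT methods share
    the same true completion proportion p (cf. Chow et al, reference 28)."""
    z_alpha = norm.ppf(1 - alpha)   # 1-sided 2.5% significance
    z_beta = norm.ppf(power)        # 90% power
    return (z_alpha + z_beta) ** 2 * 2 * p * (1 - p) / margin ** 2

# Illustrative only: p = 0.85 is an assumed completion proportion, and the
# pooled-variance and crossover-design adjustments are not applied here.
print(round(parallel_noninferiority_n_per_group(p=0.85, margin=0.10)))
```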
Each scheduled and observable dose of medication was classified with a binary outcome: staff observed the participant completely ingest the dose of medication (hereafter called a “completed dose”), or they did not. The binary dose outcomes were analyzed using a logistic generalized linear mixed effects regression model (GLMM),29 which included fixed-effect explanatory variables representing DOT method at each dose, participant randomization group, crossover period, the dose outcome during each of the 2 preceding scheduled and observable doses (representing carryover effects), season (represented as calendar quarter), and the interaction between DOT method and season. To minimize bias from expected correlations among doses observed within the same participant and among participants treated at the same clinic, the GLMM included random effects representing each tuberculosis treatment clinic and each participant nested within their respective clinic. Participants were included in the primary statistical analysis if they completed both crossover periods with sufficient data to represent carryover effects in the GLMM (eAppendix 2 in Supplement 2).
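The modeling software is not specified in this section. As one way to fit a dose-level logistic mixed model of this general form in Python, the sketch below uses the variational Bayes approximation in statsmodels (BinomialBayesMixedGLM) as a stand-in for the frequentist GLMM described above; the data file and all column names are hypothetical.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical dose-level data set: one row per scheduled and observable dose.
# Columns (completed, edot, group, period, prev1, prev2, quarter, clinic, pid)
# are illustrative names, not the trial's actual variables.
doses = pd.read_csv("dose_level_outcomes.csv")

# Fixed effects mirror the description in the text: DOT method, randomization
# group, crossover period, the 2 preceding dose outcomes (carryover), season
# (calendar quarter), and the DOT method-by-season interaction.
fixed = ("completed ~ edot + group + period + prev1 + prev2 "
         "+ C(quarter) + edot:C(quarter)")

# Random intercepts for clinic and for participant nested within clinic.
random_effects = {"clinic": "0 + C(clinic)",
                  "participant": "0 + C(clinic):C(pid)"}

model = BinomialBayesMixedGLM.from_formula(fixed, random_effects, doses)
fit = model.fit_vb()  # variational Bayes fit of the logistic mixed model
print(fit.summary())
```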
The percentages of completed doses with electronic and in-person DOT were estimated as least-square means from the GLMM. The percentage difference was calculated by subtracting the percentage of completed doses observed with electronic DOT from the percentage with in-person DOT. Robust estimates of percentages, percentage differences, and confidence limits were obtained with the bootstrap method by repeating the described calculations for 1000 replicate data sets.30 To test for noninferiority, the bootstrap 95% upper confidence limit of the percentage difference was compared with the designated 10% noninferiority margin; a 95% upper confidence limit less than the noninferiority margin is consistent with a conclusion of noninferiority at a 1-sided significance level of 2.5%. The GLMM was run in 4 analytic modes: modified intention-to-treat (ITT), empirical (ie, as-observed; EMP), per-protocol (PP), and PP 85%. In the modified ITT analysis, the DOT method of each dose was represented according to participants' randomization assignment. This ITT mode was described as modified because it excluded 38 participants postrandomization who lacked sufficient data to represent carryover effects in the GLMM. In the EMP mode, the DOT method of each dose was represented according to the DOT method actually used. The PP analysis was restricted to participants whose DOT method at each dose matched their randomization assignment. The PP 85% analysis was restricted to patients with 85% or more doses that matched their randomization assignment; doses that did not match randomization were represented according to the DOT method the participant used. Additional details about the statistical analysis and sensitivity analyses are provided in eAppendix 1 in Supplement 2.
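Beyond the 1000 replicates, the resampling scheme is not detailed in this section. A minimal sketch of a percentile bootstrap of the percentage difference follows; resampling at the participant (cluster) level is an assumption, and fit_and_estimate is a hypothetical callable that refits the model and returns the two estimated completion percentages for a replicate data set.

```python
import numpy as np
import pandas as pd

def bootstrap_noninferiority(doses, fit_and_estimate, margin=10.0,
                             n_rep=1000, seed=2021):
    """Percentile bootstrap of the (in-person minus electronic) difference in
    model-estimated completion percentages.

    Participants are resampled with replacement (an assumed clustering choice);
    `fit_and_estimate` is a hypothetical callable returning
    (pct_in_person, pct_edot) for a replicate data set.
    """
    rng = np.random.default_rng(seed)
    ids = doses["pid"].unique()
    diffs = []
    for _ in range(n_rep):
        resampled = rng.choice(ids, size=len(ids), replace=True)
        boot = pd.concat([doses[doses["pid"] == i] for i in resampled],
                         ignore_index=True)
        pct_in_person, pct_edot = fit_and_estimate(boot)
        diffs.append(pct_in_person - pct_edot)
    lower, upper = np.percentile(diffs, [2.5, 97.5])
    # Noninferiority of electronic DOT: upper 95% limit below the 10% margin.
    return {"difference": float(np.mean(diffs)),
            "ci_95": (float(lower), float(upper)),
            "noninferior": bool(upper < margin)}
```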
Demographic Characteristics of Enrolled Patients and Analytical Samples
A total of 216 persons were randomized (median age, 42 years [range, 16-86 years]; 140 [65%] men) among 820 persons screened (Figure 1; eTable 2 in Supplement 2). Five participants (2%) withdrew before crossover period 1 commenced and were unavailable for analysis, and another 38 (18%) withdrew during crossover period 1. The remaining 173 participants completed both crossover periods and were included in the modified ITT and EMP analyses.
The demographics of participants included in the modified ITT (173 participants), PP (43 participants), and PP 85% (138 participants) analyses were similar to nonenrolled patients who underwent TB treatment through the NYC BTBC during the study period (Table 1). Overall, the distribution of racial and ethnic groups was similar (total enrolled: 43 [20%] non-Hispanic African American or Black individuals; 80 [37%] Asian, Pacific Islander, and Hawaiian individuals; 71 [33%] Hispanic individuals; and 9 [4%] non-Hispanic White individuals), although proportionally more persons of Asian descent were enrolled, and fewer Hispanic persons were included in the PP analysis (11 [26%] individuals). Characteristics of the randomized groups were generally similar. Proportionally more persons 61 years and older were randomized to group 2 (eTable 3 in Supplement 2).
DOT Usage Patterns by Dose
In total, 138 (80%) participants switched DOT methods for crossover period 2 in accordance with the study protocol (Figure 2). Two group 1 participants (1%) elected to switch to electronic DOT during crossover period 1, before crossover period 2 began. Further, 27 (16%) remained on electronic DOT and 6 (4%) remained on in-person DOT by their own choice for both crossover periods. Patients undergoing electronic DOT intermittently used in-person DOT during clinic appointments when they had not yet taken their medication.
Effect of DOT Method on Dose Completion
Electronic DOT was noninferior compared with in-person DOT (Table 2 and Figure 3). In the modified ITT analytic mode (173 participants), the bootstrap percentage of completed doses with in-person DOT was 87.2% (95% CI, 84.6% to 89.9%) vs 89.8% (95% CI, 87.5% to 92.1%) with electronic DOT. The bootstrap percentage difference was −2.6% (95% CI, −4.8% to −0.3%). The upper 95% confidence limit of −0.3% was far less than the 10% noninferiority limit, which was consistent with electronic DOT being noninferior to in-person DOT in attaining dose completion. Results of the EMP (−2.2%; 95% CI, −4.8% to 0.4%), PP (−4.9%; 95% CI, −11.7% to 2.8%), and PP 85% (−1.9%; 95% CI, −4.5% to 0.9%) analyses were consistent with those of the modified ITT analysis and supported the conclusion that electronic DOT was noninferior to in-person DOT. Furthermore, the magnitude of the bootstrap percentage differences suggested that electronic DOT outperformed in-person DOT by 1.9% to 4.9%.
The association of season with dose completion observed under electronic vs in-person DOT was evaluated with an interaction term in the GLMM. In each season, the upper 95% bootstrap confidence limit for the percentage difference was less than the 10% noninferiority limit for all 4 analytic modes, with 1 exception. In the PP analytic mode, the upper confidence limit for spring (April through June: 8.8%; 95% CI, −9.4% to 42.0%) exceeded the noninferiority limit (not tabulated). This was likely a result of the small numbers in this restricted mode (43 participants). Overall, for an urban area located in a temperate climate, season was not significantly associated with the percentage difference in dose completion (eTable 4 in Supplement 2).
To assess whether the conclusion of noninferiority depended on restricting analysis to 173 participants, GLMMs were rerun with 33 of 38 participants who withdrew during crossover period 1 and had data sufficient to represent carryover effects in the logistic GLMMs. This expanded sample comprised 206 participants (none of the excluded participants met the criteria for PP and PP 85% analytic modes), for which ITT and EMP analytic modes were run. Results from the expanded patient sample also supported the conclusion of noninferiority. In addition, the noninferiority conclusion was upheld in (unadjusted) univariate analyses (eTable 1, eFigure in Supplement 2).
Technical, Patient, and Program Issues Affecting DOT Sessions
Issues affecting medication observations occurred with both electronic and in-person DOT. Among 29 900 prescribed medication doses taken during and after the crossover periods for all 216 participants, 20 344 were nonholiday, weekday doses scheduled for DOT. For 2034 (10%) DOT doses, 2239 unique problems were documented. Of these, 1083 (48%) were patient-related (eg, difficulties operating software or work schedules that interfered with DOT), 688 (31%) were technical (eg, nonfunctioning internet connections), and 468 (21%) were staff- or program-related (eg, unscheduled absences). Overall, 1301 of the 2239 unique problems led to 1161 (57%) of the 2034 affected DOT doses not being observed. The percentage of observations affected by issues was greatest for community-based in-person DOT (541 observations [19%]), compared with live-video electronic DOT (714 [10%]), recorded-video electronic DOT (659 [8%]), and clinic-based in-person DOT (120 [6%]).
Participants’ DOT Preferences for the Remainder of Treatment
Seventy-three (42%) of the 173 participants who completed both crossover periods reported that they preferred to continue treatment with live-video electronic DOT, 73 (42%) preferred recorded-video electronic DOT, 9 (5%) preferred community-based in-person DOT, 1 (0.6%) preferred clinic-based in-person DOT, and 6 (4%) elected to self-administer their medications. No preference was recorded for 11 (6%) participants: a TB diagnosis was ruled out for 3, 4 completed treatment, 3 stopped medication for unspecified reasons, and 1 was lost to follow-up.
This trial enabled rigorous evaluation of electronic DOT efficacy under pragmatic conditions with a diverse patient population receiving treatment through an urban TB program. Our results demonstrate that, in this context, electronic DOT was noninferior to in-person DOT across multiple modes of statistical analysis. Moreover, the results rest on analysis at the level of individual doses while controlling for carryover, period, season, and clustering effects arising from the study design.
As novel technologies are integrated into the delivery of medical care, potential exists for increasing the number of weekly doses observed using recorded electronic DOT; improving clinical outcomes; delivering patient-centered care; empowering patients; and promoting equity in care.16,31 These possibilities are significant. Poor treatment adherence has thwarted efforts to eliminate TB. Recent data demonstrate an elevated risk of unfavorable outcomes when patients miss as few as 1 dose in 10.32
We also demonstrated that a combination of DOT methods enabled the NYC BTBC to achieve high rates of direct observation. Although electronic DOT was preferred by most patients for the remainder of treatment, 6 of 173 patients (4%) declined electronic DOT in favor of in-person DOT for both crossover periods, and 10 (6%) chose to continue treatment with in-person DOT. Furthermore, a goal of this trial was to assess electronic DOT performance when offered to TB patients from the start of outpatient treatment; eligibility was not predicated on prior treatment adherence. The trial population was largely similar to the population of TB patients and persons being evaluated for TB. Finally, all 4 DOT methods experienced challenges that interfered with observations. These data warrant consideration in TB program operations.
This study had several limitations. It was not feasible to mask participants and clinicians to the intervention. The direct effect was that 35 of 173 patients (20%) switched from their assigned DOT method or continued with their previously assigned DOT method into the subsequent crossover period. We controlled the analytic impact of these protocol deviations by using the individual dose as the unit of analysis and representing its characteristics (including the DOT method) in the GLMM.
Additionally, some patients were not enrolled because of clinician concerns regarding treatment adherence (eTable 2 in Supplement 2). This exclusion may affect the generalizability of the results. Conceivably, clinicians in settings that offer both in-person and electronic DOT will make similar decisions.
During crossover period 1, more participants in group 2 (24 individuals) withdrew from the study than in group 1 (14 individuals), leading to a slightly greater proportion of group 1 participants among the 173 patients included in the statistical analysis. However, the effect of this imbalance appeared to be negligible: bootstrap percentage differences estimated from ITT and EMP analyses of the 206-patient sample (which included 33 of the 38 participants who withdrew) were quantitatively similar to those estimated in modified ITT and EMP analyses of the 173-patient sample. Finally, some patients become less adherent once symptoms abate; because this evaluation focused on adherence following the start of outpatient treatment, it may not capture adherence later in the treatment course.
Additional study of electronic DOT and the logistical challenges surrounding its use across populations with historically suboptimal treatment outcomes and within different cultural and economic settings is warranted. Insights into how programs may further enhance the effectiveness of electronic DOT are needed. For example, given the social support patients derive from in-person DOT, there may be benefits from supplementing recorded electronic DOT with optimally timed live-video interactions.
In this randomized, crossover noninferiority trial implemented in an urban TB program with a history of successful in-person DOT practice, we found that electronic DOT was as effective as in-person DOT for ensuring high levels of TB treatment adherence. The findings from this trial support adoption of electronic DOT as a standard care option for programs successfully using in-person DOT, a practice adopted by the NYC BTBC at the start of the COVID-19 pandemic.
Accepted for Publication: November 24, 2021.
Published: January 20, 2022. doi:10.1001/jamanetworkopen.2021.44210
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2022 Burzynski J et al. JAMA Network Open.
Corresponding Author: Joseph Burzynski, MD, MPH, Bureau of Tuberculosis Control, New York City Department of Health and Mental Hygiene, 42-09 28th St, WS 7-449, CN-72B, Long Island City, NY 11101-4132 (jburzyns@health.nyc.gov).
Author Contributions: Dr de Castro had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Drs Burzynski and Mangan contributed equally as first authors and are listed alphabetically.
Concept and design: Burzynski, Mangan, Macaraig, Goswami, Lin, Schluger, Vernon.
Acquisition, analysis, or interpretation of data: Burzynski, Mangan, Lam, Macaraig, Salerno, de Castro, Goswami, Lin, Vernon.
Drafting of the manuscript: Burzynski, Mangan, Macaraig, Salerno, de Castro, Lin, Vernon.
Critical revision of the manuscript for important intellectual content: Burzynski, Mangan, Lam, Macaraig, Salerno, de Castro, Goswami, Lin, Schluger, Vernon.
Statistical analysis: Salerno, de Castro, Lin.
Obtained funding: Vernon.
Administrative, technical, or material support: Burzynski, Mangan, Lam, Macaraig, Salerno, Goswami, Schluger, Vernon.
Supervision: Burzynski, Mangan, Lam, Macaraig, Salerno, Schluger.
Conflict of Interest Disclosures: Drs Mangan, Lam, de Castro, Goswami, Lin, and Vernon reported employment with the US Centers for Disease Control and Prevention outside the submitted study. No other disclosures were reported.
Funding/Support: This study was funded by the US Centers for Disease Control and Prevention’s Antibiotic Resistance Solutions Initiative.
Role of the Funder/Sponsor: The funding organization participated in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, and approval of the manuscript; and decision to submit the manuscript for publication.
Group Information: The eDOT Study Team members are listed in Supplement 3.
Disclaimer: Opinions expressed herein are those of the authors, and do not necessarily reflect the official view of the US Centers for Disease Control and Prevention or the New York City Bureau of Tuberculosis Control.
Data Sharing Statement: See Supplement 4.
Additional Contributions: We thank all study participants who took part in this study to the benefit of their fellow patients. We are grateful to the New York City Bureau of Tuberculosis Control case management staff, and staff at the Morrisania, Fort Greene, Washington Heights, and Corona tuberculosis clinics for their collaboration throughout this trial. We thank Brock Stewart, PhD (US Centers for Disease Control and Prevention), who provided the original analytic plan for the trial; Andrew Hill, PhD (US Centers for Disease Control and Prevention), and Patrick Phillips, PhD (University of California, San Francisco), who consulted on statistical aspects of the data analysis; and Richard Garfein, PhD, MPH (University of California, San Diego) for his assistance with the design of the study and data collection. Drs Stewart, Hill, and Garfein were compensated for their work related to this trial. Dr Phillips was not compensated.
References
1. Hopewell PC. Tuberculosis control: how the world has changed since 1990. Bull World Health Organ. 2002;80(6):427.
4. Alipanah N, Jarlsberg L, Miller C, et al. Adherence interventions and outcomes of tuberculosis treatment: a systematic review and meta-analysis of trials and observational studies. PLoS Med. 2018;15(7):e1002595. doi:10.1371/journal.pmed.1002595
5. Nahid P, Dorman SE, Alipanah N, et al. Official American Thoracic Society/Centers for Disease Control and Prevention/Infectious Diseases Society of America Clinical Practice Guidelines: treatment of drug-susceptible tuberculosis. Clin Infect Dis. 2016;63(7):e147-e195. doi:10.1093/cid/ciw376
6. Toczek A, Cox H, du Cros P, Cooke G, Ford N. Strategies for reducing treatment default in drug-resistant tuberculosis: systematic review and meta-analysis. Int J Tuberc Lung Dis. 2013;17(3):299-307. doi:10.5588/ijtld.12.0537
11. Chaulk CP, Kazandjian VA. Directly observed therapy for treatment completion of pulmonary tuberculosis: consensus statement of the Public Health Tuberculosis Guidelines Panel. JAMA. 1998;279(12):943-948. Published correction appears in JAMA. 1998;280(2):134. doi:10.1001/jama.279.12.943
14. Ngwatu BK, Nsengiyumva NP, Oxlade O, et al. The impact of digital health technologies on tuberculosis treatment: a systematic review. Eur Respir J. 2018;51(1):1701596. doi:10.1183/13993003.01596-2017
17. Lee Y, Raviglione MC, Flahault A. Use of digital technology to enhance tuberculosis control: scoping review. J Med Internet Res. 2020;22(2):e15727. doi:10.2196/15727
18. Story A, Aldridge RW, Smith CM, et al. Smartphone-enabled video-observed versus directly observed treatment for tuberculosis: a multicentre, analyst-blinded, randomised, controlled superiority trial. Lancet. 2019;393(10177):1216-1224. doi:10.1016/S0140-6736(18)32993-3
19. Ravenscroft L, Kettle S, Persian R, et al. Video-observed therapy and medication adherence for tuberculosis patients: randomised controlled trial in Moldova. Eur Respir J. 2020;56(2):2000493. doi:10.1183/13993003.00493-2020
21. Adewole OO, Oladele T, Osunkoya AH, et al. A randomized controlled study comparing community based with health facility based direct observation of treatment models on patients' satisfaction and TB treatment outcome in Nigeria. Trans R Soc Trop Med Hyg. 2015;109(12):783-792. doi:10.1093/trstmh/trv091
22. Cavalcante SC, Soares EC, Pacheco AG, Chaisson RE, Durovni B; DOTS Expansion Team. Community DOT for tuberculosis in a Brazilian favela: comparison with a clinic model. Int J Tuberc Lung Dis. 2007;11(5):544-549.
23. van den Boogaard J, Lyimo R, Irongo CF, et al. Community vs. facility-based directly observed treatment for tuberculosis in Tanzania's Kilimanjaro region. Int J Tuberc Lung Dis. 2009;13(12):1524-1529.
24. Zhang H, Ehiri J, Yang H, Tang S, Li Y. Impact of community-based DOT on tuberculosis treatment outcomes: a systematic review and meta-analysis. PLoS One. 2016;11(2):e0147744. doi:10.1371/journal.pone.0147744
25. Wright CM, Westerkamp L, Korver S, Dobler CC. Community-based directly observed therapy (DOT) versus clinic DOT for tuberculosis: a systematic review and meta-analysis of comparative effectiveness. BMC Infect Dis. 2015;15:210. doi:10.1186/s12879-015-0945-5
28. Chow S, Shao J, Wang H, Lokhnygina Y. Sample Size Calculations in Clinical Research. 2nd ed. Chapman & Hall; 2008:90.
29. Agresti A. Categorical Data Analysis. 3rd ed. Wiley; 2013.
32. Imperial MZ, Nahid P, Phillips PPJ, et al. A patient-level pooled analysis of treatment-shortening regimens for drug-susceptible pulmonary tuberculosis. Nat Med. 2018;24(11):1708-1715. Published correction appears in Nat Med. 2019;25(1):190. doi:10.1038/s41591-018-0224-2