Carlo AD, Hosseini Ghomi R, Renn BN, Strong MA, Areán PA. Assessment of Real-World Use of Behavioral Health Mobile Applications by a Novel Stickiness Metric. JAMA Netw Open. 2020;3(8):e2011978. doi:10.1001/jamanetworkopen.2020.11978
Digital health treatments for individuals with behavioral health problems are increasing rapidly in number.1,2 Studies to date have demonstrated that longitudinal patient engagement is challenging,3,4 making it unlikely for most applications (hereafter, apps) to effect real-world change. In this cross-sectional study, we describe usage patterns of popular mobile behavioral health apps and identify characteristics of those that are most continually accessed (also known as the stickiest),5 with the aim of informing future research on behavioral health app engagement.
This cross-sectional study was granted exemption from review and informed consent by the University of Washington institutional review board as it did not involve human participants. The study follows the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline for cross-sectional studies.
This study builds on a previously published study1 that curated a sample of apps listed on 5 online rating frameworks, including 201 apps from iOS (Apple) and 152 apps from Google Play (Alphabet), excluding those (1) not designed to treat a behavioral health disorder or (2) not promoting a specific behavioral health treatment or technique. Download and utilization data for all apps were procured between September 21 and October 21, 2018, from Priori Data, a mobile app market research firm.1 For this analysis, we applied additional selection criteria excluding apps with (1) availability restricted to 1 of the 2 leading marketplaces (either Apple iOS or Google Play), (2) fewer than 10 000 total global downloads since first tracked on Priori Data, or (3) fewer than 1000 total global monthly active users over the 30 days preceding data acquisition.
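The additional selection criteria above amount to a three-part filter over the curated app sample. A minimal sketch of that filtering logic is below; the app names and figures are hypothetical, not drawn from the study's dataset.

```python
# Hypothetical app records; field names are assumptions for illustration.
apps = [
    {"name": "AppA", "stores": {"ios", "google_play"},
     "total_downloads": 26_000_000, "mau_30d": 1_500_000},
    {"name": "AppB", "stores": {"ios"},            # only 1 marketplace
     "total_downloads": 50_000, "mau_30d": 5_000},
    {"name": "AppC", "stores": {"ios", "google_play"},
     "total_downloads": 8_000, "mau_30d": 2_000},  # too few downloads
]

included = [
    a for a in apps
    if a["stores"] == {"ios", "google_play"}  # (1) on both leading marketplaces
    and a["total_downloads"] >= 10_000        # (2) >= 10 000 total global downloads
    and a["mau_30d"] >= 1_000                 # (3) >= 1000 monthly active users
]
print([a["name"] for a in included])  # only AppA satisfies all 3 criteria
```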
We define a novel stickiness metric as the quotient of 2 app parameters: monthly active users divided by total downloads. This metric, termed number of monthly active users per normalized total downloads, estimates the number of active users per normalized download, with a higher value indicating greater stickiness. Of note, total download figures were normalized at the month level to account for variation in how long apps had been available on the marketplaces. Additionally, the total download and stickiness metric values for each app across both marketplaces were summed to create composite values. Calculations were conducted using Excel version 16.36 (Microsoft). Data were analyzed from November 2019 to May 2020.
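The metric definition above can be sketched as follows, assuming month-level normalization means dividing total downloads by months on the marketplace; the function names and input figures are hypothetical, not values from the study.

```python
def stickiness(monthly_active_users: float, total_downloads: float,
               months_on_market: float) -> float:
    """Monthly active users per month-normalized total downloads.

    Normalizing downloads by months on the marketplace (an assumed
    interpretation of the month-level normalization described in the text)
    keeps older apps from being penalized for their larger cumulative totals.
    """
    normalized_downloads = total_downloads / months_on_market
    return monthly_active_users / normalized_downloads

def composite_stickiness(per_marketplace_values: list[float]) -> float:
    """Sum per-marketplace stickiness values into a composite, as described."""
    return sum(per_marketplace_values)

# Hypothetical app tracked for 50 months on each marketplace:
ios = stickiness(monthly_active_users=40_000,
                 total_downloads=1_000_000, months_on_market=50)       # 2.0
android = stickiness(monthly_active_users=30_000,
                     total_downloads=900_000, months_on_market=50)     # ~1.67
print(round(composite_stickiness([ios, android]), 2))  # prints 3.67
```

Because the normalized denominator is average downloads per month, the composite value is comparable across apps of different ages, which is what lets the study rank lesser-known apps against heavily downloaded ones.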
A total of 46 apps met inclusion criteria for this analysis. Total downloads and stickiness metric values for the 10 most downloaded apps1 are shown in Table 1. All of the 10 most downloaded apps were from private developers; 6 were for meditation (Headspace; Calm; Relax Meditation P; Insight Timer; Stop, Breathe & Think; and Sanvello for Stress & Anxiety), 2 focused on cognitive training (Peak and Lumosity), and none were developed for specific behavioral health conditions. Among the 46 included apps, the median (range) stickiness score was 1.76 (0.26-9.87) and the mean (SD) stickiness score was 2.23 (1.84). Only 3 of the 10 most downloaded apps had stickiness metric scores above the mean (Lumosity: stickiness score, 2.63; Headspace: stickiness score, 3.18; Relax Meditation P: stickiness score, 2.48). Table 2 shows the total downloads and stickiness metric values for the 10 stickiest apps, only 2 of which were ranked in the top half with regard to total downloads (Smiling Mind: 2.2 million total downloads; stickiness score, 3.44; Headspace: 26 million total downloads; stickiness score, 3.18). Among the top 10 stickiest apps, 9 were from private developers (CogniFit, Muse, Stop Smoking in 2 Hours, Daybreak, Youper, PanicShield, Smiling Mind, Relax and Sleep Well, and Headspace), 3 targeted alcohol or cigarette use (My QuitBuddy, Stop Smoking in 2 Hours, and Daybreak), 2 included cognitive behavioral therapy skills (Youper and PanicShield), and 5 focused on meditation (Muse, Youper, Smiling Mind, Relax and Sleep Well Hypnosis, and Headspace).
This cross-sectional study examined download and utilization data of behavioral health apps with a focus on stickiness. We found that the most downloaded apps were not necessarily the stickiest; 7 of the 10 most downloaded were below the stickiness mean of the full included sample. We also found that some lesser known behavioral health apps (eg, Youper) are stickier than those that are widely known and downloaded (eg, Lumosity). Furthermore, our findings suggest that attributes of behavioral health apps focusing on alcohol use, smoking cessation, cognitive training, and cognitive behavioral therapy skills may be auspicious targets for future quantitative or qualitative studies on digital health engagement. While app-specific total download statistics may reflect differing levels of marketing and financial resources, we believe that the stickiness metric is more suggestive of intrinsic app-related characteristics that may be compelling to users.4 In a field in which few apps are used longitudinally6 and a panoply of engagement strategies has been tested with disappointing results,3 this is precisely the type of knowledge that is needed to inform app development.
Our findings are limited by the cross-sectional nature of our data; although total downloads are cumulative, the monthly active users figure is based solely on the 30 days prior to data acquisition (August-October 2018). In conclusion, future research is needed to identify key features of notably sticky apps using longitudinal methods. Such knowledge may assist in the successful real-world dissemination of evidence-based treatment models through mobile platforms and may allow the burgeoning digital health field to reach its full potential.
Accepted for Publication: May 18, 2020.
Published: August 3, 2020. doi:10.1001/jamanetworkopen.2020.11978
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2020 Carlo AD et al. JAMA Network Open.
Corresponding Author: Andrew D. Carlo, MD, MPH, Department of Psychiatry and Behavioral Sciences, University of Washington School of Medicine, 1959 NE Pacific St, PO Box 356560, Room BB1644, Seattle, WA 98191 (email@example.com).
Author Contributions: Dr Carlo had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: All authors.
Acquisition, analysis, or interpretation of data: All authors.
Drafting of the manuscript: Carlo, Strong.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Carlo, Hosseini Ghomi.
Obtained funding: Hosseini Ghomi.
Administrative, technical, or material support: Carlo, Hosseini Ghomi, Areán.
Supervision: Hosseini Ghomi, Areán.
Conflict of Interest Disclosures: Dr Hosseini Ghomi reported owning stock options in NeuroLex Laboratories outside the submitted work. No other disclosures were reported.
Funding/Support: Dr Carlo was supported by training grant No. T32 MH073553 from the National Institute of Mental Health (NIMH) during the course of the study. Dr Hosseini Ghomi was supported by grant No. R25MH104159 from the NIMH. Drs Areán and Renn were supported by NIMH grant No. P50 MH115837 and an NIMH-funded postdoctoral fellowship outside the submitted work.
Role of the Funder/Sponsor: The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.