eAppendix. Automated software targeting original research spreads COVID-19 misinformation on Facebook
Ayers JW, Chu B, Zhu Z, et al. Spread of Misinformation About Face Masks and COVID-19 by Automated Software on Facebook. JAMA Intern Med. 2021;181(9):1251–1253. doi:10.1001/jamainternmed.2021.2498
The dangers of misinformation spreading on social media during the COVID-19 pandemic are known.1 However, software that allows individuals to generate automated content and share it via counterfeit accounts (or “bots”)2 to amplify misinformation has been overlooked, including how automated software can be used to disseminate original research while undermining scientific communication.
We analyzed conversations on public Facebook groups, a platform known to be susceptible to automated misinformation,3 concerning the publication of the Danish Study to Assess Face Masks for the Protection Against COVID-19 Infection (DANMASK-19) to explore automated misinformation.4 We selected DANMASK-19 because it was widely discussed (it was the fifth most shared research article of all time as of March 2021 according to Altmetric5) and demonstrated that masks are an important public health measure to control the pandemic.
We obtained the names of 563 Facebook groups in which a link to the publication of DANMASK-19 on the Annals of Internal Medicine website was posted and downloaded all available posts (N = 299 925) from these groups using CrowdTangle (crowdtangle.com). We limited our study period to the 5 days following the publication of DANMASK-19 (November 18, 2020, through November 22, 2020) because media interest is typically greatest initially. This study was exempted as not human participants research by the University of California, San Diego Human Research Protections Program. Additional details are provided in the eAppendix in the Supplement.
When identical links are posted in close succession, it suggests that automated software was used.2,3 We identified the subsets of Facebook groups that were the most or least likely to be affected by automation by calculating how often identical links were posted to pairs of Facebook groups and the time that elapsed between these posts for all links (n = 251 656) shared during the study period. Adapting past operationalizations,3 a pair of Facebook groups was considered most affected by automation if (1) identical links were posted to both groups 5 or more times and (2) at least half of these identical links were posted less than 10 seconds apart. Comparatively, Facebook groups in which the total time elapsed between identical links was above the 90th percentile of time between postings were considered the least affected by automation. Facebook groups that were most affected by automation had a mean (SD) of 4.28 (3.02) seconds between shares of identical links compared with 4.35 (11.71) hours for those least affected by automation.
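The pairwise criteria above can be sketched in code. The following is an illustrative Python implementation, not the authors' analysis code (the study used R); the input format, function name, and parameter names are assumptions, while the thresholds (5 or more shared links, at least half posted less than 10 seconds apart) follow the letter's operationalization.

```python
from collections import defaultdict
from itertools import combinations

def flag_automated_pairs(posts, min_shared=5, fast_s=10.0):
    """Flag pairs of groups whose link-sharing pattern suggests automation.

    posts: iterable of (group_id, link, unix_timestamp) tuples
           (hypothetical input format for illustration).
    A pair is flagged when the two groups hosted the same link at least
    min_shared times and at least half of those shared links appeared in
    both groups less than fast_s seconds apart.
    """
    # Earliest time each link was posted in each group.
    first_seen = defaultdict(dict)  # link -> {group: first_timestamp}
    for group, link, ts in posts:
        if group not in first_seen[link] or ts < first_seen[link][group]:
            first_seen[link][group] = ts

    # Seconds elapsed between shares of each common link, per group pair.
    gaps = defaultdict(list)  # (group_a, group_b) -> list of gaps
    for link, by_group in first_seen.items():
        for a, b in combinations(sorted(by_group), 2):
            gaps[(a, b)].append(abs(by_group[a] - by_group[b]))

    flagged = set()
    for pair, deltas in gaps.items():
        fast = sum(1 for d in deltas if d < fast_s)
        if len(deltas) >= min_shared and fast >= len(deltas) / 2:
            flagged.add(pair)
    return flagged
```

A pair sharing 5 identical links, 3 of them within seconds, would be flagged; groups sharing only one link, or links hours apart, would not.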
To quantify the extent to which Facebook groups were subject to misinformation, all posts that linked to DANMASK-19 in the groups most or least affected by automation were qualitatively coded by 2 authors (B.C. and Z.Z.) for 2 types of misinformation: (1) whether the primary conclusion of DANMASK-19 was misrepresented (eg, mask wearing harms the wearer) and (2) whether conspiratorial claims were made about DANMASK-19 (eg, claims of covert political/corporate control). A separate outcome for not including either form of misinformation was computed. Table 1 presents example posts. Coders disagreed on 3.9% of labels (Cohen κ = 0.76) and resolved disagreements unanimously with the first author (J.W.A.).
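The reported agreement statistic (Cohen κ = 0.76) corrects observed coder agreement for agreement expected by chance. A minimal sketch of the computation, assuming two coders labeled the same items (the function and variable names are illustrative, not from the study):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two coders on the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from each coder's marginal
    label frequencies.
    """
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement yields κ = 1, while agreement no better than chance yields κ = 0; values near 0.76 are conventionally read as substantial agreement.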
The percentage of posts that linked to DANMASK-19 that included each type of misinformation or neither type was calculated separately for the sets of Facebook groups most and least affected by automation along with prevalence ratios comparing these percentages. Statistical significance was set to P < .05 and 95% confidence intervals were bootstrapped. Analyses were computed with R, version 3.6.1 (R Foundation).
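The letter's analyses were computed in R; purely to illustrate the estimator, a prevalence ratio with a percentile bootstrap 95% CI can be sketched in Python as follows (the input format and all names are assumptions):

```python
import random

def bootstrap_prevalence_ratio(flags_most, flags_least, n_boot=2000, seed=0):
    """Prevalence ratio with a percentile bootstrap 95% CI.

    flags_most / flags_least: 0/1 indicators of a misinformation label
    for posts in the most- and least-affected group sets (hypothetical
    input format for illustration).
    """
    rng = random.Random(seed)
    prev = lambda x: sum(x) / len(x)
    pr = prev(flags_most) / prev(flags_least)
    draws = []
    for _ in range(n_boot):
        m = [rng.choice(flags_most) for _ in flags_most]
        l = [rng.choice(flags_least) for _ in flags_least]
        if sum(l):  # skip resamples with zero prevalence in the denominator
            draws.append(prev(m) / prev(l))
    draws.sort()
    lo = draws[int(0.025 * len(draws))]
    hi = draws[int(0.975 * len(draws)) - 1]
    return pr, (lo, hi)
```

Resampling posts with replacement within each group set and taking the 2.5th and 97.5th percentiles of the resampled ratios yields the kind of bootstrapped interval reported in the letter.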
A total of 712 posts that provided direct links to DANMASK-19 were shared in 563 public Facebook groups. Of these, 279 posts (39%) that linked to DANMASK-19 were in Facebook groups most affected by automation, of which 17 were deleted and unavailable for further analysis. Sixty-two posts (9%) were made in Facebook groups that were least affected by automation, and 3 were deleted.
Among posts made to groups most affected by automation, 19.8% (95% CI, 14.9%-24.5%) claimed masks harmed the wearer, 50.8% (95% CI, 44.6%-56.5%) made conspiratorial claims about the trial, and 43.9% (95% CI, 37.4%-49.6%) made neither claim (Table 2). In contrast, among posts made to groups least affected by automation, 8.5% (95% CI, 1.7%-15.2%) claimed masks harmed the wearer, 20.3% (95% CI, 10.2%-30.5%) made conspiratorial claims about the trial, and 72.9% (95% CI, 59.3%-81.4%) made neither claim.
The percentage of posts linking to DANMASK-19 that claimed that masks harmed the wearer was 2.3 (95% CI, 1.0-6.5) times higher in Facebook groups that were most affected by automation vs groups that were least affected by automation; conspiratorial claims (prevalence ratio, 2.5; 95% CI, 1.5-4.5) were also higher in Facebook groups that were most affected by automation. Making neither claim was more common in Facebook groups that were least affected by automation (prevalence ratio, 0.6; 95% CI, 0.5-0.7).
A campaign that presumably used automated software6 promoted DANMASK-19 on Facebook groups to disseminate misinformation. The limitations of the study include that the entities responsible for organizing this automated campaign cannot be determined, only public Facebook groups were studied, and only a single high-profile study over a few days was evaluated.
Scientific journals are easy targets of automated software. Possible approaches to prevent misinformation due to dissemination of articles by automated software include legislation that penalizes those behind automation; greater enforcement of rules by social media companies to prohibit automation; and counter-campaigns by health experts.
Accepted for Publication: April 15, 2021.
Published Online: June 7, 2021. doi:10.1001/jamainternmed.2021.2498
Corresponding Author: John W. Ayers, PhD, MA, #333 CRSF, 9500 Gilman Drive, La Jolla, CA 92093 (email@example.com).
Author Contributions: Drs Ayers and Broniatowski had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: All authors.
Acquisition, analysis, or interpretation of data: Ayers, Chu, Zhu, Leas, Smith, Broniatowski.
Drafting of the manuscript: Ayers, Chu, Zhu, Leas, Smith, Broniatowski.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Chu, Leas, Broniatowski.
Obtained funding: Smith.
Administrative, technical, or material support: Chu, Smith, Dredze, Broniatowski.
Supervision: Smith, Dredze, Broniatowski.
Conflict of Interest Disclosures: Dr Ayers reported owning equity in Directing Medicine, Health Watcher, and Good Analytics outside the submitted work. Dr Leas reported receiving consulting fees from Health Watcher and Good Analytics for similar work outside the context of these analyses. Dr Smith reported an endowment from the John and Mary Tu Foundation Endowment and grants from the National Institutes of Health during the conduct of the study and consulting fees from FluxErgy, Bayer, Arena Pharmaceuticals, and Linear Therapies outside the submitted work. Dr Dredze reported personal fees from Bloomberg LP and Good Analytics outside the submitted work. No other disclosures were reported.
Funding/Support: This work was supported by the Burroughs Wellcome Fund, National Institutes of Health grant AI036214, The John S. and James L. Knight Foundation to the George Washington University Institute for Data, Democracy, and Politics, and The John and Mary Tu Foundation.
Role of the Funder/Sponsor: The funding organizations had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Data Sharing Statement: The data used in the study are public in nature and are available from CrowdTangle. CrowdTangle prohibits providing raw data to anyone outside of a CrowdTangle user’s account. Anyone with a CrowdTangle account may access these corresponding data. Researchers may request CrowdTangle access at https://help.crowdtangle.com/en/articles/4302208-crowdtangle-for-academics-and-researchers.
Additional Contributions: We appreciate discussions and editing with Benjamin M. Althouse, PhD, ScM (Gates Foundation), Alicia Nobles, PhD (University of California, San Diego), and Christopher Longhurst, MD (University of California, San Diego). They were not compensated for their contributions.