Russian online users known as trolls are taking to social media to sow public conflict about vaccination, according to a new study published in the American Journal of Public Health.
In what sounds like a hybrid story joining Star Trek and The Hobbit, US researchers from George Washington University, the University of Maryland and Johns Hopkins University found that bots, trolls, cyborgs, content polluters and other actors stoke distrust in the efficacy of vaccination in preventing disease.
Many of these actors spread false or misleading information, but sources identified as Russian in origin produce messages that argue on both sides of the discussion, with the aim, the researchers suggest, of “promoting discord”.
“Accounts masquerading as legitimate users create false equivalency, eroding public consensus on vaccination,” the report finds.
“These trolls seem to be using vaccination as a wedge issue, promoting discord in American society,” says Mark Dredze, one of the authors of the study.
“However, by playing both sides, they erode public trust in vaccination, exposing us all to the risk of infectious diseases. Viruses don’t respect national boundaries.”
The researchers examined thousands of vaccination-related tweets sent between July 2014 and September 2017. They discovered several accounts known to belong to the same Russian trolls who interfered in the 2016 US elections.
They closely examined tweets under the hashtag #VaccinateUS, which “were uniquely identified with Russian troll accounts linked to the Internet Research Agency – a company backed by the Russian government, specialising in online influence operations,” the report says.
It says #VaccinateUS messages tend to tie both pro- and anti-vaccine messages explicitly to US politics and frequently use emotional appeals to “freedom,” “democracy,” and “constitutional rights.” By contrast, other tweets from the vaccine stream focus more on “parental choice” and specific vaccine-related legislation.
“The vast majority of Americans believe vaccines are safe and effective, but looking at Twitter gives the impression that there is a lot of debate,” says study co-author David Broniatowski.
“It turns out that many anti-vaccine tweets come from accounts whose provenance is unclear. These might be bots, human users or cyborgs – hacked accounts that are sometimes taken over by bots. Although it’s impossible to know exactly how many tweets were generated by bots and trolls, our findings suggest that a significant portion of the online discourse about vaccines may be generated by malicious actors with a range of hidden agendas.”
The CNet website explains that a bot is an application that performs an automated task, such as setting an alarm, telling you the weather or searching online. About trolls, the Urban Dictionary says, “The most essential part of trolling is convincing your victim that you truly believe in what you are saying, no matter how outrageous, or to give your victim malicious instructions under the guise of help. Trolling requires deceiving.”
The researchers found that content polluters – bot accounts that distribute malware, unsolicited commercial content and disruptive materials – shared anti-vaccination messages far more often than average Twitter users.
“Content polluters seem to use anti-vaccine messages as bait to entice their followers to click on advertisements and links to malicious websites,” says report co-author Sandra Crouse Quinn. “Ironically, content that promotes exposure to biological viruses may also promote exposure to computer viruses.”
The report adds that by giving equal attention to pro- and anti-vaccination arguments, Russian trolls and Twitter bots are following a strategy “of promoting discord across a range of controversial topics – a known tactic employed by Russian troll accounts”.
It says such strategies “may undermine the public health: normalising these debates may lead the public to question long-standing scientific consensus regarding vaccine efficacy”.