Inoculating against disinformation

When it comes to politically or socially sensitive issues, disinformation and misinformation aren’t new. The basic anti-vaccine claims being made about the COVID-19 vaccine, for example, are largely recycled from ones made 65 years ago against polio vaccines, says Rebecca Goldberg, a behavioural science researcher at Google.

“The techniques [of spreading disinformation] are quite stable,” she told this year’s annual meeting of the American Association for the Advancement of Science, recently held online. “They occur over and over again with every new vaccine that is produced.”

What’s changed is how the mis- and disinformation spreads, and the best ways to combat it, including finding ways, in Goldberg’s terms, to “inoculate” people against it. (The difference is that misinformation is simply erroneous, while disinformation is deliberately deceptive.)

“Inoculation takes a medical metaphor and applies it to the science of propaganda,” she says. “It attempts to build psychological resistance to an attack trying to persuade you [of something false], just like your body builds physical resistance with a medical inoculation.”

It works, she says, not by debunking disinformation after the fact (which is generally too late), but by “pre-bunking” it before you are exposed to it. In a test of how to do it (and whether it works), her team created a set of 90-second videos, each focusing on a common manipulation technique. One explained fearmongering. Another discussed ad hominem attacks (slurs on a particular individual that aren’t related to the underlying factual issue). Others depicted scapegoating, false dichotomies and other logical fallacies.

“The techniques [of spreading disinformation] are quite stable, they occur over and over again with every new vaccine that is produced.”

Rebecca Goldberg, Google behavioural science researcher

Her team then recruited 5500 US residents, some of whom watched one of these “inoculation” videos, and some of whom didn’t.

All the recruits were then shown hypothetical tweets, some of which used disinformation methods, some of which didn’t. They were then asked to assess the tweets’ use of disinformation tactics and to rate their trustworthiness. They were also asked how likely they would be to share the tweets with their friends.
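
In outline, this is a classic randomised controlled experiment: randomly assign people to treatment or control, then compare the groups on each outcome. The sketch below shows the idea in Python; the 0–10 scales, variable names and the assumed effect of the video are all invented for illustration, not taken from Goldberg’s study.

```python
# Minimal sketch of analysing a randomised "pre-bunking" experiment.
# All numbers are simulated; this is not Goldberg's data or code,
# and the assumed effect size (1.5 points) is purely illustrative.
import random
from statistics import mean

random.seed(1)
N = 5500  # roughly the number of recruits in the study

def participant(inoculated: bool) -> dict:
    """Simulate one person's ratings of manipulative tweets (0-10 scales)."""
    shift = 1.5 if inoculated else 0.0  # assumed effect of the video
    return {
        "spots_technique": random.gauss(5 + shift, 1.5),  # recognises the tactic
        "trusts_tweet":    random.gauss(5 - shift, 1.5),  # rates it trustworthy
        "would_share":     random.gauss(5 - shift, 1.5),  # would pass it on
    }

# Random assignment: about half watch an inoculation video first.
treated = [participant(True) for _ in range(N // 2)]
control = [participant(False) for _ in range(N - N // 2)]

for metric in ("spots_technique", "trusts_tweet", "would_share"):
    print(f"{metric}: treated {mean(p[metric] for p in treated):.2f} "
          f"vs control {mean(p[metric] for p in control):.2f}")
```

A real analysis would of course add significance tests and demographic controls; the point here is just the treatment-versus-control comparison.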

On all these metrics, the inoculated group was substantially more alert to the possibility of disinformation, regardless of their political ideology, even if the hypothetical tweets involved politically touchy subjects, such as climate change.

“It’s an exciting find,” says Daniel Rogers, co-founder and executive director of the Global Disinformation Index and an adjunct faculty member at New York University, who was not part of the study team. “It taps into the fact that people don’t like to be fooled, so when you show them ways they might be fooled, it helps inoculate them.”

It’s also important because other research shows that whether people accept bad information isn’t just fodder for painfully useless arguments at family gatherings. It can literally be a matter of life and death.

David Yanagizawa-Drott, an economist at the University of Zurich, Switzerland, realised he had an unusual opportunity to test this by studying watchers of opinion-oriented cable news shows in the US during the early months of COVID in 2020.

For years, such shows have slowly polarised American viewers, dividing them largely along political lines. But each network – be it CNN, MSNBC or Fox News – has many shows, and Yanagizawa-Drott realised that two of the most popular hosts on Fox News – Tucker Carlson and Sean Hannity – had addressed COVID quite differently, at least during the early months, when nobody was sure how dangerous the virus might be.

“It’s an exciting find, it taps into the fact that people don’t like to be fooled, so when you show them ways they might be fooled, it helps inoculate them”

Daniel Rogers, Global Disinformation Index

Better yet for Yanagizawa-Drott’s study, the audiences for the two Fox News shows were demographically and politically quite similar – which host viewers preferred seemed to be more a matter of which personality they liked more, or which show came on at a more convenient time.

From a scientific point of view, Yanagizawa-Drott says, that offered “a neat natural experiment”, because Carlson and Hannity “had very different narratives early on in the pandemic about how dangerous the virus was”.

Carlson, he says, was early to raise the alarm, while Hannity didn’t discuss it much (if at all) until late February and didn’t reach the same level of concern as Carlson until sometime in March. That created a 90-day window in which their viewers received significantly different information.

To assess this, Yanagizawa-Drott and his colleagues examined county-by-county Nielsen data on which show was more popular, and then compared it with how people in those counties acted. They found that in counties where Carlson was more popular, people changed their behaviour earlier, with a resulting reduction in both cases and deaths. In counties where Hannity was more popular, they found sharper rises in both cases and deaths. “Opinion shows are powerful, and have important consequences,” says Yanagizawa-Drott.
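
In outline, that comparison groups counties by which show rated higher and contrasts their outcomes. Here is a toy sketch; the records are invented placeholders for the real Nielsen ratings and county health data, and the published work relies on regressions with many controls rather than raw averages.

```python
# Toy county-level comparison in the style of the Carlson/Hannity
# natural experiment. All records below are invented, not real data.
from statistics import mean

counties = [
    # (county, carlson_rating, hannity_rating, cases_per_100k, deaths_per_100k)
    ("County A", 3.1, 2.2, 180, 4.0),
    ("County B", 2.0, 3.4, 260, 7.5),
    ("County C", 2.8, 2.1, 170, 3.8),
    ("County D", 1.9, 3.0, 240, 6.9),
]

# Group counties by which host rated higher there.
carlson = [c for c in counties if c[1] > c[2]]
hannity = [c for c in counties if c[2] > c[1]]

for label, group in (("Carlson-leaning", carlson), ("Hannity-leaning", hannity)):
    print(f"{label}: mean cases {mean(c[3] for c in group):.0f}, "
          f"mean deaths {mean(c[4] for c in group):.1f} per 100k")
```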

Nor is it just a matter of quickly trying to figure out the severity of a new virus like COVID-19. Yanagizawa-Drott did his PhD research studying the Rwandan genocide, trying to determine the impact of a radio station spewing hate toward the Tutsi minority.

“I was able to get data on where the antennas were that were used with these broadcasts,” he says. From that, he says, he could determine which villages were able to hear them, and which were blocked by intervening mountains. He then tracked the violence level from village to village, showing that this kind of disinformation increased the violence against Tutsis in the villages able to hear it. “This shows the power of dangerous narratives,” he says.
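
Working out which villages could receive a broadcast is, at heart, a terrain line-of-sight problem: trace the straight line from the antenna to the village and check whether any intervening ridge rises above it. The toy version below uses invented elevation profiles; the real analysis relies on digital elevation models of actual terrain.

```python
# Toy line-of-sight check: can a village "hear" the antenna, or does
# intervening terrain block the signal? The elevation profiles are
# invented; real studies use digital elevation model (DEM) data.

def has_line_of_sight(profile: list[float],
                      antenna_height: float = 30.0,
                      receiver_height: float = 2.0) -> bool:
    """profile[0] is ground elevation (m) at the antenna, profile[-1]
    at the village; intermediate entries are sampled along the path."""
    start = profile[0] + antenna_height
    end = profile[-1] + receiver_height
    n = len(profile) - 1
    for i in range(1, n):
        sight = start + (end - start) * i / n  # sight line at this point
        if profile[i] > sight:
            return False  # a ridge blocks the signal
    return True

ridge_path = [1400, 1500, 1650, 1500, 1450]  # village behind a 1650 m ridge
open_path = [1400, 1420, 1430, 1440, 1450]   # gently rolling ground
print(has_line_of_sight(ridge_path))  # False: blocked
print(has_line_of_sight(open_path))   # True: in range
```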

“Opinion shows are powerful, and have important consequences”

David Yanagizawa-Drott, University of Zurich, Switzerland

Another example involves Father Coughlin, a Canadian-American priest who, in the 1930s, spewed anti-Semitism over the radio to an estimated 30 million listeners. As in Rwanda, parts of North America were in range of his broadcasts, and parts were not. Those parts that were, Yanagizawa-Drott says, had higher levels of hate crimes against Jews and higher levels of membership in extremist groups.

All of which means that disinformation isn’t just something we can shrug off as conspiracy-theory nonsense. Instead, Yanagizawa-Drott says, there’s “growing evidence” that it has power not only to alter beliefs but to alter behaviour.

Combatting it in today’s world, however, isn’t as simple as figuring out whose village is in line of sight of a radio transmitter. Instead, says Neil Johnson, of George Washington University, Washington DC, it is important to understand how disinformation travels in social media.

To do this, Johnson has studied the social networks of 100 million Facebook users, focusing on how they reacted to COVID and vaccines. What he found was that they could be grouped into three basic communities.

One is the pro-vaccine community, which is sizeable, but largely talks only to itself.

Another is the anti-vaccine (and anti-masking) community, which gets a lot of attention in the press, because “everyone” talks about them.

Overlooked, he says, is the “huge mainstream” of online communities that range from people looking for parenting advice, to pet owners looking for pet advice, to folks interested in organic foods, dietary supplements or other alternative-health topics. “They are the majority of people,” Johnson says. “They don’t usually discuss vaccines or masks. They are discussing pets, or babies, or their kids.”

But, he says, these groups have much stronger interconnections with the anti-vaccine groups than with the pro-science groups, and since COVID arose, those connections have become even tighter.
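
Quantifying that kind of interconnection is a standard network-analysis exercise: label each account’s community, then count the links that cross community boundaries. Here is a minimal sketch on a tiny invented graph, using the networkx library (Johnson’s actual dataset of 100 million users, and his methods, are far more elaborate).

```python
# Sketch: count links between labelled communities in a tiny,
# invented social graph. Purely illustrative; not Johnson's data.
import networkx as nx

community = {
    "provax1": "pro", "provax2": "pro",
    "antivax1": "anti", "antivax2": "anti",
    "parents": "mainstream", "pets": "mainstream", "organic": "mainstream",
}

G = nx.Graph()
G.add_edges_from([
    ("provax1", "provax2"),    # the pro cluster largely talks to itself
    ("antivax1", "antivax2"),
    ("antivax1", "parents"),   # the anti cluster reaches into the mainstream
    ("antivax2", "pets"),
    ("antivax1", "organic"),
    ("provax1", "parents"),    # the pro cluster has fewer such links
])

def cross_links(a: str, b: str) -> int:
    """Number of edges joining community a to community b."""
    return sum(1 for u, v in G.edges()
               if {community[u], community[v]} == {a, b})

print("anti <-> mainstream:", cross_links("anti", "mainstream"))  # 3
print("pro  <-> mainstream:", cross_links("pro", "mainstream"))   # 1
```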

“That is where misinformation is leaking into the mainstream,” he says.

Or, in Goldberg’s terms, this is where it’s most effective to inoculate against disinformation.

Which, she adds, is something her team may soon be testing by using her videos in ads designed to combat the disinformation advertisements so often used in politics and related arenas.

If it works, who knows? Maybe someday soon you will be getting online ads offering to show you how to detect fearmongering, false dichotomies and other logical fallacies we all should have been taught about long ago.

Maybe their messages will be boringly familiar. Maybe not. Either way, realise that they are an effort to inoculate not just you, but all of society, against the deadly virus of disinformation.
