
Facebook communities that distrust establishment health advice are more effective than government agencies and other reliable groups at reaching and engaging with undecided individuals, according to a study published in the journal Nature.
US researchers tracked vaccine conversations among 100 million Facebook users at the height of the 2019 measles outbreak and created a “battleground” map they say shows how such distrust could spread.
“Its core reveals a multi-sided landscape of unprecedented intricacy that involves nearly 100 million individuals partitioned into highly dynamic, interconnected clusters across cities, countries, continents and languages,” they write.
As a theoretical framework it “reproduces the recent explosive growth in anti-vaccination views and predicts that these views will dominate in a decade”, they add.

The team was led by Neil Johnson from George Washington University, US, and included researchers with backgrounds in health science, politics, engineering, physics, data, computing and theoretical biology.
They identified three camps of Facebook communities – pro-vaccination, anti-vaccination and undecided – then, starting with one community, looked for a second that was strongly entangled with it, and so on, building up a map of how the clusters interacted with each other.
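In rough terms, that snowball-style mapping can be sketched in code. The short example below is purely illustrative: the graph, page names, edge weights and entanglement threshold are invented for the sake of the sketch, and it is not the study's actual pipeline.

```python
# Illustrative sketch only: snowball-style expansion over a weighted graph of
# community pages, loosely mirroring the mapping described above.
# The graph, weights and threshold here are hypothetical.
import networkx as nx

# Hypothetical weighted graph: nodes are community pages, edge weights stand in
# for how strongly two pages are "entangled" (e.g. shared members or cross-links).
G = nx.Graph()
G.add_weighted_edges_from([
    ("anti_A", "parents_1", 0.8),
    ("anti_A", "anti_B", 0.6),
    ("parents_1", "pro_X", 0.2),
    ("pro_X", "pro_Y", 0.7),
])

def snowball(graph, seed, threshold=0.5):
    """Starting from one cluster, repeatedly add any neighbour whose
    entanglement (edge weight) with the growing map exceeds the threshold."""
    mapped = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for neighbour, data in graph[node].items():
            if neighbour not in mapped and data["weight"] >= threshold:
                mapped.add(neighbour)
                frontier.append(neighbour)
    return mapped

print(sorted(snowball(G, "anti_A")))  # ['anti_A', 'anti_B', 'parents_1']
```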
They report that while there are fewer individuals with anti-vaccination sentiments than with pro-vaccination sentiments on Facebook, there are nearly three times as many anti-vaccination communities as pro-vaccination communities.
This, they say, allows anti-vaccination communities to become highly entangled with undecided but often engaged communities, such as parents’ groups, while pro-vaccination communities remain mostly peripheral.
In addition, pro-vaccination communities focus on countering larger anti-vaccination communities and so might miss medium-sized ones growing under the radar.
“Mass-action models suggest that given the large pro-vaccination majority, the anti-vaccination clusters should shrink relative to pro-vaccination clusters under attrition, which is the opposite of what happened in 2019,” the paper says.
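To see why a simple mass-action picture makes that prediction, consider the toy attrition sketch below, in which each side loses members at a rate proportional to the size of the other. The starting sizes and rate are invented for illustration, and this is not the paper's model – only a sketch of the intuition that a large majority "should" grind the smaller side down.

```python
# Toy illustration only: a well-mixed, mass-action-style attrition model.
# Each side loses members at a rate proportional to the opposing side's size.
# Numbers are arbitrary; this is not the study's model.
def mass_action_attrition(pro, anti, rate=0.5, dt=0.01, steps=1000):
    for _ in range(steps):
        pro_next = pro - rate * anti * dt   # pro losses driven by anti's size
        anti_next = anti - rate * pro * dt  # anti losses driven by pro's size
        pro, anti = max(pro_next, 0.0), max(anti_next, 0.0)
    return pro, anti

# A large pro-vaccination majority (arbitrary units) wipes out the smaller side:
print(mass_action_attrition(pro=10.0, anti=2.0))
# roughly (9.8, 0.0) -- the opposite of the growth actually observed in 2019
```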
The researchers also found anti-vaccination communities offer more diverse narratives than pro-vaccination communities, which during the period reviewed mostly offered monothematic messaging focused on the established public health benefits of vaccinations.
“We thought we would see major public health entities and state-run health departments at the centre of this online battle, but we found the opposite,” Johnson says. “They were fighting off to one side, in the wrong place.”
In their paper, the researchers propose several strategies to fight online disinformation, including influencing the heterogeneity of individual communities to delay the onset of their growth and slow it, and manipulating the links between communities in order to prevent the spread of negative views.
They acknowledge that their analysis is incomplete and that other channels of influence should be explored but suggest that similar behaviours should arise in any online setting in which clusters can form.
And they conclude: “One may also wonder about external agents or entities – however, clusters tend to police themselves for bot-like or troll behaviour. The crudely power law-like distribution of the cluster sizes of anti-vaccination clusters suggests that any top-down presence is not dominant.”
Originally published by Cosmos as Mapping online distrust in health expertise