“You can’t just hope to pour a big bucket of science into [people] as if they’re just an empty vessel waiting to be filled,” says Associate Professor Antony Eagle.
It’s a matter of volume. It’s a matter of taste. It’s a matter of digestibility.
Which means quality comes second.
And that’s the core of a modern phenomenon.
Three years ago, the RAND Corporation released a remarkable new study called ‘Truth Decay – An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life’.
The study characterised truth decay as four interrelated trends: an increasing disagreement about facts and analytical interpretations of facts and data; a blurring of the line between opinion and fact; an increase in the relative volume, and resulting influence, of opinion and personal experience over fact; and lowered trust in formerly respected sources of factual information.
It was by any standard a seminal moment in history: the midpoint of Donald J Trump’s disruptive presidency and a time when several events – the 2016 US presidential elections and Brexit among them – had cast an uncomfortably bright light on the truthfulness of people occupying high public office.
Many people were asking the question: what is truth?
Can it be denied simply by the use of the loudest megaphone?
If our elected leaders aren’t prepared to protect it, what hope is there for the rest of us?
The climate change ‘debate’ continued, despite an overwhelming majority of the world’s scientists accepting it as fact and ever-more-desperately counselling decisive action to mitigate it.
And finally, late in 2019, the SARS-CoV-2 virus emerged. As COVID-19 raced around the world, we were by turns confronted with information and disinformation. We witnessed national or state leaders and their scientific advisers striving for accuracy and honesty. And we saw federal or state leaders not only disregard the advice of their scientific advisers, but in some cases publicly denigrate those advisers.
Against this background, The Royal Institution of Australia decided to assemble an expert panel to discuss a fairly simple proposition: whether science had played a role in this widespread decline of truth. The panellists – Professor Caroline McMillen, South Australia’s Chief Scientist; former High Court judge the Hon Michael Kirby; and University of Adelaide science philosopher Associate Professor Antony Eagle – spent 40 minutes together late last month.
The truth will always be the truth.
Identifying it is the hard part.
What is the truth, anyway?
“I think truth is simple,” says Eagle. “It is what it is. It’s not another thing. When what you say corresponds to how things are, then what you say is true. But obviously, it’s very contested. It’s not always easy to find out.”
But Kirby says he quickly learnt there was usually more to the truth than meets the eye.
“When I was a young lawyer, I thought: ‘It’s obvious – in a trial, there’s only going to be one truth – and that’s the one that is either found by the jury or found by the judge.’”
Then he began to encounter equally qualified, articulate and honest experts – “and their opinions were absolutely different”.
McMillen argues science shouldn’t be confused with truth, even though it’s often presented as such.
“Truth is a noun,” she says. “Science is actually a verb – it’s a doing thing. So, in a sentence, science is not about seeking the truth. It’s about seeking to understand, explore, and then begin to sift some of that information to see how it might be harnessed to bring about an improvement.”
But unless that understanding is as close to the truth as possible, aircraft start falling out of the air. Teslas start crashing. And pandemics breach barricades.
If science produces it, will people believe it?
“You can’t just sort of hope to pour a big bucket of science into them as if they’re just an empty vessel waiting to be filled,” Eagle says. “It’s just not the right model for understanding how people want to engage with science. They don’t want to consume it by the pound.”
Information, however, is power.
“We are in a world in which facts – and the analysis that has led to those – can be contested from an armchair,” says McMillen. “It’s a world in which opinion and fact are often difficult to sort out. And opinions are placed in a way in which you lean towards them – they confirm your innate thinking, they confirm your belief system.”
Misinformation, however, isn’t always malicious. Eagle says: “A lot of the things that we’re talking about arise when – even among people who agree that there is an objective truth out there – they just disagree on what it is. But we know that a genuine diversity of viewpoint can improve the reliability of the decision-making process.”
Groupthink, however, can be weaponised in the service of an agenda.
“So I think part of what we see in this phenomenon of truth decay is an almost unintended, undesired side effect of something very desirable – a greater diversity of people getting to express their views,” he adds.
But Kirby warns that our modern world of aggregated and algorithmically assessed information makes people susceptible to manipulation. Information, be it good or bad, can also be targeted with pinpoint accuracy.
“Information is great. Getting more information is greater still. Getting the information of lots of experts – given that they will have different views – is even greater yet,” he says. “But the manipulation of information and mega data is going to be very bad. That’s going to manipulate people’s opinions in ways that we’re going to have – as a civilised society and science-respecting community – to take into account.”
Snake oil salespeople, prophets and political pundits have quickly mastered the art of social media persuasion.
It’s time for science to catch up.
“I think we’ve seen the power of algorithms, and we’ve seen them writ large,” says McMillen. “I think we have just a surface level of understanding why we see a disaffection emerging in communities. It’s fuelled by disinformation in part, and also by reinforcement of belief systems. And that tells us that we’re missing a lot in the way in which material is delivered.
“It’s delivered in ways that meet the needs of very many different people [where] we often have a monochromatic way of delivering knowledge.”
In its drive to maximise profit, a social media algorithm pushes the stories, pictures and videos it thinks individual users want to see. That, McMillen adds, is how “confirmation bias can grow and allow you to feel comfortable that you have explored all the issues”.
Marketers and propagandists have figured out the keywords and emotional triggers that drive engagement, along with the more mercenary tactics of sending a message “viral”.
Science and public institutions largely have not.
“We’ve got to be much more sophisticated because the algorithms will be beating us every time,” says McMillen. “We have to look at where those algorithms come from. We have to think really carefully about how we can take out the negative and the malicious. And we have to think about critical thinking.”
That remains an eternal constant. And it’s not the exclusive domain of science, she adds. Children are natural critical thinkers, questioning everything in order to make sense of their world. But only critical thinking will cause people to pause and question what motive an algorithm may have had in throwing something up on their social feeds.
Amid it all, Eagle remains hopeful.
“I’m optimistic that people are responsive to evidence, they’re looking for evidence, that they’re looking for reasons to shift their opinion or to modify their opinion.”