Why is it that for so many contentious science-based issues, explaining the science has so little impact on public concerns?
Think genetically modified foods, windmills, nanotechnology, embryonic stem cells, infant vaccination and so on, where scientists regularly stand up in front of a bolshie audience to try to explain the science in an attempt to address concerns.
It rarely works, does it?
This is because when public concerns are high, you need to throw traditional science communication methods out the window and turn to risk communication principles. Fortunately, a lot of work has been done on effective risk communication over the past few years, and it can be boiled down to some key ideas.
The first one is an adage coined by Vince Covello, one of the risk communication gurus out there. It goes: people don’t care what you know, they want to know that you care.
Put simply, don’t lead with the science when addressing a concerned audience, because their concerns are rarely about the actual science; they are generally looking for someone to acknowledge and validate those concerns first. You may get to explain the science eventually, but only at the invitation of the crowd.
Traditional science communication tends not to work so well in situations where there are:
- high levels of concern or outrage,
- low levels of trust,
- high perception of risk, and
- high circulation of alternative reports or different positions on the science that are getting a lot of traction with the public.
Many of the problems relating to communicating “risk” stem from the fact that scientific definitions of the word can be very different from public perceptions of it. You might be familiar with the scientific definition of risk as:
Risk = probability x impact
But the public view of risk is more like:
Risk = OMG x WTF.
To greatly simplify things, we have a scientific perspective of an issue that is predominantly based on data, competing with a perspective that is often based on emotion (“this might harm my baby”, “it doesn’t feel natural”, “I don’t like the idea of it”). You effectively have people speaking entirely different languages, like Farsi and Catalan, and not getting why the other person doesn’t understand them.
To quote a famous science communicator of yesteryear: “Why is it so?”
Well, the heart of the problem is the way we are wired psychologically. It can lead us to distortions of perception, inaccurate judgments or illogical interpretations. If you need evidence of this, think about the last heated conversation you had with a colleague about the merits of their favourite football team or their political affiliation – and let’s not even start on the marriage equality debate.
If you analyse the arguments being put forward, you will quickly see that many people form an opinion first and then go looking for data to confirm it. This is commonly known as confirmation bias, and it really gets rusted on when we only follow media channels that support our world view – the echo chamber effect.
For science communicators, these are things we just need to work with, along with other cognitive biases such as the backfire effect, where showing people information that demonstrates their opinion is wrong just results in them holding that opinion more strongly. Or the amplification of risk, whereby the more people with opposing points of view talk about a topic, the less likely they are to ever see the other’s perspective.
If you’ve ever attended a public forum on any hot issue, you’ll see this in action.
Most people, when faced with a matter related to science and technology, adopt an initial position of support or opposition based on a variety of mental shortcuts and predisposed beliefs rather than on any scientific evidence.
This explains how, if you have pro-development or anthropocentric values you can boldly state that we should respect the science on GMOs, but insist the science on climate change is a bit dodgy. And, on the other hand, if you value nature over development, you can boldly state that we should respect the science on climate change, but that the science on GMOs is a bit dodgy.
Your position is based on the need to align your opinion with your values.
Another great risk communication adage is that opinions that were not formed by logic or facts cannot easily be changed by logic or facts.
So, what is a person to do about it?
Peter Sandman, another great guru of risk communication, has said that effectiveness arises from diminishing outrage, not explaining the data.
Towards this end, it is important to know that research also indicates that when people are stressed, their perceptions and decisions are influenced by a wide range of factors, and technical facts or technical expertise are about the least important of them, worth around 10% of impact, compared to empathy and listening, which account for about 50%.
If you engage with communities or individuals on heated science topics, these facts might be stressing you out – but calm down, I understand your concerns and they are perfectly valid. Then tell me a little more about the way you value science-based information, and how important that is to you. Again, I understand what you are saying.