It seems the political hyper-partisanship engulfing the United States has found yet another victim: science. New research shows that political and religious orientations are strongly associated with polarised views of scientific consensus.
There’s a twist, however: the more scientific education and literacy a person has, the more polarised their views are likely to be. These puzzling findings are outlined in a paper published in the Proceedings of the National Academy of Sciences, authored by Caitlin Drummond and Baruch Fischhoff of Carnegie Mellon University.
The pair studied data from the General Social Survey about Americans’ views on six controversial topics: human evolution, the Big Bang, stem cell research, anthropogenic climate change, genetically modified foods and nanotechnology. For the first four issues there was significant polarisation among respondents, while the last two showed little evidence of it.
Respondents who identified themselves as politically and religiously conservative were far more likely to reject scientific consensus on the polarised issues, while those who identified as liberal were more likely to accept it.
{%recommended 433%}The baffling part of the finding, Drummond says, is that “individuals with greater education and science knowledge tend to be more polarized”. On polarised issues such as climate change and evolution, greater knowledge was associated with conservatives being more likely to reject the scientific consensus and liberals more likely to accept it.
For other subjects that are controversial but “have not become part of these larger social conflicts in America”, such as genetically modified food, Drummond and Fischhoff found no connection between education and polarisation.
So how to explain this? One possible explanation the authors offer is ‘motivated reasoning’, the idea that “more knowledgeable individuals are more adept at interpreting evidence in support of their preferred conclusions”. The authors also speculate that “better educated people are more likely to know when political and religious communities have chosen sides on an issue, and hence what they should think (or say) in keeping with their identity”.
This, of course, has substantial implications for science communication efforts. Drummond suggests that “science communication on polarized topics should take into account not just science itself, but also its context and its implications for things people care about, such as their political and religious identities.” While pragmatic, this may be a bitter pill to swallow for those who think that science should stand or fall on its epistemic merits.
There was one positive finding: greater trust in the scientific community was associated with greater agreement with the scientific consensus. Perhaps, then, scientists and science’s advocates need to work on building such trust on both sides of the aisle.