Why clever people believe silly things

Yes Virginia, clever people do believe in stupid things. And you can blame it all on evolution. Craig Cormick reports.


SO WHY DO SO many otherwise clever people believe in paranormal events, or the benefits of fringe medicines and the dangers of infant vaccination – despite there being no real evidence to support their beliefs?

In Australia about half the population believes in ESP (extra-sensory perception, such as telepathy) and one third believes in UFOs as evidence of extraterrestrial visitation. Other surveys indicate that about 80% of the population hold at least one paranormal belief – which includes astrology.

And a 2005 survey published in the Medical Journal of Australia stated that half of all Australians are using alternative medicines – and one in four are risking their health by not telling their doctor they are doing so.

We’ve probably all met somebody at a party trying to convince us of the benefits of the latest alternative therapies, which is harmless enough.

But it does become an issue of societal concern when we see fringe beliefs, based on non-scientific values, leading to people dying from putting their trust in natural therapies or faith healing when Western medicine could have saved them.

A U.S. National Science Foundation study found that almost nine in 10 Americans agreed that there were some good ways of treating sickness that medical science did not recognise, while four in 10 Americans had used alternative therapies. This is similar to Australian data, where such beliefs are more common among well-educated, upper middle-class women.

THE ISSUE MOST UNDER the spotlight here is infant vaccination and the belief in its link to autism or other nasty side effects – an erroneous belief that has persisted despite the original study by Andrew Wakefield linking vaccination with autism being discredited and retracted by The Lancet in February 2010.

In addition, his co-authors withdrew support for the study’s interpretations, other researchers were unable to confirm or reproduce his results, and there have since been revelations about undisclosed financial conflicts of interest on Wakefield’s part.

The reasons for the persistence of this belief in a link between vaccination and autism – despite the evidence – are complex. But understanding why it persists is important if we believe there is a need to counter the growth of anti-science thinking in society.

Ben Goldacre – the British doctor and author of the Guardian column, book and blog called Bad Science – coined a phrase that is crucial for us to examine: “Why clever people believe stupid things”.

In the U.S., where the anti-vaccination movement has really taken off, the Centers for Disease Control and Prevention in Atlanta estimates that one in five Americans believes that vaccines can cause autism, and two in five have either delayed or refused vaccines for their child.

And in Australia, according to the Australian General Practice Network, vaccination rates have been dropping over the past seven years to the point that only 83% of four-year-olds nationally are covered – which is below the 90% rate needed to assure community-wide disease protection and prevent outbreaks of fatal, but preventable, diseases.
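That 90% figure reflects the standard herd-immunity threshold: for a disease with basic reproduction number R0, sustained transmission stops once a fraction 1 − 1/R0 of the population is immune. A minimal back-of-envelope sketch – the R0 values below are illustrative assumptions, not figures from this article:

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune to block sustained transmission is 1 - 1/R0, where R0 is the
# basic reproduction number (average infections per infected person in
# a fully susceptible population).

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1.0 - 1.0 / r0

# Illustrative R0 values (assumptions for this sketch, not article data).
for disease, r0 in [("measles", 15.0), ("whooping cough", 14.0), ("polio", 6.0)]:
    print(f"{disease}: R0 = {r0:.0f} -> {herd_immunity_threshold(r0):.0%} immunity needed")
```

For highly contagious diseases such as whooping cough, the threshold sits above 90% – which is why coverage of 83%, let alone 70%, leaves room for outbreaks.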

In some areas, usually where there are high pockets of alternative lifestyles – such as southeast Queensland, the northern rivers of New South Wales, the Adelaide Hills and the southwest of Western Australia – vaccination rates are as low as 70%.

The problem is not just that non-scientific beliefs can be very strongly ingrained in people, but that such beliefs are unlikely to ever be influenced by scientific fact.

So should we be concerned? Well, only if we think that the dangers of non-science and pseudoscience are tangible, and that widespread support for non-scientific beliefs can impede a society’s ability to function, or compete, in an ever more complicated, science- and technology-driven world.

IF THE ANSWERS to those questions are yes, then we need to better understand the factors that make otherwise rational people subscribe to irrational beliefs – and, importantly, what might be done to prevent a growth in anti-science thinking.

Fortunately there is enough research in this area that can be drawn together to provide a fairly clear overview of why this happens.

At the heart of the problem, as outlined extremely well by Goldacre in his book Bad Science, is the way we are wired psychologically, which leads us into common errors of thinking that in turn produce distortions of perception, inaccurate judgments or illogical interpretations.

Social scientists call these ways of thinking ‘heuristics’: mental shortcuts we take as a way of responding to rapid and complex information being fired at us. We need to quickly sort new information into categories – and an easy way to do this is to sort it according to our existing belief systems or values.

This holds true for beliefs about genetically modified (GM) foods, the safety of nanotechnology, climate change, your favourite football club and so on – and the more complex the issue, the more likely people will make decisions based on beliefs or values.

IN AN IDEAL WORLD, we look at different information, analyse it carefully and make up our minds on a case-by-case basis. But that doesn’t work when we lack the motivation or ability to do so.

We are increasingly time-poor in an increasingly data-rich world; that forces us to make mental shortcuts more often, drawing upon whatever existing knowledge we have (all too often from the media rather than from formal education), or falling back on our basic beliefs.

And in the age of the Internet, the information-communication flows are entirely different to what we may have been used to, even a decade or more ago.

We all know that the promise of the Internet to provide us with a wealth of information to make us smarter was akin to the early hopes that television would make us more educated and could teach us many languages and so on.

Instead we are better at watching people dance and sing and cook on TV, and better at watching talking babies and satires of the Hitler bunker scene in the film Downfall on the Internet. And amongst the tsunami of irrelevant data on the web, we invariably end up hunting down data that supports our existing beliefs.

The Internet itself is not fully to blame – it is just a channel for information – but the sheer amount of data of dubious credibility it carries, which doesn’t readily distinguish comment from research or blog from news, has changed the relationship between information and attitude formation.

Where once as kids we might have started with the germ of a wacky idea and sought to check its validity with experts such as teachers, or even by reading an encyclopaedia, we now have the ability to pretty well find a community of people somewhere in the world with similar wacky ideas, never tested by an expert.

And through the Internet, we can also reinforce each other’s wackiness to the point it becomes a solid value that ain’t shifting for nobody, no how. Just Google ‘sexually abused by aliens’ or ‘sin causes cancer’ to see what I mean.

ACCESS TO THE enormous breadth of opinions on the Internet has revealed that people, when swamped with information and relying on mental shortcuts, follow this up with ‘motivated reasoning’. This means only acknowledging information that accords with our beliefs, and dismissing information that does not.

So if you believe UFOs are evidence of alien visitations, you would acknowledge every bit of data you found that supported the existence of UFOs being such, and would dismiss everything that argued against their existence.

And as a result you would only tend to find information that supported your beliefs, increasingly reinforcing them.
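This reinforcement loop can be caricatured in a few lines of code – a toy model of my own devising, not drawn from any study cited here – in which evidence arrives evenly for and against, but the reader only updates on items that confirm their current leaning:

```python
import random

def motivated_reader(initial_belief: float, n_items: int, seed: int = 0) -> float:
    """Toy model of motivated reasoning: evidence arrives 50/50 for and
    against, but the reader only 'acknowledges' items that agree with
    their current leaning, nudging belief further the same way."""
    random.seed(seed)
    belief = initial_belief  # 0.0 = firmly against, 1.0 = firmly for
    for _ in range(n_items):
        supports = random.random() < 0.5   # perfectly balanced evidence stream
        leaning_for = belief >= 0.5
        if supports == leaning_for:        # confirming items are accepted...
            belief += 0.05 if supports else -0.05
        # ...disconfirming items are simply dismissed (no update at all)
        belief = min(1.0, max(0.0, belief))
    return belief

print(motivated_reader(0.6, 200))  # a mild leaning hardens into certainty: 1.0
```

Even though the evidence is split exactly 50/50, the belief only ever moves in one direction, because the disconfirming half is never processed.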

Likewise with climate change, as we are seeing played out over and over again in public debates. Those who reject climate change as being human induced – or as happening at all – are not swayed by any scientific evidence, and cling tenaciously to any sporadic data that might seem to undermine it. Again, the reasons for this appear to lie in the way we are wired.

It has been well documented in surveys that those who are politically conservative tend to reject human-induced climate change, while those who are more politically left-leaning tend to accept it.

BUT IT IS NOT a person’s politics that are the key drivers of attitudes here, but our underlying beliefs and values that, in turn, affect our political alignment.

If your underlying belief system is that humans should dominate or tame nature (anthropocentrism); that economic growth is inherently good for society and should be maintained at all costs; and that an individual’s rights are paramount over the public good – then the idea that individual actions are actually causing damage conflicts so strongly with that belief system that you instinctively reject it.

Likewise, if you believe that humanity must live in equilibrium with the planet (geocentrism); that we need to put the brakes on material progress to be more sustainable; and that the public good is more important than individual rights – then the concept of human-induced climate change aligns well with that belief system and you will accept it very easily.

People then shop around for the data that most supports their existing values. And if you can’t find it in scientific studies, rest assured you will find it somewhere else, like on the Internet.

An interesting statistic from a 2003 PhD study by Cathy Fraser at the Australian National University in Canberra into vaccinating and non-vaccinating parents – all of whom had access to the standard Health Department publications on vaccinations – shows that while only 1.6% of vaccinating parents used the Internet for more information, 36.2% of non-vaccinating parents sought data from it.

So is getting more good facts out there the answer? Maybe not. Brendan Nyhan at the University of Michigan in the U.S. undertook a study which found that when people were shown information proving that their beliefs were wrong, they actually became more entrenched in their original beliefs. This is known in the business as ‘backfire’.

And what’s more, highly intelligent people tend to suffer backfire more than less intelligent people do – making us immune to any facts that counter our strongly held beliefs. The adage that attitudes not formed by logic and facts cannot be influenced by logic and facts holds true here.

So what about providing the public with more balanced and factual information?

Well, that can be a problem too. When you present the public with both sides of a story, giving them the arguments for and against, research shows that people with an existing attitude tend to become more entrenched in their original viewpoint, and are less likely to see the merit of other viewpoints. This research, conducted by Andrew Binder at North Carolina State University in the U.S., found that most people, when faced with an issue related to science and technology, fairly quickly adopted an initial position of support or opposition, based on a variety of mental shortcuts and predisposed beliefs.

And then, the more people with opposing points of view talked about divisive science and technology issues – GM food, nanotechnology, stem cells, take your pick – the less likely the different camps were to agree on any issue, or even to see it the same way.

Binder stated, “This is problematic because it suggests that individuals are very selective in choosing their discussion partners, and hearing only what they want to hear during discussions of controversial issues.”

This means that the media’s tendency to seek balance in their stories, particularly on contentious topics – giving one side of the argument on, say, climate change or GM foods, and then giving the other – can actually exacerbate this problem of polarised, extreme opinions. The next thing we need to know is that the dismissal of facts and figures increases when somebody is highly emotional about a topic.

SO LOOK AROUND and see who is playing the ‘scare card’ and whipping up a bit of emotional concern about topics. The more agitated, scared, upset or angry we are, the more receptive to emotive messages we are – and less receptive to facts.

Which brings us to the fear factor. Former U.S. President Franklin D. Roosevelt once said, “We have nothing to fear but fear itself.” If only.

According to Frank Furedi, professor of sociology at the University of Kent in England and author of The Precautionary Principle and the Crisis of Causality, we are losing our capacity to deal with the unknown because we increasingly believe that we are too powerless to deal with the perils confronting us.

He says that one of the many consequences of this is a growth in policies designed to deal with threats or risks that are increasingly based on feelings and intuitions, rather than on evidence and facts.

Jenny McCarthy, celebrity leader of the anti-vaccination movement in the United States, says she bases her rejection of vaccines on intuition. Likewise, many alternative therapists advocate that people should put their trust in their own intuition to justify their choices.

SO WHEN PEOPLE CHOOSE not to vaccinate, it’s not because they are stupid – it’s because their fear of the harm from vaccination has become stronger than their fear of the harm from not vaccinating – even though the evidence shows the exact opposite.

The diseases that we vaccinate against are these days unknown and unseen – we no longer see children dying from whooping cough or suffering from polio. However, what we do see are stories of children suffering autism and other conditions supposedly as a result of vaccinations.

And so, no matter how small a risk this might be, it is one that is visible and known – and therefore much more prominent a risk.

A serious outbreak of whooping cough or measles might change all this, of course – a dangerous possibility at the moment in some parts of Australia and the U.S., where in California alone there were over 7,800 cases of whooping cough in 2010, with 10 deaths, due to lack of vaccination.

U.S. health officials at the turn of the 20th Century, when facing a large public rejection of smallpox vaccinations, talked about the need for a ‘fool killer’ – an outbreak of smallpox devastating enough to convince people of the need for vaccinations and overturn people’s intuitive mistrust of them.

Furedi argues that this reliance on intuition – which has served us well for tens of thousands of years, stopping us from stepping out of the safe cave into the dangerous dark of night – can also lead to false beliefs such as superstitions, paranormal phenomena and pseudoscience.

And according to Bruce Hood from the University of Bristol in Britain, humans have evolved to be susceptible to supernatural beliefs – despite the lack of evidence for them. He has postulated that the human mind is adapted to reason intuitively and to understand unobservable properties, such as what makes something alive, or what motivates other people.

On the plus side, it is this intuitive thinking that has led to many scientific theories and understandings, such as gravity; but it also leaves us prone to irrational ideas.

Furedi has stated that misconceptions about the workings of the world around us, such as astrology and other supernatural beliefs, are due to naïve intuitive theories.

PSYCHOLOGISTS MARJAANA Lindeman and Kia Aarnio from the University of Helsinki in Finland have gone one step further and described this as ‘immature errors of reasoning’, on a par with those of children still learning about the natural world.

They say there are three major sorts of knowledge that determine children’s understanding of the world: intuitive physics, which is an understanding of the physical world; intuitive psychology, which is an understanding of how people think and behave; and intuitive biology, which is an understanding of the principles and forces of life.

When we mix these up, they argue – such as investing physical objects with powers like ‘healing crystals’ – we are suffering from ‘ontological confusion’.

This confusion underpins many alternative health treatments based on the belief that thought can alter health outcomes, or that touch can convey healing powers.

Similarly, they state that cognitive errors underlie homeopathy, reiki, healing by touch, distance healing and birth affirmations, which are often based on attributing life forces to physical events.

WHICH BRINGS US TO the next thing we need to better understand – the impact of uncertainty and control. At the heart of a lot of our non-science beliefs is a need for more control.

We live in an ever more uncertain and out-of-control world, but superstitious beliefs and pseudoscience can give people a sense of control and certainty, providing simple answers that reduce the stress of not having control – again, an adaptive mechanism that we tend to be wired to seek out.

But here’s the cruncher: science is predominantly based on uncertainty, while fringe beliefs are often based on providing more certainty. We are actually wired to favour non-scientific beliefs and values in many cases.

So what are we to do? Here’s the issue boiled down simply: we are living in a technology-driven world for which our innate instinctive reasoning poorly equips us.

There is some good news though, as evidence shows that adults with more science training will more often reject astrology or lucky numbers, and more often accept evolution.

Likewise a 2002 PhD study by Alyssa Taylor from the University of Virginia found that a course on critical thinking led to a significant decline in belief in paranormal claims.

However we need to temper this finding with the results of a Canadian study that found that a 13-week lecture course critically examining paranormal beliefs led to a reduction in belief from 56% to 41%. But that figure crept back up to 50% a year later.

So we clearly need to educate people before attitudes and beliefs are strongly formed. And in this it is more important to teach them how to think than what to think. The only way to make people bullet-proof to pseudoscience is to effectively teach the values and ways of science thinking while still young, before alternative belief systems have formed.

There is no guarantee it will work with everybody, which is evidenced by the many mixed attempts of totalitarian regimes to indoctrinate their young into certain beliefs and values.

But without it we are left at the mercy of our mental shortcuts, our fears, our intuitions and our desire for simple answers to complex issues.

These will not serve us very well for the challenges of the future: particularly when marketing gurus, such as Australian social researcher and demographer Mark McCrindle, tell us that modern consumers are much more engaged on an emotive scale than a cognitive scale.

AMERICAN ASTRONOMER, astrophysicist and cosmologist Carl Sagan was an early advocate of science-based thinking over non-science thinking, and argued in his book The Demon-Haunted World that scientific thinking was necessary to safeguard our democratic institutions and our technical civilisation.

He said we need to teach both the scepticism and wonderment of scientific thought.

If it was widely understood that any claim to knowledge needed adequate evidence before it could be accepted, he said, then there would be no room for pseudoscience.

So we should judge a society’s scientific literacy not on what we do or don’t know – but on how we think.

Despite surveys that regularly criticise society’s lack of knowledge of things as vital as how many kilometres the Sun is from the Earth, or whether the oxygen in the air we breathe occurs naturally or is released by plants, it’s more important that we are educated in how to make decisions based on evidence, not on vague claims that align with our emotions.

Without that we will continue to vainly argue science facts against non-science values, in an arena where facts and logic have little impact.

Julian Cribb, a Canberra-based science communicator and author, recently described the implications succinctly: “It is not in anybody’s interests for Australia to become more technologically backward, belief-driven, irrational, or based-on-bullshit rather than on hard-won, meticulously gathered evidence and its skilled analysis.”

BUT – AND THIS IS A VERY BIG BUT – we need to be clear that the overall purpose of understanding the drivers of belief in pseudoscience or alternative beliefs is not to ridicule, but to understand.

The “Ha ha ha, aren’t you dumb” approach, common among some sceptics and critical thinkers, wins few arguments. It might feel easy to triumphantly declare that one way of looking at the world is superior to another, but that fails to note that some people get enormous purpose and meaning from the way they see theirs.

But what do we make of the fact that research – yes, scientific research – shows that those who believe in, and invoke, good luck or blessings, tend to have higher performance scores across a range of tests than those who don’t?

Or the more tricky ethical question of who has the right to say that traditional beliefs are based on superstitions, not science, and are therefore less valid?

We are also wired to divide people into ‘us’ and ‘them’ camps – which Commonwealth Scientific and Industrial Research Organisation (CSIRO) science communicator Mike McRae, in his recent book Tribal Science: Brains, beliefs and bad ideas, describes as our tribes of like-minded beliefs.

But if we can rise above instinctive fears to embrace a scientific evidence-based approach to thinking, we can surely rise above instinctive tribalism and look for points of common values that allow for a complexity of world views.

Many scientists are Christians – or Muslims, Hindus or Jews – or regularly consult horoscopes. That is achieving a compatibility of divergent beliefs.

At the other extreme, every individual on the planet has the right to be a contender for the Darwin Awards, a tongue-in-cheek honour created by American scientist Wendy Northcutt to recognise those who have contributed to human evolution by “removing themselves from the gene pool” through beliefs or acts of amazing stupidity that are ultimately fatal. That’s evolution at work.

But to allow any dangerous beliefs or behaviours to spread unchallenged throughout society – well, that’s detrimental to the collective gene pool. And that’s something we must challenge, no matter whether the challenge is based on our instinct or our scientific reasoning.

Craig Cormick is the President of the Australian Science Communicators. His communication research has been published in journals including Nature and Cell – which he suspects no one has ever read.