
Better climate models show no end to warming

Why can't climate models explain the "pause" in global warming? Scientists are beginning to home in on the answer. James Mitchell Crow reports.

The decade hosted nine of the 10 hottest years on record. – ISTOCK

The startling fact that warming had “paused” from 2000 to 2009, which made global headlines in May, led climate sceptics to proclaim victory, while some scientists hailed it as a second chance to forestall catastrophic climate change.

COSMOS thinks it speaks more to our worrying inability to predict the course of climate change. Spaceship Earth is at sea with the weather-forecasting skills of a 19th-century mariner. And ask any captain – these days the storms are bigger.

What we have no doubts about is that CO2 levels are rising. On 9 May this year they reached 400 parts per million (ppm), more than 40% above the pre-industrial baseline of 280ppm. Planet Earth has not held this level of CO2 in its atmosphere for three million years, when alligators basked on the banks of Antarctic streams.

It’s important to remember the thermometer did in fact rise. The decade hosted nine of the 10 hottest years on record. It’s just that on average, temperatures rose a lot more slowly than in the decade before.

“You can’t reasonably expect that you won’t see flat spots – it has happened in the past and will happen in the future,” says Roger Bodman at Victoria University in Melbourne. One easy explanation is that the atmosphere and ocean may be taking turns at heating up. We could be seeing a pause as the ocean acts like a heat sponge, before it becomes saturated and air temperature takes off again. Bodman thinks natural climate variability is behind the present pause in warming, which he says falls well within the expected range of our models. But herein lies the big problem: our models.

Playing with models has never been such a high-stakes game. According to the Intergovernmental Panel on Climate Change (IPCC), if the world doesn’t curb the soaring rate of CO2 emissions, the baseline global temperature is expected to rise by 4°C by the end of the century – way beyond the 2°C considered the safe threshold. But that figure is merely a “best estimate”, and the panel’s “likely range” suggests the actual temperature rise could be anywhere from 2.4 to 6.4°C – the difference between a bumpy ride and a catastrophe.

Policymakers need more certainty if they are to be galvanised. Climate modellers are responding by plugging in more data. Alexander Otto at the University of Oxford and his colleagues decided to play devil’s advocate. What if they made a model based only on the past 10 years of slowly rising average temperature? The team calculated that the “transient climate response” – a measure of the speed with which the atmosphere warms up when given a kick of carbon dioxide – would be lower than IPCC predictions. In other words, temperatures that the IPCC calculations predict we will experience in 2050 might not arrive until 2060 or even 2065, potentially buying us a little more time to avoid catastrophic climate shifts.
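To see why a decade of slow warming pulls the estimate down, consider how energy-budget studies of this kind relate observed warming to transient climate response: roughly, TCR ≈ F₂ₓ × ΔT / ΔF, where F₂ₓ is the forcing from doubling CO2, ΔT the observed temperature change and ΔF the forcing change over the period. The sketch below uses this relation with illustrative placeholder numbers – they are not figures from Otto’s paper.

```python
# Illustrative sketch of an energy-budget estimate of transient climate
# response (TCR): TCR ~ F_2x * dT / dF. All numbers are placeholders
# for demonstration, not data from any particular study.

F_2X = 3.7       # W/m^2, approximate radiative forcing from doubling CO2
delta_T = 0.75   # K, assumed observed warming over the chosen period
delta_F = 2.0    # W/m^2, assumed change in forcing over the same period

tcr = F_2X * delta_T / delta_F
print(f"Estimated TCR: {tcr:.2f} K per CO2 doubling")
```

Because ΔT sits in the numerator, a decade in which temperatures rose only slowly (a smaller ΔT for the same ΔF) mechanically yields a lower TCR – which is exactly the effect Otto’s devil’s-advocate exercise produced.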

According to the IPCC, temperature rise could be anywhere from 2.4 to 6.4°C – that's the difference between a bumpy ride and a catastrophe.

But when Otto fed data from the past four decades into his model, the predicted temperatures fell in line with the IPCC’s calculations.

Bodman and his colleagues at the University of Melbourne published their climate models using recent data in Nature Climate Change on 26 May. They focused on one of the biggest bugbears of current models – the impact of rising temperatures on the carbon cycle.

To reduce uncertainty, they compared how temperature change affected the carbon cycle in the recent past. Unlike Otto’s work, the study predicts no temporary reprieve; it came up with a figure of 4.1°C by 2100, 0.1°C above the IPCC figure.

“What we were able to do was narrow the uncertainty range,” says David Karoly, a co-author of the study. “This means the risk of exceeding 6°C is less likely, but the risk of exceeding 2°C – the threshold agreed as being important to avoid – is virtually certain under business-as-usual conditions.”

Narrowing the uncertainty range without significant change to the “best estimate” temperature prediction is a story that has been repeated over and over as climate models improve, says Dave Griggs, former head of the IPCC science working group secretariat and current director of the Monash Sustainability Institute in Melbourne. “For the past 20 years predictions from climate models have been pretty stable, which is quite remarkable when you consider how crude they were back then.”

That’s not to say there isn’t more work to do. The models frequently still fall down in predicting local weather patterns.

Two papers published this year highlight the scale of the problem. Writing in Nature, Jessica Tierney and her colleagues at the Woods Hole Oceanographic Institution in Massachusetts highlighted the need for better models to explain decreasing rainfall in East Africa, which contradicts model predictions that the area would get wetter. Similarly, Sloan Coats at Columbia University in New York and his co-workers were unable to accurately reproduce known patterns of historical ‘megadroughts’ in the southwestern US, and so could not predict when future severe droughts might occur.

James Mitchell Crow is a freelance writer and editor.