Ancient exploding stars showered Earth – and life – with a long-lasting blast of radiation equivalent to an extra CT scan per animal each year, new research suggests, and the fallout may have cooled the planet enough to trigger a minor mass extinction.
Brian Thomas from Washburn University in Kansas and colleagues from the US and Europe modelled the effects of two supernovae, around 300 light-years away, on the planet’s atmosphere and at ground level. They found that while the bright blasts would have lit up the night sky for a few weeks, the high-energy cosmic rays the explosions also emitted tripled the usual background radiation dose.
This, they write, had “substantial effects on the terrestrial atmosphere and biota” by cooling the planet and causing more mutations in organisms.
Earlier this year, rocks on Earth and the moon showed our solar system was bombarded with supernova scraps – one event 1.7 to 3.2 million years ago and another 6.5 to 8.7 million years ago. (There were certainly earlier supernova showers, but the detection method – which relied on finding minute amounts of iron-60 – was only sensitive enough to pick up remnants fired out by the closest, most recent explosions.)
Following the supernovae, there was certainly a brilliant flash, with gamma-rays and X-rays washing over the planet alongside visible and ultraviolet light. But, Thomas and colleagues calculated, the light show would have been fairly short-lived – over in a few weeks.
The long-lasting effects, they found, came from cosmic radiation. Cosmic rays – particles travelling through space at close to the speed of light – naturally rain down on Earth. Our atmosphere and magnetic field shield us from much of this radiation, but some of the particles created when cosmic rays strike the atmosphere are slippery enough to whiz straight through rock.
One such slippery particle is the muon. It carries a negative charge and has around 207 times the mass of an electron.
Thomas and colleagues found the supernovae would have increased muon bombardment 20-fold – and as muons contribute around a sixth of our normal radiation dose, radiation levels on Earth tripled, says study co-author Adrian Melott. This uptick may have increased genetic mutations, boosting cancer rates and, potentially, the pace of evolution.
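As a rough back-of-envelope check using the figures quoted here – and assuming muons account for exactly one-sixth of the normal dose, which is a rounded value – scaling only the muon share by 20 while leaving the rest unchanged gives

5/6 + (20 × 1/6) = 25/6 ≈ 4.2 times the normal dose

So the quoted numbers imply roughly a three- to four-fold rise, in line with the reported tripling once rounding is allowed for. Taking a typical background dose of a few millisieverts a year – a widely cited figure, not one given in the study – the extra exposure works out to several millisieverts annually, comparable to a single CT scan.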
And as the cosmic rays bumped and jostled their way through the troposphere – the lowest layer of the atmosphere – they would have torn electrons from atoms in a process called ionisation. The ionisation rate, a whole order of magnitude higher than before, destroyed ozone and tripled nitrogen oxides in the upper troposphere, boosting lightning strikes and cooling the Earth for around 5,000 years.
A minor mass extinction during the transition to the Pleistocene epoch, around 2.59 million years ago, may be connected to this cosmic-ray cooling.
“Africa dried out and a lot of the forest turned into savannah,” Melott says. “Around this time and afterwards, we started having glaciations – ice ages – over and over again, and it’s not clear why that started to happen.”
But it’s an idea that, he concedes, is controversial.
The work, which you can find on arXiv, will be published in the Astrophysical Journal Letters.
Further reading:
How exploding stars shape life – and death – on Earth
Supernova scraps found on the moon