Glass from first nuclear blast tests theory of moon's origin
Remnants of a wartime atomic test could provide vital clues about the formation of the moon.
Glass made from sand melted during a nuclear test in 1945 may explain why the moon contains less zinc than Earth.
The first controlled detonation of a full-size nuclear bomb was the so-called Trinity test, which took place in the New Mexico desert in July 1945. The blast was small by the standards of modern nuclear weapons, unleashing 84 terajoules of energy – or roughly the equivalent of 20,000 tonnes of TNT.
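The TNT equivalence quoted above is easy to check: a tonne of TNT is conventionally defined as 4.184 gigajoules, so 84 terajoules does indeed work out to roughly 20,000 tonnes. A quick sketch of the arithmetic (the conversion constant is the standard convention, not a figure from the article):

```python
# Sanity-check the Trinity yield figures quoted above.
# One tonne of TNT is conventionally defined as 4.184e9 joules.
JOULES_PER_TONNE_TNT = 4.184e9

yield_joules = 84e12  # 84 terajoules, the yield cited for the Trinity test

tonnes_tnt = yield_joules / JOULES_PER_TONNE_TNT
print(f"{tonnes_tnt:,.0f} tonnes of TNT")  # roughly 20,000 tonnes
```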
Modest though the explosion was, a team of scientists led by James Day of the Scripps Institution of Oceanography at the University of California, San Diego, realised that the test site could be studied as a proxy for the high-pressure and extremely hot conditions that led to the moon’s formation some 4.5 billion years ago.
In particular, the team – writing in Science Advances – wondered if studying the ground around the Trinity site’s ground zero could explain why zinc is far less common in lunar rocks than in rocks on Earth.
Zinc is thought to have been present in planetary bodies from very early in the formation of the solar system. It is a volatile element, with a comparatively low boiling point – which means it evaporates readily under conditions of extreme heat.
The low levels of lunar zinc, researchers suggest, could be evidence for the leading theory of the moon’s origin – that it formed when a Mars-sized object slammed into the proto-Earth. The result was a moon-to-be so superheated that many of its volatile elements simply boiled away.
“Experiments or natural analogs approximating these early conditions are limited,” notes Day’s team.
The Trinity test site, though, offered just such an analogue. When the bomb went off, it fused the surrounding sands into a distinctive green glass – dubbed trinitite – out to a radius of about 350 metres from ground zero.
Measuring the zinc content of this glass at various points – from 10 metres from ground zero outwards – should, the researchers reasoned, provide an index of zinc depletion.
The hypothesis turned out to be correct: zinc content correlated well with distance from ground zero. The higher the temperature – and at the blast site temperatures hit 1,900 °C – the greater the zinc loss.