We owe the discovery of radioactivity to bad weather. In 1896, French physicist Henri Becquerel was studying fluorescence, a phenomenon in which certain materials glow after exposure to sunlight. Overcast days thwarted his experiments, so he wrapped his fluorescing uranium salts in cloth and left them in a drawer, along with a photographic plate and a copper cross. This serendipitous accident revealed the existence of radioactivity, a phenomenon that opened a window into the subatomic world and kick-started the nuclear revolution.
When he finally fetched the salts, Becquerel found that an image of the cross had appeared on the photographic plate – even though the salts had not been exposed to light.
“I am now convinced that uranium salts produce invisible radiation, even when they have been kept in the dark,” he wrote after conducting further experiments.
Becquerel’s doctoral student, Marie Curie, investigated the matter with her husband Pierre and they realised the effect had nothing to do with fluorescence, instead discovering that certain materials naturally emit a constant flow of energy. They coined the term ‘radioactivity’ and also found two new radioactive elements: polonium and radium. For this profound and exciting work, Becquerel and the Curies received the Nobel Prize for Physics in 1903.
Physicists Ernest Rutherford and Frederick Soddy delved deeper and found that tiny amounts of matter contain huge reserves of energy. They also realised that in the process of radioactive decay, one element can turn into another – an atom of uranium can transform (via a few intermediate steps) into an atom of lead.
Around the world, people assumed that these miraculously energetic materials could be put to good use. Until the 1920s, many manufacturers of laxatives and toothpaste proudly laced their products with radioactive thorium, and radioactive substances were not banned from consumer products in the US until 1938.
How does radioactivity work?
Today we have a much more comprehensive understanding of what radioactivity is, how it can be dangerous, and how we can use it.
Here’s a basic rundown: imagine an atom, composed of a cloud of electrons around a central nucleus where particles called protons and neutrons are crammed together. Some arrangements of protons and neutrons are more stable than others; if the balance of neutrons to protons strays too far from the stable range, the nucleus becomes unstable and eventually decays. This decay releases nuclear radiation in the form of alpha particles, beta particles and gamma radiation.
An alpha particle carries off two protons and two neutrons, and since an element is defined by its number of protons, the parent atom becomes a whole new element when an alpha particle is emitted. In beta decay, a neutron transforms into a proton and an electron (plus an antineutrino); the electron speeds off, leaving an extra proton behind and once again producing an atom of a different element. Alongside either of these particles, decaying nuclei can also emit gamma rays: high-energy electromagnetic radiation.
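This bookkeeping is simple enough to follow in code. Here’s a toy sketch in Python (the function names are invented for illustration, but the decay rules and the uranium-238 example are the standard physics):

```python
# Toy model of how alpha and beta decay change a nucleus.
# A nuclide is represented as (Z, A): proton count and mass number.

SYMBOLS = {88: "Ra", 89: "Ac", 90: "Th", 91: "Pa", 92: "U"}

def alpha_decay(z, a):
    """Alpha decay carries off 2 protons and 2 neutrons."""
    return z - 2, a - 4

def beta_decay(z, a):
    """Beta decay turns a neutron into a proton (plus an electron
    and an antineutrino), so Z rises by 1 and A is unchanged."""
    return z + 1, a

# The first steps of the real uranium-238 decay chain:
z, a = 92, 238                 # uranium-238
z, a = alpha_decay(z, a)       # -> thorium-234
print(SYMBOLS[z], a)           # Th 234
z, a = beta_decay(z, a)        # -> protactinium-234
print(SYMBOLS[z], a)           # Pa 234
z, a = beta_decay(z, a)        # -> uranium-234
print(SYMBOLS[z], a)           # U 234
```

Chaining the two rules like this reproduces how uranium transforms, step by step, towards lead.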
What are the health effects?
As Becquerel and the Curies discovered, radioactivity is a naturally occurring phenomenon. Many minerals in the Earth emit a slow and steady trickle of radiation, the air we breathe contains radioactive gases, and even our food and bodies contain a small proportion of radioactive atoms such as potassium-40 and carbon-14. The Earth also receives radiation from the Sun and from high-energy cosmic rays. Together, these sources create a natural and unavoidable level of background radiation. Many artificial sources add to this, including medical procedures such as X-rays, as well as smoke detectors, building materials and combustible fuels.
We generally aren’t harmed by low-level background radiation, as the extent of harm depends on both the dose and the duration of exposure. Radiation can disrupt the body’s internal chemistry, breaking chemical bonds in our tissues, killing cells and damaging DNA, which may lead to cancer. In very high doses, radiation can cause sickness and death within hours.
Harnessing nuclear power
The effects of radioactivity have been felt on a far grander scale in the meltdowns of nuclear power plants. The radioactive process of fission has been harnessed for several decades to produce electricity: the nucleus of a heavy atom is split, creating two or more "daughter" nuclei and releasing energy as heat. That heat is used to boil water into steam, which turns a turbine and generates electricity.
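The energy released comes from a tiny loss of mass, via Einstein’s E = mc². A back-of-the-envelope calculation in Python shows the scale (the split shown is just one of many possible fragment pairs, and the mass values are approximate):

```python
# Rough fission energy from the mass defect (E = mc^2).
# Atomic masses in unified atomic mass units (u); values are approximate.
U_235  = 235.043930   # uranium-235
N      = 1.008665     # neutron
BA_141 = 140.914411   # barium-141 fragment
KR_92  = 91.926156    # krypton-92 fragment
U_TO_MEV = 931.494    # energy equivalent of 1 u, in MeV

# One possible split: n + U-235 -> Ba-141 + Kr-92 + 3 n
mass_before = U_235 + N
mass_after  = BA_141 + KR_92 + 3 * N
energy_mev = (mass_before - mass_after) * U_TO_MEV
print(f"{energy_mev:.0f} MeV released per fission")
```

The answer is on the order of 170 MeV per atom, millions of times more than a chemical reaction releases, which is why a small amount of fuel can power a city.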
Unfortunately, this isn’t a clean process – it produces radioactive waste that is difficult to dispose of safely, and in extreme cases the reaction can spiral out of control, as in the disaster triggered by an earthquake and tsunami at the Fukushima Daiichi nuclear power station in 2011.
Another radioactive process could provide a safer way to generate clean energy: fusion. In contrast to fission, fusion involves joining two atomic nuclei together. This process also releases energy – it’s the same process that powers the Sun and other stars – but fusion requires extremely high temperatures and pressures, which are difficult and expensive to recreate on Earth.
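The same mass-defect arithmetic applies to fusion. The reaction most experimental reactors aim for fuses deuterium and tritium (the heavy isotopes of hydrogen); a short Python sketch, using approximate atomic masses, gives its energy yield:

```python
# Fusion energy from the mass defect, for the reaction
# deuterium + tritium -> helium-4 + neutron.
D    = 2.014102     # deuterium (hydrogen-2), atomic mass in u
T    = 3.016049     # tritium (hydrogen-3)
HE_4 = 4.002602     # helium-4
N    = 1.008665     # neutron
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

energy_mev = ((D + T) - (HE_4 + N)) * U_TO_MEV
print(f"{energy_mev:.1f} MeV released per fusion")  # about 17.6 MeV
```

Around 17.6 MeV per reaction – less than a single fission event, but far more per kilogram of fuel, since the reacting nuclei are so much lighter.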
A long road ahead
Becquerel died in 1908, just 12 years after his discovery, at age 55, his skin marked by burns and scars likely caused by handling radioactive materials; Marie Curie died several decades later of aplastic anemia, a blood disorder almost certainly brought on by her long exposure to radiation. Radiation was probably slowly killing Pierre Curie too, although it’s difficult to know, as he was fatally run down by a carriage in 1906.
Today our greater understanding of radioactivity allows us to use it much more safely. Accidents with radioactive materials have decreased in frequency and produce fewer fatalities due to stringent safety measures and thorough emergency responses. In the most recent nuclear disaster at Fukushima, no deaths resulted from radiation exposure – but there’s still a long way to go before we can safely harness the immense raw power of radioactivity.