Feeling small? It’s on any physicist’s wishlist. With a new Australian innovation, they can get a close-up view of the tiniest things a light microscope has ever revealed.
Have you ever felt an atom? Being made of atoms ourselves, we are always in contact with them, both in our own bodies and in every aspect of the physical world. But we don’t feel them, per se. Even when you lay your palm on the top of a table, you’re not actually feeling atoms – you’re feeling the repulsion of the electrostatic field created by the electrons that whiz around the periphery of every atom. Each atom’s cloud of negatively charged electrons pushes back against the similar clouds of every other atom, preventing them from getting too close together. At that level of detail, the whole world of “hard” surfaces becomes something akin to unthinkable numbers of tiny same-pole magnets trying to jam themselves together. They can get close – but not too close.
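That push-back is captured, in textbook form, by the Lennard-Jones potential: a steep repulsive wall once electron clouds begin to overlap, and a gentle attraction at slightly larger separations. As a loose illustration – the formula and its dimensionless parameters here are the standard textbook version, not values measured for any particular atom:

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones potential: steep repulsion below sigma, weak attraction beyond.

    r       -- separation between two atoms (in units of sigma)
    epsilon -- depth of the attractive well
    sigma   -- separation at which the potential crosses zero
    """
    sr6 = (sigma / r) ** 6
    return 4 * epsilon * (sr6 ** 2 - sr6)

# Squeezing two atoms just a little closer than their comfortable
# separation sends the energy cost soaring:
for r in (1.5, 1.0, 0.9, 0.8):
    print(f"r = {r:>4}: U = {lennard_jones(r):10.3f}")
```

The repulsive term grows as the twelfth power of closeness, which is why atoms can get close – but never too close.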
The physics of the “untouchable” atom opened the door to the first real attempts to “feel” matter at the atomic scale. In 1981, Gerd Binnig and Heinrich Rohrer, researchers working for IBM Zürich, developed the “scanning tunnelling microscope” (STM). Built upon one of the basic effects of quantum mechanics, the STM places what is, in essence, the very sharp tip of a pin extremely close to the material being examined. When a small voltage is applied, electrons “tunnel” across the minuscule gap between the probe tip and the material’s surface. The strength of that tunnelling current – exquisitely sensitive to the distance between tip and surface – lets you build up an atom-by-atom map of the material. Although atoms can never truly touch, Binnig and Rohrer harnessed quantum tunnelling to let them ever-so-gently graze one another – research that won them the 1986 Nobel Prize in Physics.
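Why can an STM resolve single atoms? Because the tunnelling current falls off exponentially with distance, roughly as I ∝ e^(−2κd), where the decay constant κ depends on the material’s work function. A back-of-the-envelope sketch – the 4 eV work function below is a typical textbook figure for a metal, not one from this article:

```python
import math

HBAR = 1.054_571_8e-34  # reduced Planck constant, J·s
M_E = 9.109_383_7e-31   # electron mass, kg
EV = 1.602_176_6e-19    # one electronvolt, J

def tunnelling_ratio(extra_gap_m, work_function_ev=4.0):
    """Factor by which the tunnelling current falls when the
    tip-sample gap grows by extra_gap_m metres."""
    # Decay constant for an electron tunnelling through a barrier
    # of the given height (in 1/m)
    kappa = math.sqrt(2 * M_E * work_function_ev * EV) / HBAR
    return math.exp(-2 * kappa * extra_gap_m)

# Pulling the tip back by a single ångström (1e-10 m):
print(tunnelling_ratio(1e-10))
```

Because one extra ångström of gap cuts the current by nearly an order of magnitude, the tip effectively “feels” bumps the height of a single atom.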
In 1985, Binnig went on to create the first real improvement on the STM – the “atomic force microscope”, or AFM, which mounts its probe tip on a microscopic cantilever, a sort of atomic-scale diving board. As the tip of the AFM scans back-and-forth across an area of material, the cantilever flexes in response to the forces between tip and surface, tracing its contours at the atomic scale. This tip – just a few millionths of a metre in length – could both “read” the material beneath it, and (with the appropriate electrical bias) even be used to push that material around, gently nudging individual atoms into new positions. To demonstrate this newfound capability, in 1989 IBM researchers used a scanning tunnelling microscope to arrange 35 xenon atoms into IBM’s logo, releasing a famous photo of the result. This was no easy feat – thermal jostling makes it terrifically easy for atoms to “wander” away from the positions they’ve been coaxed into, which is why the work was done at just a few degrees above absolute zero.
Atomic force microscopy made it possible to both “read” and “write” atoms, but it took a very clever graduate student at the University of North Carolina, US, to work out how to touch them. Russell M. Taylor fed the information generated by an atomic force microscope into a multi-million-dollar graphics supercomputer (which, given this was back in 1993, was almost certainly less powerful than your average smartphone), using that data to generate a three-dimensional “contour” of the material under the probe tip. Although images generated from AFM scans had given a rough picture of the “shape” of atoms, Taylor’s visualisations offered a sense of depth, placement and orientation – not just a single atom, but this atom in relation to that atom, revealing the structures of chemically interlinked atoms (molecules). Projected onto a surface the size of a table, and viewed with special 3D glasses, these atoms and molecules looked as real as apples and oranges.
Taylor added one final touch to his research device: a haptic interface that could deliver a faux sense of “touch” to the objects displayed within its tabletop virtual world. You could run your hand (virtually) across the surface of atoms, even push them around and feel them snap back into place. This Nanomanipulator, as Taylor christened it, became one of the landmark works of the first age of virtual reality. When Taylor shared his work with research chemists, they were amazed to find they could “feel” their way across chemical bonds and molecular structures that had always been theoretical abstractions, discovering things they could never have known about these substances, because their sense of touch revealed details no-one had ever thought to intuit. By involving multiple senses, the Nanomanipulator made the atomic scale tangible, and gave chemists an incredible tool for thinking about their work.
But the Nanomanipulator was big, expensive and delicate. STMs and AFMs require a degree of precision and support that puts them among the rarest bits of laboratory kit – and even if you could get access to one, you’d still need a million-plus dollars of supercomputer to turn it into a Nanomanipulator. Taylor had crafted a breakthrough, but a one-of-a-kind tool. Even preparing a sample for a scan required considerable work: the highest-resolution STM and AFM imaging demands samples sealed in a vacuum chamber – which immediately rules out the atomic-scale observation of anything even remotely alive. With the exception of tardigrades, vacuums and life don’t mix.
An accidental discovery in a laboratory at the University of Melbourne by researcher Christopher Bolton opened a gentler window onto the nanoscale. In his work with lasers, Bolton saw something he’d neither seen nor heard of before: illuminating something microscopically small from multiple angles produced multiple views of the same object, and a fairly simple bit of maths could sum those images together into a single, sharper view of that Very Small Thing.
How small? Optical microscopes hit a physical limit at around half a micron (a micron is a millionth of a metre), because an object that small approaches the wavelength of visible light itself – the light simply diffracts around it rather than resolving it. Bolton found that his shooting-the-subject-from-all-angles-with-light-beams method allowed him to image objects just one-20th of that size – a mere 25 nanometres (billionths of a metre).
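The half-micron figure follows from the diffraction limit of conventional optics, usually written as the Abbe formula d = λ / (2·NA), where λ is the light’s wavelength and NA the objective lens’s numerical aperture. A quick sketch – the wavelength and aperture values below are illustrative defaults, not figures from Bolton’s work:

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture=1.0):
    """Abbe diffraction limit: smallest feature a conventional
    light microscope can resolve, in nanometres."""
    return wavelength_nm / (2 * numerical_aperture)

# Green light (~550 nm) through a good air objective (NA ~ 0.95):
print(abbe_limit_nm(550, 0.95))  # roughly 290 nm
```

Even oil-immersion objectives (NA up to about 1.4) only push the limit modestly lower – nowhere near 25 nanometres, hence the need for tricks like multi-angle illumination.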
Better yet, this technique worked with pretty much any sample you wanted to throw on a microscope slide – no vacuum necessary. “We put a living bacterium on a slide,” Bolton reported, “and watched it as it struggled. When it died, it spilled its nanoscale guts onto the slide – and we could see those too!” These were the sorts of events that biologists had theorised about but had never been able to watch happening. Bolton’s discovery – which he’s turned into the startup Tiny Bright Things with research advisor Ray Dagastine – looks as though it could give both medicine and biology the eyes they need to see bacteria, viruses, and the deep but poorly understood interactions between our bodies and our environment.
Four hundred years ago, the first microscopes gave us a window onto a world we had never even imagined. These latest microscopes open a new vista onto a world we understand in theory, but have never visited in practice. How much more will we learn when we see the dance of nanoscopic living beings? And how long until some enterprising graduate student slaps a haptic interface onto this new microscope, so we can touch the surface of a virus, feel its spike proteins, and perhaps learn better how to defend ourselves from it?
Mark Pesce invented the technology for 3D on the Web, has written seven books, was for seven years a judge on the ABC's "The New Inventors", founded postgraduate programs at USC and AFTRS, holds an honorary appointment at Sydney University, is a multiple-award-winning columnist for The Register, pens another column for IEEE Spectrum, and is a professional futurist and public speaker. Pesce hosts both the award-winning "The Next Billion Seconds" and "This Week in Startups Australia" podcasts.