AI learns Nobel prize-winning quantum experiment
Physicists using artificial intelligence to run a complex experiment could be putting themselves out of a job. Cathal O'Connell reports.
A team of Australian physicists has employed a new research assistant in the form of an artificial intelligence (AI) algorithm to help set up experiments in quantum mechanics.
For its first task, the algorithm took control of a delicate experiment to create a Bose-Einstein condensate – a weird state of matter that can form in certain atoms at ultracold temperatures.
The algorithm didn’t need specific training and was able to learn on the job, developing its own model of the process and tweaking the parameters to get them just right.
“I didn’t expect the machine could learn to do the experiment itself, from scratch, in under an hour,” said co-lead researcher Paul Wigley from the Australian National University in Canberra.
The work is the latest example of scientists turning to AI as a collaborator in research.
Jürgen Schmidhuber, a German computer scientist whose algorithms are central to Google's speech recognition, aims to build an optimal AI scientist and then retire, leaving the AI to replace him.
Meanwhile, at the University of Vienna, physicists are using a computer program to devise new quantum experiments they could not have thought of themselves. They see it as a way to get past the non-intuitive nature of quantum mechanics.
And now Australian physicists are letting a machine control their instruments to replicate an experiment that won the 2001 Nobel prize.
A Bose-Einstein condensate is like atomic "groupthink" – a bunch of atoms behaving as if they were a single atom. This happens thanks to quantum effects in certain elements, but only when they are incredibly cold – typically less than a billionth of a degree above absolute zero (-273.15 °C).
Getting down to that temperature is a finicky business involving trapping the atoms between two laser beams.
The Australian team developed their AI algorithm to control the lasers during cooling, and it achieved condensation ten times faster than a regular, non-AI program.
“This is the first application of AI like this, where it's controlling an experiment and optimising it on its own,” says Michael Hush, a physicist at the University of New South Wales in Sydney who co-led the work.
In the experiment, the physicists trapped about 40 million rubidium atoms at the intersection of two laser beams. They then used magnetic fields to cool the atoms down to about five millionths of a degree above absolute zero. Pretty cold, but still too warm to condense.
For the final and most delicate cooling stage, the AI was put in the driver's seat.
It carefully tuned the power of the two lasers to allow the most energetic atoms to escape, but without losing hold of the coldest ones – and did it with surprising success.
“It did things a person wouldn’t guess, such as changing one laser’s power up and down, and compensating with another,” Wigley says.
The AI could be used for any experiment that involves optimising parameters, such as achieving perfect focus in high-resolution microscopy.
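The closed-loop idea described above – propose settings, run the experiment, measure the result, and keep whatever improves it – can be sketched in a few lines. This is a minimal illustration only, not the team's actual algorithm: the `run_experiment` function, its quadratic shape, and its optimum at (0.3, 0.7) are invented stand-ins for a real measurement of condensate quality.

```python
import random

# Toy stand-in for the real experiment: returns a noisy "condensate quality"
# score for a given pair of final laser powers. The shape and the optimum
# at (0.3, 0.7) are invented for illustration.
def run_experiment(p1, p2):
    noise = random.gauss(0, 0.01)  # shot-to-shot measurement noise
    return -((p1 - 0.3) ** 2 + (p2 - 0.7) ** 2) + noise

# Simple closed-loop optimiser: propose settings near the best found so far,
# run the experiment, and keep the trial only if the score improves.
def optimise(n_runs=200, step=0.05):
    best = (random.random(), random.random())
    best_score = run_experiment(*best)
    for _ in range(n_runs):
        trial = tuple(min(1.0, max(0.0, b + random.gauss(0, step)))
                      for b in best)
        score = run_experiment(*trial)
        if score > best_score:
            best, best_score = trial, score
    return best, best_score

random.seed(0)
settings, score = optimise()
print(settings, score)
```

After a couple of hundred simulated runs, the loop homes in near the (invented) optimal settings – the same trial-and-improve structure that lets an online learner tune an experiment without any prior model of the physics.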
“The exciting thing about the AI is that it requires no prior knowledge of the system, making it quite general,” Wigley says.
The paper is published in the journal Scientific Reports today.
And no, the AI is not included on the list of authors.