Next-gen quantum sensors “one hundred million times more sensitive”
Sydney researchers develop new protocols to revolutionise sensor accuracy. Andrew Masterson reports.
Quantum sensors up to one hundred million times more sensitive than existing models are now possible, thanks to research led by scientists at the University of Sydney in Australia.
Quantum sensors themselves have been around for at least a decade, and use properties such as entanglement to achieve measurements beyond the reach of classical systems. They are used in devices such as atomic clocks and magnetometers.
According to an overview published this year in the journal Reviews of Modern Physics, the sensors are the focus of much optimism, and “the field is expected to provide new opportunities — especially with regard to high sensitivity and precision — in applied physics and other areas of science.”
One of the drawbacks of quantum sensors to date is, ironically enough, an issue they share with their larger, clunkier classical analogues: distinguishing signal from noise.
Very few sensors operate in pristine conditions wherein the signal they are calibrated to detect is the only one likely to be produced. The world is messier than that, and most of the time the devices operate in environments in which the desired signal has to be sifted out from within a wide range of competing fuzz.
Now, however, a team led by Michael Biercuk from the ARC Centre of Excellence for Engineered Quantum Systems, based at the university, has developed new protocols that dramatically increase the sensitivity and discrimination of quantum systems.
“By applying the right quantum controls to a qubit-based sensor, we can adjust its response in a way that guarantees the best possible exclusion of the background clutter – that is, the other voices in the room,” he says.
In quantum systems, the intrusion of unwanted signals into a sensor’s area of operation is called “spectral leakage”, and it has plagued the field since its inception. The inability to dial down unwanted noise and concentrate only on desired signals has meant results from quantum sensors have often been unclear.
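Spectral leakage has a well-known classical counterpart in signal processing, which gives a feel for what the new protocols achieve. The sketch below is an illustration of that classical concept only, not the team's quantum method: a tone whose frequency falls between the bins of a discrete Fourier transform smears energy across the whole spectrum, and shaping the measurement with a window function (here, a Hann window, chosen for illustration) suppresses the smearing, much as the Sydney team shapes a qubit sensor's response with quantum controls.

```python
import numpy as np

# A classical analogue of spectral leakage (illustration only, not the
# authors' quantum protocol): sample a sine tone that completes 10.5
# cycles in the record, so its frequency sits between DFT bins.
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * 10.5 * t / n)  # off-bin tone -> leakage

# Unshaped measurement: energy leaks far from the tone's frequency.
rect = np.abs(np.fft.rfft(signal))

# Shaped measurement: a Hann window tapers the record's edges, which
# drives the leakage floor down by orders of magnitude.
hann = np.abs(np.fft.rfft(signal * np.hanning(n)))

# Compare spectral energy well away from the tone (bins 30 and up).
rect_leak = rect[30:].sum()
hann_leak = hann[30:].sum()
print(f"leakage suppression from windowing: {rect_leak / hann_leak:.0f}x")
```

Running this shows the windowed spectrum carrying far less out-of-band energy than the unshaped one, which is the same distinguish-the-voice-from-the-room problem the quantum protocols address.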
To tackle the problem, Biercuk’s team conducted experiments using trapped atomic ions, and successfully reduced spectral leakage by many orders of magnitude.
“Our approach is relevant to nearly any quantum sensing application and can also be applied to quantum computing, as it provides a way to help identify sources of hardware error,” he explains.
“This is a major advance in how we operate quantum sensors.”
The new protocols, described in the journal Nature Communications, have potential applications in a wide range of fields, from medical imaging to defence.