Data is being created at breakneck speed, and researchers have their hands full trying to figure out what to do with it.
Though many think of data as a virtual entity, it must be physically stored in data centres around the world. Currently, the total amount of data worldwide is on the order of 10 zettabytes, where one zettabyte equals a trillion gigabytes. That’s a number too large for the human brain to truly comprehend – and it’s estimated to double every couple of years.
Physically storing data is an energy-intensive process: information and communications technology chews through 8% of global electricity usage. This is only set to increase as our demand for processing power grows – every tweet, Instagram post and Facebook update needs to be stored somewhere.
We need to find more energy-efficient ways to store data – and quickly.
“Data storage is critical to both computing and long-term storage of our digitised data – which, these days, encompass almost every single aspect of our lives,” says Francesca Iacopi, a materials and nanoelectronics engineer at the University of Technology Sydney (UTS).
“Faster, denser memory, with high endurance – number of times it can be rewritten – is necessary for both purposes, and memory with longer and more reliable retention is key to long-term data storage.”
This was the focus of a recent paper, which looked at the future of data storage technology in search of one that ticks all the boxes. The comprehensive review, led by Australian researchers at the ARC Centre of Excellence in Future Low-Energy Electronics Technologies (FLEET), specifically delved into a promising candidate called multi-state memory.
According to Qiang Cao from the University of Wollongong, lead author of the paper, “fast data volume growth in [this] big data era has dramatically enhanced the demand for memory devices with energy-efficient design, high storage density, fast access, and low cost”.
The next generation of memory storage technology must also be non-volatile, meaning it doesn’t require power to retrieve stored information.
“Contemporary computing systems utilise two-level memory configurations composed of working and storage devices to obtain an optimal trade-off between performance and capacity,” explains Cao.
“The working memories, like SRAM and DRAM, are volatile and unfavourable for smaller and/or more energy-conscious applications. The storage class memory, like flash, is facing the bottleneck of scalability and reliability.”
He adds: “The need for new data storage solutions is bigger than ever – and growing.”
The race to create this ideal technology has no clear winner yet, but one frontrunner is non-volatile multi-state memory, which has been dubbed a “beyond binary” technology.
Traditionally, data is stored in binary bits – 0s or 1s – with each memory cell holding a single bit. Multi-state memory can instead hold more than two distinct states per cell, so each cell carries more than one bit of data, giving a much higher storage density than current technology without scaling down the memory cell.
“For example, we can encode 256 (2⁸) characters or numbers in a sequence of 8 bits,” Cao explains. “If the same sequence of digits has five states per cell, then up to 390,625 (5⁸) characters can be stored, which is over 1500 times more than binary memory.”
To maintain read accuracy, there is a limit to how many states can be packed into each cell, but multi-state memory is still a cut above traditional data storage.
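Cao’s arithmetic is easy to verify. The short sketch below is a hypothetical illustration (not code from the study) that computes how many distinct values a fixed row of memory cells can represent, and the information capacity per cell, as the number of states per cell grows:

```python
import math

def representable_values(cells: int, states_per_cell: int) -> int:
    """Number of distinct values a row of memory cells can encode."""
    return states_per_cell ** cells

def bits_per_cell(states_per_cell: int) -> float:
    """Information capacity of a single cell, in bits."""
    return math.log2(states_per_cell)

# Cao's example: a row of 8 cells, binary vs. five states per cell.
binary = representable_values(8, 2)        # 2^8 = 256
five_state = representable_values(8, 5)    # 5^8 = 390,625

print(f"binary:     {binary:>7} values ({bits_per_cell(2):.2f} bits/cell)")
print(f"five-state: {five_state:>7} values ({bits_per_cell(5):.2f} bits/cell)")
print(f"gain:       {five_state / binary:.0f}x")  # ~1526x, Cao's "over 1500 times"
```

Each five-state cell holds log₂5 ≈ 2.32 bits rather than 1 – the “over 1500 times” figure comes from compounding that per-cell advantage across all eight cells.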
The “beyond binary” capability allows multi-state memory to conveniently sidestep the limitations of Moore’s Law – the 1965 prediction that the number of transistors on a chip would double roughly every two years. As transistors shrink, twice as many can be crammed onto a chip, improving both processing power and memory capacity.
Moore’s Law is an observation of a trend rather than a physical law, but it has become a self-fulfilling prophecy, powering over 50 years of advances and innovations. Memory chips today store several billion times more data than they did in 1965.
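That compounding is easy to check. As a back-of-envelope illustration (assuming a 1965–2020 span, which the article doesn’t state exactly), here is how capacity growth works out under two commonly quoted doubling periods:

```python
# Back-of-envelope check of Moore's-Law compounding (illustrative only).
YEARS = 2020 - 1965  # roughly the span since Moore's prediction

for doubling_period in (2.0, 1.5):  # years per doubling
    growth = 2 ** (YEARS / doubling_period)
    print(f"doubling every {doubling_period} years -> {growth:.1e}x capacity")
```

A two-year doubling gives a factor of about 2 × 10⁸, while the eighteen-month pace often quoted gives over 10¹¹ – comfortably bracketing the “several billion times” figure above.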
But recently we have been seeing its inevitable plateau. Eventually, the laws of physics will prohibit transistors from shrinking any further, and in the meantime the costs of research and fabrication are becoming astronomical.
“Transistor sizes are now approaching the point where quantum effects become appreciable, which is a fundamental barrier for CMOS technology,” says Cao. “Take flash, for example – the cell shrinking will render the dielectric layer thinner, raising the risk of charge leakage.
“To enhance the data density and reduce the bit-cost further, the inevitable way is to use multi-state technology.”
However, UTS’s Iacopi – who was not involved in the study – notes that while we have seen many advances in non-volatile memory such as multi-state devices, the technology still requires a sizeable amount of power to operate.
“Power consumption is also an issue, as well as latency, endurance and long-term retention,” she explains.
But if these hurdles can be overcome, multi-state memory has intriguing uses.
Cao says that one irresistible attraction in this line of research is its application to neuromorphic computing.
“Neuromorphic computing takes inspiration from the way that the brain computes to increase the energy efficiency and computational power of our data processing systems,” he explains.
“In contrast to current computers with the von Neumann architecture, our brain has impressive computing abilities at very low power consumption levels, since the brain is highly parallel, interconnected, and enabled with in-situ synaptic memory storage.”
Iacopi further explains that in neuromorphic computing, “logic and memory functions happen within the same integrated circuit or chip – basically the same devices perform both logic and memory operations. This is the ultimate architecture for fast AI computing.”
In many ways, multi-state memory devices can emulate the attributes of the human brain – including plasticity, which is fundamental for learning and adapting to a changing environment.
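As a rough intuition for that emulation, consider a toy sketch (a hypothetical illustration under assumed parameters, not a device model from the review) of a Hebbian-style synapse whose weight is stored in a five-state memory cell: each plasticity update nudges the weight, and the cell retains the nearest discrete level.

```python
import numpy as np

N_STATES = 5                              # states per cell, echoing Cao's example
LEVELS = np.linspace(0.0, 1.0, N_STATES)  # the weight values the cell can hold

def quantise(w: float) -> float:
    """Snap a weight to the nearest storable state of the cell."""
    return float(LEVELS[np.abs(LEVELS - w).argmin()])

def hebbian_update(w: float, pre: float, post: float, lr: float = 0.2) -> float:
    """One plasticity step: strengthen the synapse when pre- and
    post-synaptic activity coincide, then store the result in the cell."""
    return quantise(np.clip(w + lr * pre * post, 0.0, 1.0))

w = quantise(0.5)
for pre, post in [(1, 1), (1, 1), (1, 0), (1, 1)]:
    w = hebbian_update(w, pre, post)
    print(f"pre={pre} post={post} -> stored weight {w:.2f}")
```

The more states a cell offers, the finer the weight resolution – one reason multi-state devices map so naturally onto synaptic weights.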
Cao believes that multi-state memory storage is therefore likely the most promising contender to use in neuromorphic circuits.
This technology would vastly accelerate the progress of artificial intelligence. Many of the key attributes of AI that scientists are striving to develop – such as the ability to recognise speech, faces and objects – require an enormous amount of computing power and time, which are limited by the architecture of current systems. To overcome this, researchers are developing materials, devices and systems that mimic the way humans process the world – and neuromorphic computing could be an integral part of this.
Despite the remarkable progress of the field, much work is still to be done – not only on device-specific technologies, but also to experimentally demonstrate a full neuromorphic network using multi-state devices, though several recent studies have attempted to emulate facets of its functionality and interconnectivity.
“A lack of detail about how information is processed within the human brain is also a strong barrier toward achieving a brain-like neuromorphic computer,” Cao adds.
“Hence, this field is wide open for more experiments and additional ideas. Further progress will require a broad and interdisciplinary approach.”