It’s now possible to read a person’s brain activity and “decode” it into speech, researchers have revealed, in what appears to be something akin to “mind reading.”
The discovery is initially focused on helping people with disabilities to communicate.
“We were kind of shocked that this works as well as it does,” says team leader Alexander Huth, a neuroscientist at the University of Texas at Austin, who announced the development at a news conference in the US; the findings are published in Nature Neuroscience.
The researchers say previous speech decoders have relied on neural activity recorded via electrodes implanted during invasive neurosurgery, which limits their use. Other decoders that have used non-invasive brain activity recordings were limited to decoding single words or short phrases.
The new process begins by having people listen to stories while their brain activity is monitored by a method known as “functional MRI” (fMRI), which measures tiny changes in blood flow as the brain responds to what it’s hearing.
Calibrating this, however, is not a quick process: it took a total of 16 hours per person. To make it more palatable, the researchers used podcasts, ranging from a storytelling radio show called The Moth Radio Hour to TED Talks. “We all like to listen to podcasts,” Huth notes.
From that, says Jerry Tang, the graduate student who did much of the work, the team built models to determine how each person’s brain responded to a wide range of word sequences. They then reversed the process, letting their research subjects listen to stories they’d not heard before and seeing if they could reconstruct, or “decode,” them from their resulting brain activity.
The findings were impressive, but not literal. For instance, Tang says, when one research subject heard a story in which the narrator said, “I don’t have my driver’s license yet,” the decoder interpreted it as, “She is not ready; she has not even started to learn to drive yet.”
Another story involved a lovers’ spat in which the narrator said, “I didn’t know whether to scream, cry, or run away. Instead, I said, ‘Leave me alone; I don’t need your help.’ Adam disappeared, and I cleaned up alone, crying.”
That was reconstructed as: “[I] started to scream and cry, and then she just said, ‘I told you to leave me alone, you can’t hurt me anymore. I’m sorry.’ And then he stormed off. I thought he had left [and] I started to cry.” That’s definitely a bit garbled, but aside from some mixed-up pronouns that the decoder ultimately corrected, it’s not too far off the mark.
Huth thinks this means the scanner isn’t just recording how our hearing turns sound into words, but is capturing a higher level of processing involving how we interpret them. Supporting that is the fact that when his subjects watched short animated films with no dialog, the decoder reconstructed a decent approximation of the films’ story lines from their brain activity.
Not that Huth’s team has found a way to read your secret thoughts. Their process requires hours and hours of individualised calibration, and an algorithm trained on one person’s brain won’t work on another’s.
That, Tang thinks, is a good thing. “Mental privacy is important,” he says.
Meanwhile, the focus is on creating a way for people with real-world disabilities to have better lives. “Eventually,” Tang says, “we hope that this technology can help people who have lost the ability to speak due to injuries like strokes, or diseases like ALS [a degenerative disorder in which people’s control over muscles, including those involved in speech, gradually erodes].”
But, he adds, it’s time to start thinking now about how this technology can and cannot be used, in case it does someday progress to the point where it’s possible to read minds without permission. “[Then] we’ll have a regulatory foundation in place,” he says.