Moving a robot arm with your mind

Erik Sorto, a 34-year-old quadriplegic in the US, can control a robotic arm with his mind: when he imagines making certain movements the arm obeys him. “I was surprised at how easy it was,” Sorto says. “I remember just having this out-of-body experience, and I wanted to just run around and high-five everybody.”

A bullet hit Sorto’s spine 12 years ago. In 2013 he was selected as the first patient to receive an implant in a region of the brain called the posterior parietal cortex (PPC). Caltech researcher Richard Andersen and his team reported on the implant in Science. They showed the PPC produces signals that control planned movements such as scratching your nose or lifting a glass – and that these signals could be used to control a bionic arm.

“It was a big surprise that the patient was able to control the limb on day one – the very first day he tried,” says Andersen. “This attests to how intuitive the control is when using PPC activity.”

Andersen and his team long suspected that the PPC played a high-level role in planning movement. Human brain scans had shown the region lighting up when we imagine moving a limb – but was the PPC encoding intention to move, or attention to the movement? And the idea was impossible to prove definitively in animal studies. How do you ask a monkey to imagine moving its arm?

Their latest study provides the most direct evidence yet. Sorto’s device bristled with 96 electrodes that could read the firing of individual PPC neurons. Sixteen days after he received the implant, Andersen and his team asked Sorto to imagine moving his arms to perform certain tasks and recorded the activity of the neurons the electrodes could reach. They found that certain imagined movements consistently caused specific neurons to fire. For example, one neuron would fire when Sorto imagined touching his mouth – but not when he imagined touching his ear or chin. Once the robotic arm’s control system was trained to recognise particular firing patterns, Sorto could trigger the corresponding arm motions simply by imagining them – from performing a smooth handshake gesture to taking a sip of a drink.

The results contrast with earlier attempts to give patients control over a robotic arm via an implant in their motor cortex. The motor cortex receives signals from the PPC and produces the torrent of instructions that ultimately directs each muscle involved in a task. Researchers’ best efforts at reading these complex instructions have given patients only delayed and jerky control of a prosthetic arm, whereas tapping the PPC produced smoother and more natural movement.

“I think it’s a really promising way to go,” says David Grayden, who researches machine-brain interfaces for medical bionics at the University of Melbourne. Imagine equipping a paralysed person with a motorised exoskeleton, he says: “If you can obtain an intention that someone wants to walk forward, then the exoskeleton can just do that – you don’t have to decode knee movements and ankle movements and stuff like that.” The same approach could work in wheelchairs too, he adds.

Although the implant can communicate with only a handful of the PPC’s neurons, Grayden says his previous research with animals suggests the brain is so flexible and adept at rewiring itself that it should be able to learn to fire those particular neurons for specific movements.

But even though the brain will make the most of the connections it has, the researchers designing these implants have more work to do. Sorto’s implant has allowed him to control larger movements, but “to really do fine dexterous control, you also need feedback from touch”, Andersen says. “Without it, it’s like going to the dentist and having your mouth numbed. It’s very hard to speak without somatosensory feedback.” He and his colleagues are now working on a bionic arm that can provide that feedback, relaying its signals to an additional implant placed in the part of the brain that gives the perception of touch.
