Computer communication for motor neuron disease patients

A team of European researchers has reported a new method to successfully communicate with a patient with advanced motor neuron disease (MND).

What is motor neuron disease and how does it impact communication?

MND, also known as amyotrophic lateral sclerosis (ALS), is a neurodegenerative condition that causes progressive loss of voluntary muscle control.

According to the charity FightMND, MND affects about 2,000 patients in Australia, and is extremely costly to patients and carers.

The symptoms of MND may begin as muscle weakness, cramping or a loss of dexterity. As the disease progresses, patients eventually lose control of muscles throughout their body, including those we use to speak and breathe.

Such patients must rely on artificial ventilation to survive and may use eye movements to communicate. This is known as being in a “locked-in state”.

Patients who have lost even the ability to control their gaze, or whose eyes are involuntarily closed, are said to be in a “completely locked-in state”.

Brain-computer interfaces have previously been shown to help patients in a locked-in state communicate with the people around them using eye movements or neural signals.

However, this study reports the first known example of successful communication with a patient in a completely locked-in state.  

“There are several brain-computer interfaces, both non-invasive and invasive, for communication in a locked-in state, but none so far has demonstrated communication by patients in a completely locked-in state,” explains corresponding author Ujwal Chaudhary.

A schematic illustrating the concept of a brain-computer interface. Credit: Hakule / iStock / Getty Images.

How does the new communication device work?

The device consists of microelectrodes (which are implanted in the patient’s brain), a neural signal processor, and a computer with custom software. It uses a system of auditory neurofeedback to allow the patient to communicate using brain signals.

The device plays a tone to the patient that represents auditory feedback of their own neural activity. The tone is linked to the spike rate of the neural signal. The patient learned to modulate the feedback tone by using a specific neural response to either increase or decrease the frequency of the sound wave.

“The patient listens to a target tone, and their goal is to modulate the frequency of the sound wave following the target using their neural firing rate until the sound they hear informs them that the target has been achieved,” says Chaudhary.
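The feedback loop described here can be pictured as a simple mapping from neural firing rate to tone frequency. The sketch below is purely illustrative: the rate and frequency ranges, and the linear mapping itself, are assumptions for the example, not the study's actual calibration.

```python
# Hypothetical sketch of the auditory-feedback mapping: the measured
# spike rate (spikes per second) is mapped linearly onto the frequency
# of the feedback tone, so the patient "hears" their own neural
# activity. All numeric ranges here are invented for illustration.

def rate_to_tone(firing_rate_hz, rate_min=0.0, rate_max=50.0,
                 tone_min=120.0, tone_max=480.0):
    """Map a spike rate onto a feedback-tone frequency in Hz."""
    # Clamp the rate into the calibrated range.
    rate = max(rate_min, min(rate_max, firing_rate_hz))
    # Linear interpolation between the low and high tones.
    fraction = (rate - rate_min) / (rate_max - rate_min)
    return tone_min + fraction * (tone_max - tone_min)
```

With this kind of mapping, raising the firing rate raises the pitch the patient hears, and the "target achieved" condition amounts to the feedback tone reaching the target tone's frequency.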

Changes in neural firing rate lasting for longer than 250 milliseconds at the high end of a given range were interpreted as a “yes” response, and changes at the low end of the range were interpreted as a “no”.
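This decision rule — a response only counts if the rate stays at one end of the range for longer than 250 milliseconds — can be sketched as a small classifier. The sample period, thresholds, and rate range below are assumptions for illustration; only the 250-millisecond duration comes from the description above.

```python
# Illustrative sketch of the yes/no rule: a "yes" requires the firing
# rate to hold at the high end of the range, and a "no" at the low end,
# for longer than 250 ms. Sample period and thresholds are invented.

def classify_response(rate_samples, sample_period_ms=50,
                      high_threshold=40.0, low_threshold=10.0,
                      min_duration_ms=250):
    """Return 'yes', 'no', or None from a sequence of spike rates."""
    # Number of consecutive samples needed to span more than 250 ms.
    required = min_duration_ms // sample_period_ms + 1
    high_run = low_run = 0
    for rate in rate_samples:
        # Track how long the rate has stayed at either extreme.
        high_run = high_run + 1 if rate >= high_threshold else 0
        low_run = low_run + 1 if rate <= low_threshold else 0
        if high_run >= required:
            return "yes"
        if low_run >= required:
            return "no"
    return None  # no sustained response detected
```

Requiring a sustained change rather than a single threshold crossing filters out brief fluctuations in the neural signal, which is presumably why a minimum duration is used at all.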

“Neurophysiologically, every activity we perform elicits a neural response,” Chaudhary explains.

“We can consistently perform two different tasks to produce two different kinds of neural activity, which can produce two different types of output – in this case, increasing and decreasing the frequency of the sound wave.”

Putting auditory neurofeedback into practice

Following this training, the patient was able to spell out words and sentences using a system that divides the alphabet into groups of letters and plays each option to the patient as an auditory signal, followed by a sound wave the patient can modulate. The patient hears the signal for a group or letter, responds “yes” or “no” using the auditory neurofeedback system, and moves on to the next option until the word has been spelled out.
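The spelling procedure described above can be sketched as a two-stage yes/no search: first over letter groups, then over the letters within the confirmed group. The grouping scheme and the `answer` callback below are hypothetical stand-ins for the study's actual protocol and the patient's neurofeedback response.

```python
# Minimal, hypothetical sketch of the auditory speller: the system
# presents letter groups, then individual letters, and the patient's
# yes/no answers (here a callback) narrow the search to one letter.
# The grouping is invented for illustration.

GROUPS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ"]

def select_letter(answer):
    """Pick one letter by presenting options one at a time.

    `answer(prompt)` stands in for the patient's yes/no response to an
    auditory prompt and must return True ("yes") or False ("no").
    """
    for group in GROUPS:
        if answer(f"Group {group}?"):            # group read aloud
            for letter in group:
                if answer(f"Letter {letter}?"):  # letter read aloud
                    return letter
    return None  # nothing confirmed; the pass would be repeated
```

Words are then built one confirmed letter at a time, which is slow but requires only the single yes/no channel the neurofeedback system provides.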

Using this system, the patient was able to express his needs and desires, such as asking for his head position to be changed or to listen to a particular type of music or watch a movie with his son.

The patient eventually lost the ability to voluntarily open his eyes, but was still able to use the device for successful communication.

Before this, scientists weren’t even sure if people in a completely locked-in state were still able to access the neural mechanisms used for communication.

“We proved that communication is possible even in a completely locked-in state,” says Chaudhary.

The study could open new doors for communication and improve the quality of life of patients in a completely locked-in state.

The researchers have set up a not-for-profit organisation called ALS Voice to aid communication for patients in locked-in or completely locked-in states.
