Caltech Researchers Improving Brain-Machine Interfaces with Machine Learning


Brain-machine interfaces (BMIs) have enabled a handful of test participants who are unable to move or speak to communicate simply by thinking. An implanted device picks up the neural signals associated with a particular thought and converts them into control signals that are fed into a computer or a robotic limb. For example, a quadriplegic person is asked to think about moving a cursor on a computer screen. Once the BMI is trained to recognize the neurological activity as this intent, the person's thought is transmitted via the BMI to carry out the action of moving the cursor. BMIs currently in an experimental phase may also consist of robotic limbs that can execute manual tasks as instructed by a disabled person's thoughts alone.

The hardware necessary for this incredible feat is a computer -- either freestanding or within a robotic device -- and an implant in the brain of the person who is using BMI technology to communicate their intent through their thoughts.

At Caltech, researchers use implants that consist of arrays of 100 microelectrodes mounted on a 4×4 mm chip. The microelectrodes are typically 1.5 mm long and penetrate the brain's cortex, where they can record the activity of individual neurons.

Unfortunately, the performance of these microelectrode arrays is not consistent and degrades over time. To overcome this challenge, Caltech's Azita Emami, the Andrew and Peggy Cherng Professor of Electrical Engineering and Medical Engineering and director of the Center for Sensing to Intelligence (S2I), and her colleagues have used machine learning to effectively interpret the neuronal signals picked up by older implants.

"Not only do we observe day-to-day variations, but over time the performance of brain-computer interfaces degrades for a variety of reasons," Emami says. "There may be a small movement of the implant or its electrodes. The electrodes themselves may deteriorate or become encapsulated in brain tissue. Some people think that over time the neurons move away from the implant because they react to it as a foreign object in the brain. For whatever reason, the signals we receive become noisier."

When a BMI is first set up, the microelectrode array produces a signal characterized by strong action potentials that appear like spikes in the recordings. Once this strong signal is no longer detected by the microelectrode array -- that is, when the feedback from the array becomes noisier and neural spikes are no longer clearly detected -- it is a far trickier task to link a pattern of neural activity from more distant neurons to a specific intent that can be successfully transmitted to a computer or other device. Researchers have tried to identify alternative signals, such as so-called threshold crossings or local field potentials recorded from distant neurons. One approach has been to use wavelets that measure small oscillations in neuronal activity. But the success of wavelets and other methods has been limited.
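As a rough illustration of one such alternative signal, threshold crossings can be computed by counting how often a filtered voltage trace dips below a multiple of its estimated noise level. The sketch below uses a common convention from the spike-detection literature (a threshold of several times a robust noise estimate); the specific multiplier and bin size are illustrative assumptions, not values from this study:

```python
import numpy as np

def threshold_crossings(signal, bin_size, thresh_mult=-4.5):
    """Count threshold crossings per time bin.

    The threshold is thresh_mult times a robust estimate of the noise
    standard deviation (median absolute deviation / 0.6745), a common
    convention in the spike-detection literature.
    """
    noise_sd = np.median(np.abs(signal)) / 0.6745
    thresh = thresh_mult * noise_sd
    # A crossing occurs where the trace steps from above to below threshold.
    below = signal < thresh
    crossings = np.flatnonzero(~below[:-1] & below[1:])
    # Bin the crossing counts into fixed-width windows.
    n_bins = len(signal) // bin_size
    counts = np.zeros(n_bins, dtype=int)
    for idx in crossings:
        b = idx // bin_size
        if b < n_bins:
            counts[b] += 1
    return counts
```

Binned counts like these can then be fed to a decoder in place of sorted spike counts, which is roughly what makes them attractive once clean single-neuron spikes are no longer available.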

Now Emami and her colleagues have found that by applying machine learning, BMIs can be trained to interpret data from neural activity even after the signal from an implant has become less clear.

Benyamin Haghi, formerly a graduate student in the Emami lab, explains: "Where before we relied on counting neural spikes, we have now created a neural network that automatically extracts information from the entire neural signal, from all the little dips and peaks and changes in the signal, and converts this into the intent of the patient." Emami adds, "Over time the BMI has been trained on both a signal that is neural activity and a signal that looks like noise, and is therefore able to interpret the user's intent."
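The article does not describe FENet's internal architecture. As a loose sketch of the general idea, though — a learned network that maps a raw voltage window to a small feature vector, replacing hand-counted spikes — one might convolve the signal with learned 1-D filters and pool the responses. Everything below (filter count, sizes, pooling width) is an illustrative assumption, not the published design:

```python
import numpy as np

def extract_features(window, filters, pool=16):
    """Map a raw voltage window to a compact feature vector by
    convolving with 1-D filters and average-pooling the rectified
    responses. In a real system the filters would be learned
    end-to-end with the decoder; here they are placeholders.
    """
    feats = []
    for f in filters:
        resp = np.convolve(window, f, mode="valid")
        resp = np.maximum(resp, 0.0)  # ReLU nonlinearity
        # Average-pool: one summary value per filter per pooling region.
        n = len(resp) // pool
        pooled = resp[: n * pool].reshape(n, pool).mean(axis=1)
        feats.append(pooled)
    return np.concatenate(feats)

# Illustrative use on a synthetic snippet of "raw" voltage data.
rng = np.random.default_rng(0)
filters = [rng.standard_normal(9) * 0.1 for _ in range(4)]
window = rng.standard_normal(256)
features = extract_features(window, filters)
```

The point of such a front end is that the feature vector summarizes the whole waveform — including the "dips and peaks" Haghi describes — rather than only the events that cross a spike threshold, which is what lets a downstream decoder keep working as the recordings grow noisier.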

Emami describes the experience of one participant, JJ, who lost mobility due to a vehicle accident. "When we first started working with him, his implant was three years old, and it had already degraded. We were thinking of removing the implant, but with our new algorithm, he's happy with continuing to use the system he already has. JJ can move a cursor very precisely on a grid, just as he did when the implant was new. He can play video games and control a computer environment that mimics driving."

The team's algorithm is called FENet, for Feature Extraction Network. Remarkably, it can be trained on data from one patient and then used successfully in another. "This means that there is some fundamental type of information in the neural data that we are picking up," Emami says. Not only that, FENet can generalize across different brain regions and types of electrodes and be easily incorporated into existing BMIs.

Richard Andersen, the James G. Boswell Professor of Neuroscience and leadership chair and director of the T&C Chen Brain-Machine Interface Center, says, "FENet has already extended our clinical study with JJ by two years. BMI research is a perfect field for interdisciplinary research, in this case melding the disciplines of engineering, computer science, and neuroscience."

Today's BMIs require a wired connection from the electrode implant to a connector, which links through cables to a microsystem that processes the raw data before sending it to a computer -- the visual interface the patient manipulates. "This is quite a cumbersome system," Emami says. "Our goal now, after the creation of FENet to better interpret brain signals, is to miniaturize the system so that one day it can be a wearable or an implant that communicates wirelessly with the computer."

This research was published in Nature Biomedical Engineering under the title "Enhanced control of a brain-machine interface by tetraplegic participants via neural-network-mediated feature extraction." Co-authors include Andersen; Emami; Haghi; Albert Yan Huang from the Emami lab; members of professional staff Tyson Aflalo and Spencer Kellis, postdoc Jorge A. Gamez de Leon, and graduate student Charles Guan from the Andersen lab; and Nader Pouratian from UCLA Neurosurgery. Funding was provided by the National Institutes of Health, Caltech's S2I, the T&C Chen Brain-Machine Interface Center at Caltech, the Boswell Foundation, the Braun Foundation, and the Heritage Medical Research Institute.
