In a world first, a team of engineers, neuroscientists and neurosurgeons at UC Davis and UC Davis Health has demonstrated that brain-computer interfaces, or BCIs, for translating brain signals into speech can also enable control of a computer cursor.
Their findings, supported by funding from the National Science Foundation and the National Institutes of Health, pave the way for feature-rich BCIs that restore function to people with paralysis, offering a level of autonomy previously thought impossible.
“Future steps in multimodal BCIs could include gesture decoding for all sorts of different things, enriching the types of interactions someone with paralysis can have with their environment beyond speech,” said Tyler Singer-Clark, a biomedical engineering Ph.D. student and first author on the paper.
Singer-Clark is a member of the UC Davis Neuroprosthetics Lab, which is co-directed by neuroscientist Sergey Stavisky and neurosurgeon David Brandman. Under Stavisky and Brandman, the lab previously created the most accurate BCI for speech ever recorded, an innovation that Singer-Clark’s project builds upon.
Advancing the standard
The lab’s BCI, implanted in the speech motor cortex, interprets the electrical activity generated by thinking and decodes it into recognizable words displayed on a computer screen.
The team found something interesting about their speech BCI: The part of the brain it was analyzing for speech showed signs that it could support control of a cursor, a motor ability typically associated with a different brain area.
Following the team’s observation, Singer-Clark began coding cursor control software for the speech BCI.
While no one had ever coded such a system for cursor control driven by the speech motor cortex, Singer-Clark drew on decades of prior research on cursor control driven by other parts of the brain. He was also able to base his architecture on the existing code used in the lab’s speech BCI.
“We didn't have to reinvent the pre-processing of the neural data,” he said. “For cursor control, it's actually the same pre-processing steps the speech BCI uses to get the neural features that are going to be useful for decoding the intention of the participant.”
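The shared pre-processing Singer-Clark describes can be pictured as a pipeline that turns raw electrode voltages into normalized features a decoder can use. The sketch below is purely illustrative: the function name, thresholding scheme, and parameter values are assumptions, not the lab's actual (unpublished) implementation. It shows one common approach, counting threshold crossings per electrode in fixed time bins, then z-scoring each channel:

```python
import numpy as np

def preprocess(voltage, fs=30_000, bin_ms=20, thresh_sd=-4.5):
    """Illustrative pre-processing: threshold-crossing counts binned
    into fixed windows, then z-scored per channel. Parameters here
    (sampling rate, bin size, threshold) are hypothetical examples.

    voltage: array of shape (channels, samples)
    returns: array of shape (channels, bins) of z-scored counts
    """
    # A negative threshold per channel, set relative to its noise level.
    thresh = thresh_sd * voltage.std(axis=1, keepdims=True)
    # A "crossing" is a sample that dips below threshold from above.
    crossings = (voltage[:, 1:] < thresh) & (voltage[:, :-1] >= thresh)
    # Count crossings in non-overlapping bins of bin_ms milliseconds.
    samples_per_bin = int(fs * bin_ms / 1000)
    n_bins = crossings.shape[1] // samples_per_bin
    counts = crossings[:, :n_bins * samples_per_bin].reshape(
        crossings.shape[0], n_bins, samples_per_bin).sum(axis=2)
    # Z-score each channel so different decoders see comparable features.
    mu = counts.mean(axis=1, keepdims=True)
    sd = counts.std(axis=1, keepdims=True)
    return (counts - mu) / np.maximum(sd, 1e-6)
```

The point of the quote is that this front end is decoder-agnostic: the same normalized feature stream can feed either a speech decoder or a cursor decoder.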
Mapping functionality
With the cursor control software complete, Singer-Clark’s last step was to individualize the decoding architecture.
He worked with the same man who had participated in the original speech BCI research as part of the BrainGate2 clinical trial. Due to amyotrophic lateral sclerosis (also known as ALS or Lou Gehrig’s disease), a disease of the nervous system that leads to gradual loss of movement, the man is unable to move.
Singer-Clark had the man watch a cursor moving around a screen and selecting targets. As the man thought about and observed the movement of the cursor, Singer-Clark watched the man’s neurons firing. He was able to map this data, such as the specific neurons that activate when the man wants to move left, to the cursor control software.
Once the software was enabled on his implanted BCI, it took the man less than 40 seconds to adapt to the cursor control functionality. By thinking about moving the cursor, he could move it on his computer screen, click on apps and open links.
Singer-Clark clarified, however, that the BCI is not translating signals of abstract thoughts into movement. It’s more like intuition.
“That's his word, intuition,” Singer-Clark said. “I'll say, ‘What motor imagery are you using?’ And he says, ‘Intuition.’”
“Singer-Clark’s work is incredibly important for the field,” said Brandman, one of the co-directors of the UC Davis Neuroprosthetics Lab. “His work has not only empowered our BrainGate2 participant to use a computer cursor with his thoughts but has also led the way for multiple companies in this space to design their clinical trials.”
An autonomous future
For Singer-Clark, the impact of this project is manifold.
On the one hand, this project is proof that complex, multimodal BCIs are feasible.
“It hammers home a growing viewpoint that different body parts and their movements are represented in multiple areas of the motor cortex, as opposed to being all siloed in their own areas,” Singer-Clark said.
On the other hand, there’s a very personal and powerful consequence of this research. There’s a man who has regained autonomy.
“There's a man with ALS who can control his computer independently without someone else helping him for hours and hours every day. It's like this great event, and we might not have tried if we didn't have that prior research encouraging us to do that.”
That’s what BCIs — and the UC Davis engineers, neuroscientists and neurosurgeons advancing the technology — are pushing for: a future where people with paralysis can gain a level of autonomy previously thought impossible.
Media Contact
- Andy Fell, News and Media Relations, 530-304-8888, ahfell@ucdavis.edu