Jack Loughran Fri 15 Aug 2025

Collected at: https://eandt.theiet.org/2025/08/13/brain-computer-interface-decodes-inner-speech-text-people-who-cannot-speak

Scientists have developed a brain-computer interface (BCI) that could help people who are unable to speak to communicate more easily.

A Stanford University team were able to translate inner speech – the silent monologue in people’s heads – with up to 74% accuracy.

“This is the first time we’ve managed to understand what brain activity looks like when you just think about speaking,” said lead author Erin Kunz. “For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally.”

Using sensors implanted in brain regions that control movement, BCI systems can decode movement-related neural signals and translate them into actions, such as moving a prosthetic hand. 

Research has shown that BCIs can even decode attempted speech among people with paralysis. When users physically attempt to speak out loud by engaging the muscles related to making sounds, BCIs can interpret the resulting brain activity and type out what they are attempting to say, even if the speech itself is unintelligible.

The team recorded neural activity from micro-electrodes implanted in the motor cortex – the brain region responsible for speaking – of four participants with severe paralysis from either amyotrophic lateral sclerosis (ALS) or a brainstem stroke. The researchers asked the participants to either attempt to speak or imagine saying a set of words. They found that attempted speech and inner speech activate overlapping regions in the brain and evoke similar patterns of neural activity, but inner speech tends to show a weaker magnitude of activation overall. 

Using the inner speech data, the team trained AI models to interpret imagined words. 
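The article does not detail the models used, but the general idea of decoding imagined words from neural recordings can be illustrated with a minimal sketch. Everything here is assumed for illustration: the synthetic "neural" feature vectors, the 64-channel setup and the nearest-centroid classifier stand in for real micro-electrode data and the team's actual AI models.

```python
import numpy as np

rng = np.random.default_rng(0)
words = ["yes", "no", "water", "help"]
n_channels = 64  # hypothetical number of electrode channels

# Each imagined word gets a characteristic mean activity pattern
# across channels (purely synthetic stand-in for real recordings).
prototypes = rng.normal(size=(len(words), n_channels))

def record_trial(word_idx, scale=0.5):
    """Simulate one imagined-speech trial: a scaled-down signal
    (inner speech is weaker than attempted speech) plus noise."""
    return scale * prototypes[word_idx] + rng.normal(scale=0.4, size=n_channels)

# "Training": estimate a mean pattern (centroid) per word from noisy trials.
centroids = {w: np.mean([record_trial(i) for _ in range(50)], axis=0)
             for i, w in enumerate(words)}

def decode(features):
    """Nearest-centroid classification of a single trial."""
    return min(words, key=lambda w: np.linalg.norm(features - centroids[w]))

# Evaluate on fresh simulated trials.
correct = sum(decode(record_trial(i)) == w
              for i, w in enumerate(words) for _ in range(25))
acc = correct / (len(words) * 25)
```

The real system decodes full sentences from a 125,000-word vocabulary, which requires language models rather than a per-word classifier; this sketch only shows the core pattern-matching step.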

In a proof-of-concept demonstration, the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74%. The BCI was also able to pick up inner speech that participants were never instructed to produce, such as numbers when they were asked to tally pink circles on a screen.

The team also found that while attempted speech and inner speech produce similar patterns of neural activity in the motor cortex, they were different enough to be reliably distinguished from each other. Senior author Frank Willett said researchers can use this distinction to train BCIs to ignore inner speech altogether.

But for users who may want to use inner speech as a method for faster or easier communication, the team also demonstrated a password-controlled mechanism that would prevent the BCI from decoding inner speech unless temporarily unlocked with a chosen keyword. In the experiment, users could think of the phrase “chitty chitty bang bang” to begin inner-speech decoding. The system recognised the password with more than 98% accuracy.
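The article does not describe how the unlock mechanism is implemented; one plausible design, sketched below under stated assumptions, is a gate that discards decoded words until the keyword phrase appears in a sliding window. The class name and interface are hypothetical.

```python
class GatedDecoder:
    """Hypothetical sketch of a keyword-gated inner-speech decoder:
    decoded words are discarded until the unlock phrase is thought."""

    def __init__(self, password="chitty chitty bang bang"):
        self.password = password.split()
        self.buffer = []        # sliding window of recent decoded words
        self.unlocked = False

    def on_decoded_word(self, word):
        if self.unlocked:
            return word         # pass decoded inner speech through
        self.buffer.append(word)
        self.buffer = self.buffer[-len(self.password):]
        if self.buffer == self.password:
            self.unlocked = True  # keyword recognised; start decoding
        return None             # otherwise ignore inner speech
```

In the study the keyword itself was recognised with more than 98% accuracy; a production system would presumably also need a way to re-lock the decoder, which this sketch omits.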
