
Scientists in the United States have deciphered internal thoughts using a mental password technique, reaching an accuracy of up to 74%

A new brain-computer interface (BCI) translates a person's inner speech into words, and it begins decoding only after the user thinks a preset code phrase.


In a groundbreaking development, a team of researchers at Stanford University has created a brain-computer interface (BCI) that can translate a person's inner thoughts into words with an accuracy of up to 74%.

BCI technology itself is not new: such systems enable direct communication between the brain and external devices. This latest advance, however, represents a significant leap forward, giving people with severe speech and motor impairments a more natural and efficient means of communication.

The research, led by Erin Kunz of Stanford University, involved four participants with severe paralysis caused by conditions such as amyotrophic lateral sclerosis (ALS) or brainstem stroke. To give users control over when their thoughts are decoded, the team built the system around a mental password.

Participants were instructed either to try to speak or to imagine saying words. Their neural activity was recorded with microelectrodes implanted in the motor cortex, the part of the brain that controls speech movements. To switch the decoder on, participants thought the password phrase "chitty chitty bang bang."
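
To make the password mechanism concrete, here is a minimal illustrative sketch in Python. It is not the study's actual pipeline; the feature extraction, the logistic-regression gate, and names such as decode_stream and word_decoder are assumptions made for illustration. The idea is a two-stage decoder: a gate classifier watches windows of neural features for the imagined password, and the word decoder runs only after the gate opens.

```python
# Illustrative sketch only; all data and names here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training data: one feature vector per time window of neural
# activity, labelled 1 when the participant imagined the password, 0 otherwise.
X_train = rng.normal(size=(500, 128))    # 500 windows x 128 electrode features
y_train = rng.integers(0, 2, size=500)   # placeholder labels

password_gate = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def decode_stream(feature_windows, word_decoder, threshold=0.95):
    """Run the word decoder only after the imagined password is detected."""
    unlocked = False
    output = []
    for window in feature_windows:
        if not unlocked:
            # Gate stage: does this window look like the imagined password?
            p = password_gate.predict_proba(window.reshape(1, -1))[0, 1]
            unlocked = p >= threshold
        else:
            # Decode stage: translate inner-speech features into a word.
            output.append(word_decoder(window))
    return output
```

In such a design the decoder stays silent by default, which is one way to keep a system from transcribing thoughts the user never intended to share.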

Prior research has shown that BCIs can decode attempted speech in people with paralysis by interpreting the brain activity associated with trying to speak. However, this latest study has demonstrated that the patterns produced by inner speech are distinct enough for artificial intelligence to learn and interpret. Although the brain activity associated with inner speech was notably weaker, the system was highly accurate, recognizing the password with over 98% success.

The BCI could also pick up on unplanned thoughts, like numbers, when participants counted objects on a screen. For people with limited muscle control, decoding inner speech through BCI could be faster than older methods like eye-tracking systems.

The Emory BrainGate team has also decoded full sentences of inner speech, though with lower accuracy than attempted speech, reaching up to 74%. Because attempted and imagined speech produce measurably different neural patterns, that difference can also be used to train a BCI to ignore inner speech when the user does not want it decoded. BrainGate technology has already been used to help people with disabilities control prosthetic limbs by decoding brain signals related to movement.
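
The privacy idea can be illustrated the same way. The sketch below is hypothetical (the labels, features, and the filter_private_thoughts helper are invented for illustration): a classifier trained to tell attempted speech from imagined speech discards inner-speech windows before any word decoding happens.

```python
# Illustrative sketch only; data and helper names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical labels: 1 = attempted speech, 0 = inner (imagined) speech.
X = rng.normal(size=(400, 128))
y = rng.integers(0, 2, size=400)

speech_type = LogisticRegression(max_iter=1000).fit(X, y)

def filter_private_thoughts(feature_windows, word_decoder):
    """Decode only windows classified as attempted speech; skip inner speech."""
    words = []
    for window in feature_windows:
        if speech_type.predict(window.reshape(1, -1))[0] == 1:
            words.append(word_decoder(window))
    return words
```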

Lead author Erin Kunz said this is the first time researchers have been able to see what brain activity looks like when a person merely thinks about speaking. The advance could transform communication for people with severe speech and motor impairments, offering them a faster and more natural means of expression.
