AI-based neural system taught to recognize and voice simple human thoughts

Neural networks taught to “read minds” in real time


Researchers have developed an interface that reads brain signals, analyzes them with neural networks, and then uses a speech synthesizer to translate simple thoughts into intelligible words.

It has long been known that characteristic patterns of brain activity appear when people speak, listen, or think about something. Although scientists have developed various models for decoding these signals, automating their clear reproduction proved much harder: the computer output was unintelligible. However, neuroengineers at Columbia University recently made significant progress in this direction.

They created a system that translates thoughts into intelligible, recognizable speech. To do this, the researchers asked people with epilepsy who had already undergone surgery to listen to audio recordings in which the digits 0 through 9 were spoken. At the same time, signals from their brains were recorded and processed by a computer speech-synthesis algorithm. The sound produced by the vocoder was then analyzed and refined by artificial neural networks. The result was robotic speech that repeated the sequence of digits on the recording.
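At a very high level, the pipeline described here can be thought of as a regression from recorded neural activity to the parameters of a speech synthesizer. The toy sketch below illustrates only that idea, using synthetic data and a small scikit-learn network; the Columbia team's actual intracranial recordings, vocoder, and deep-learning decoder are far more sophisticated and are not reproduced here.

```python
# Hypothetical sketch: decode vocoder-like parameters from (synthetic) neural
# activity with a small neural network. All data below is made up.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: 2000 time frames of 64-channel "neural" features,
# each mapped to 32 synthesizer parameters by an unknown nonlinear function.
n_frames, n_channels, n_vocoder_params = 2000, 64, 32
neural_features = rng.normal(size=(n_frames, n_channels))
mixing = rng.normal(size=(n_channels, n_vocoder_params))
vocoder_params = np.tanh(neural_features @ mixing) \
    + 0.1 * rng.normal(size=(n_frames, n_vocoder_params))

# Train a small neural network to predict synthesizer parameters from brain activity.
decoder = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0)
decoder.fit(neural_features[:1500], vocoder_params[:1500])

# Decode held-out frames; a real system would feed these into a speech synthesizer
# to produce audible (in the study, robotic-sounding) speech.
predicted = decoder.predict(neural_features[1500:])
corr = np.corrcoef(predicted.ravel(), vocoder_params[1500:].ravel())[0, 1]
print(f"Held-out correlation between decoded and target parameters: {corr:.2f}")
```

In this toy setup the "synthesizer parameters" are just random projections of the input, so the correlation mainly shows that the decoder can learn the mapping; real intelligibility, as in the study, has to be judged by human listeners.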


In objective testing, listeners could understand the artificially reproduced words in 75% of cases. Next, the team will test the system on more complex words and on whole sentences that a person merely thinks about. Ultimately, they hope to create an interface built into an implant, allowing people who have lost the ability to speak through injury or illness to communicate with others again. According to the researchers, however, this may take a decade.

In the longer term, the researchers plan to develop non-invasive neural recording methods and to improve the decoding algorithms, since not everyone will agree to surgery. The technology could also be integrated into various devices and gadgets, for example a smartphone that translates the user's thoughts into text messages.

Elon Musk’s startup Neuralink is also developing its own version of the AI-based neural interface.

text: Ilya Bauer, photo: st.kuchavsego