Neural Network Learns to Translate Brain Signals into Speech - Hitecher

Artificial intelligence developed by scientists from the University of California, San Francisco, has learned to translate human lip movements into speech. Surprisingly, it works even if a person does not move their lips but merely imagines doing so.

Devices that synthesize natural speech from brain activity have long been talked about, and this work marks an important step in that direction. First, the scientists mapped the relationship between mouth movements and the activation of specific areas of the cerebral cortex. This required extensive work by the researchers, assisted by five volunteers who agreed to take part in the experiments.

Electrodes that record brain activity were implanted in the volunteers’ heads. Using these recordings, the scientists developed two programs: one transforms thoughts into mouth movements, and the other “reads lips”, converting those movements into speech. The system’s capabilities were demonstrated on video: about 69% of the words were pronounced correctly, with accuracy depending directly on the length of the sentence.
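The two-stage design described above can be sketched in code. This is only an illustrative toy, not the researchers’ actual system: the real study used recurrent neural networks trained on high-density cortical recordings, while the sketch below chains two simple linear decoders with made-up dimensions to show how stage one (brain signals to articulator movements) feeds stage two (movements to acoustic features).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- stand-ins, not values from the study.
N_CHANNELS = 16   # simulated electrode channels
N_ARTIC = 6       # simulated articulator features (lips, jaw, tongue, ...)
N_ACOUSTIC = 8    # simulated acoustic/spectral features
N_SAMPLES = 200   # time steps

# Stage 1: "thoughts to mouth movements" -- a linear decoder from
# neural activity to articulator kinematics.
W1 = rng.normal(size=(N_CHANNELS, N_ARTIC))
# Stage 2: "lip reading" -- a second decoder from kinematics to sound.
W2 = rng.normal(size=(N_ARTIC, N_ACOUSTIC))

def decode(neural_activity):
    """Chain the two stages: brain signals -> movements -> acoustics."""
    movements = neural_activity @ W1   # stage 1
    acoustics = movements @ W2         # stage 2
    return movements, acoustics

neural = rng.normal(size=(N_SAMPLES, N_CHANNELS))
movements, acoustics = decode(neural)
print(movements.shape, acoustics.shape)  # (200, 6) (200, 8)
```

Splitting the problem this way, with an explicit intermediate representation of mouth movements, is what lets the second stage act as a “lip reader” regardless of whether the movements were actually performed or only imagined.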
