Brain2Char algorithm turns brain activity into text

The technology works even if the user is saying words in their head.

Scientists from UC San Francisco have trained a chain of neural networks to analyze the brain activity of people speaking and to reconstruct the original text from it.

The Brain2Char algorithm works with electrocorticographic data. It is built from neural networks with a long short-term memory (LSTM) architecture and an open-source decoder. A preprint of the paper is available on arXiv.org.
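
The preprint describes the architecture in more detail than this article; the sketch below only illustrates the general idea of an LSTM network that maps a sequence of neural feature frames to character probabilities. The layer sizes, the use of CTC loss, and all variable names here are assumptions for illustration, not the authors' exact model.

```python
import torch
import torch.nn as nn

class EcogToCharModel(nn.Module):
    """Bidirectional LSTM over ECoG feature frames with a character-level
    output head. Trained here with CTC loss, a common choice for this kind
    of sequence-to-text decoding; Brain2Char's exact layers may differ."""

    def __init__(self, n_features=128, hidden_size=256, n_chars=29):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=n_features,   # features per ECoG frame (assumed size)
            hidden_size=hidden_size,
            num_layers=3,
            bidirectional=True,
            batch_first=True,
        )
        # Project LSTM outputs to character logits (letters, space, blank, ...).
        self.char_head = nn.Linear(2 * hidden_size, n_chars)

    def forward(self, x):
        # x: (batch, time, n_features) sequence of neural feature frames.
        out, _ = self.lstm(x)
        return self.char_head(out)   # (batch, time, n_chars) logits


# Minimal usage sketch with dummy data and one CTC training step.
model = EcogToCharModel()
ctc = nn.CTCLoss(blank=0)
x = torch.randn(2, 500, 128)                  # 2 trials, 500 frames each
log_probs = model(x).log_softmax(dim=-1)      # (batch, time, chars)
targets = torch.randint(1, 29, (2, 40))       # dummy character indices
input_lens = torch.full((2,), 500, dtype=torch.long)
target_lens = torch.full((2,), 40, dtype=torch.long)
loss = ctc(log_probs.transpose(0, 1), targets, input_lens, target_lens)
loss.backward()
```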

Brain2Char starts by analyzing the signals recorded by electrocorticography: the computer builds a speech model from characteristic changes in signal power over time and frequency. The resulting model is then processed by a DeepSpeech algorithm, which relies on convolutional neural networks and converts the digital signals into written text. The system also includes an additional regularizing neural network that 'cleans up' the resulting text, taking into account, among other things, the speaker's individual mannerisms.
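
The article does not specify how the power features are computed. A common way to get frame-level features of this kind from ECoG is to take band-limited power (for example in the high-gamma band) via a bandpass filter and Hilbert envelope; the sketch below assumes that approach, and its band limits, frame length, and normalization are illustrative rather than the paper's actual preprocessing.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power_features(ecog, fs=1000.0, band=(70.0, 150.0), frame_ms=10):
    """Turn raw ECoG (channels x samples) into a sequence of frame-level
    power features; the paper's exact bands and frame rate may differ."""
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, ecog, axis=-1)        # band-limit each channel
    envelope = np.abs(hilbert(filtered, axis=-1))   # analytic amplitude
    power = envelope ** 2

    # Average power in non-overlapping frames to get a feature sequence.
    frame_len = int(fs * frame_ms / 1000)
    n_frames = power.shape[-1] // frame_len
    power = power[..., : n_frames * frame_len]
    frames = power.reshape(power.shape[0], n_frames, frame_len).mean(axis=-1)

    # Z-score each channel so the downstream network sees comparable scales.
    frames = (frames - frames.mean(axis=1, keepdims=True)) / (
        frames.std(axis=1, keepdims=True) + 1e-8
    )
    return frames.T   # (n_frames, n_channels), ready for the sequence model
```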

The network was trained on the brain activity of four patients with electrocorticography electrodes implanted in their brains. Two of the patients read 450 sentences composed of 1,900 different words. The other two patients were asked to read descriptions of images drawn from two vocabularies of 400 and 1,200 words.

A defining aspect of this development is the computer's ability to recognize words even when they are not spoken aloud, but only said in the patient's head. In the future, the algorithm could be used to build brain-computer interfaces for people who are unable to speak.
