Meta has created a “brain decoder” that uses an AI model to decode brain activity and convert it into speech. The tool could enable silent communication: users pronounce words and phrases in their heads, and the system reconstructs them. The company described the study in a corporate blog post.
Meta’s technology differs from similar developments in that it relies on non-invasive methods for recording brain activity. The company used electroencephalography (EEG) and magnetoencephalography (MEG), both of which rely only on external sensors. Because EEG and MEG data are less accurate than signals from sensors implanted in the brain, a much larger volume of raw data had to be processed to reach comparable performance, the developers note.
The AI recorded how the participants’ brains responded to audiobooks and individual phrases, then isolated the relevant words from the text and compiled a “dictionary”, which it used for the reverse process: converting brain activity back into speech. The researchers achieved 73% accuracy on a vocabulary of 793 words that are among the most frequently used in everyday life.
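The decoding step described above can be pictured as scoring each candidate word in a fixed vocabulary against the recorded brain signal and picking the best match. The following is a minimal toy sketch of that idea, not Meta’s actual model: the “brain signal” is simulated as a noisy copy of the true word’s embedding, whereas a real system would learn such representations from MEG/EEG recordings. All names and parameters here are illustrative assumptions.

```python
import math
import random

random.seed(0)

DIM = 32
# 793 frequent words, matching the vocabulary size reported in the study;
# placeholder tokens stand in for the real word list.
VOCAB = [f"word_{i}" for i in range(793)]

# Hypothetical learned embedding for each vocabulary word
embeddings = {w: [random.gauss(0, 1) for _ in range(DIM)] for w in VOCAB}

def cosine(a, b):
    # Cosine similarity between two vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def simulate_brain_signal(word, noise=0.8):
    # Stand-in for an MEG/EEG recording: the true word's embedding plus noise
    return [x + random.gauss(0, noise) for x in embeddings[word]]

def decode(signal):
    # Score every vocabulary word against the signal, return the closest match
    return max(VOCAB, key=lambda w: cosine(signal, embeddings[w]))

# Estimate top-1 accuracy over a sample of simulated trials
trials = 200
correct = sum(decode(simulate_brain_signal(w)) == w
              for w in random.sample(VOCAB, trials))
print(f"top-1 accuracy: {correct / trials:.0%}")
```

In this toy setup the accuracy depends only on the chosen noise level; in the real study it reflects how well the model maps noisy non-invasive recordings onto word representations.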
NIX Solutions adds that in the future the researchers plan to improve recognition accuracy for words and phrases by using a wider source vocabulary. The development could not only help millions of people who are unable to speak or write, but also lead to a better understanding of the human brain.