Unlocking Speech Through Brain Implant & Machine Learning

AI Thread Summary
Researchers have successfully decoded the attempted speech of a person with anarthria, a condition that prevents speech articulation, by implanting a network of electrodes in the brain. They used a machine learning algorithm to decode the neural activity into words displayed on a computer screen. The study achieved a median decoding rate of 15.2 words per minute with a median word error rate of 25.6%. Post hoc analyses detected 98% of the participant's attempts to produce individual words and classified words with 47.1% accuracy, using cortical signals that remained stable over the 81-week study period. This advance highlights significant progress in brain-computer interface technology and its potential applications for individuals with speech impairments.
gleem
TL;DR Summary
An experimental system, connected to electrodes implanted in the brain, can display words that a person attempts to say.
In a collaboration among several institutions, researchers have been able to record the attempted speech of a person with a brain-stem stroke resulting in anarthria (an inability to articulate speech) by implanting a network of electrodes in the brain and using a machine learning algorithm to decode the intended words and display them on a computer screen.
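To make the pipeline concrete, here is a minimal, hypothetical sketch of the kind of word classification such a system performs. All names, the vocabulary, the electrode count, and the nearest-centroid classifier are illustrative assumptions; the actual study used far more sophisticated neural-network detection and language modeling over intracranial recordings.

```python
import numpy as np

# Hypothetical sketch: each attempted word is represented by a feature vector
# (e.g., high-gamma activity averaged per electrode) and decoded by finding
# the nearest per-word training centroid. Data here is simulated.
rng = np.random.default_rng(0)
VOCAB = ["hello", "water", "yes", "no"]   # toy vocabulary (assumption)
N_ELECTRODES = 128                         # illustrative electrode count

# Simulated "ground truth" activity pattern for each word, plus trial noise.
prototypes = {w: rng.normal(size=N_ELECTRODES) for w in VOCAB}

def simulate_trial(word):
    """One noisy recording of an attempt to say `word` (simulation only)."""
    return prototypes[word] + 0.3 * rng.normal(size=N_ELECTRODES)

# "Training": average 20 simulated trials per word into a centroid.
centroids = {w: np.mean([simulate_trial(w) for _ in range(20)], axis=0)
             for w in VOCAB}

def decode(features):
    """Return the vocabulary word whose training centroid is closest."""
    return min(VOCAB, key=lambda w: np.linalg.norm(features - centroids[w]))

print(decode(simulate_trial("water")))
```

The real system additionally had to detect *when* a word attempt occurred in the continuous signal and applied a language model over word sequences; this sketch only shows the single-trial classification step.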
https://www.nytimes.com/2021/07/14/...3473&user_id=799d953295eabb464706b7f2130bc45f

I see the article as being behind a paywall
 
phinds said:
I see the article as being behind a paywall
I'm able to access the NYT article in the OP link.
 

RESULTS

We decoded sentences from the participant’s cortical activity in real time at a median rate of 15.2 words per minute, with a median word error rate of 25.6%. In post hoc analyses, we detected 98% of the attempts by the participant to produce individual words, and we classified words with 47.1% accuracy using cortical signals that were stable throughout the 81-week study period.

That's about the state of my typing skills.
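For readers unfamiliar with the metric, word error rate (WER) is the word-level edit distance (substitutions, insertions, deletions) between the decoded sentence and the reference, divided by the reference word count. A small sketch of the standard computation (the example sentences are made up, not from the study):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = minimum edits turning ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i          # delete all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j          # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution / match
    return dp[-1][-1] / len(ref)

# One insertion ("very") against a 3-word reference -> WER = 1/3
print(word_error_rate("i am thirsty", "i am very thirsty"))
```

So a 25.6% WER means roughly one word in four needed correction relative to the intended sentence.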
 