
‘Neuroprosthesis’ Restores Words to Man with Paralysis



SOURCE: https://www.ucsf.edu/news/2021/07/420946/neuroprosthesis-restores-words-man-paralysis
Summary

By Robin Marks

Researchers at UC San Francisco have developed a “speech neuroprosthesis” that has enabled a man with severe paralysis to communicate in sentences, translating signals sent from his brain to his vocal tract directly into words that appear as text on a screen. The achievement, developed in collaboration with the first participant in a clinical research trial, builds on more than a decade of effort by UCSF neurosurgeon Edward Chang, MD, to create a technology that allows people with paralysis to communicate even if they are unable to speak on their own.

“To our knowledge, this is the first successful demonstration of direct decoding of full words from the brain activity of someone who is paralyzed and cannot speak,” said Chang, the Joan and Sanford Weill Chair of Neurological Surgery at UCSF, Jeanne Robertson Distinguished Professor, and senior author on the study.

To translate those findings into speech recognition of full words, David Moses, PhD, a postdoctoral engineer in the Chang lab and one of the lead authors of the new study, developed new methods for real-time decoding of those patterns, along with statistical language models to improve accuracy. But the team’s success in decoding speech in participants who were able to speak did not guarantee that the technology would work in a person whose vocal tract is paralyzed. In each session, BRAVO1 attempted to say each of the 50 vocabulary words many times while the electrodes recorded brain signals from his speech cortex. To translate the patterns of recorded neural activity into specific intended words, the other two lead authors of the study, Sean Metzger, MS, and Jessie Liu, BS, both bioengineering doctoral students in the Chang Lab, used custom neural network models, which are forms of artificial intelligence.
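The summary describes the decoding pipeline only at a high level: electrodes record activity from the speech cortex, neural network models map that activity to probabilities over a 50-word vocabulary, and a statistical language model improves accuracy. The study's actual models are not reproduced here; the Python sketch below is only an illustration of that general idea, with a toy five-word vocabulary, a simple softmax classifier standing in for the custom neural networks, a uniform bigram prior standing in for the language model, and random numbers in place of real neural recordings. Every name, value, and parameter in it is hypothetical.

    import numpy as np

    # Toy stand-in for the study's 50-word vocabulary (illustrative only).
    VOCAB = ["hello", "thirsty", "family", "good", "outside"]

    rng = np.random.default_rng(0)

    def decode_word_probs(neural_window, weights, bias):
        """Toy linear softmax classifier: map one window of neural features
        to per-word probabilities. This stands in for the custom neural
        network models described in the article."""
        logits = neural_window @ weights + bias
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()

    def decode_sentence(prob_sequence, bigram_log_prior):
        """Combine per-attempt word probabilities with a bigram prior via a
        Viterbi search, analogous to using a statistical language model to
        improve the raw decoder's accuracy."""
        n_words = len(VOCAB)
        T = len(prob_sequence)
        best = np.log(prob_sequence[0] + 1e-12)
        backptr = np.zeros((T, n_words), dtype=int)
        for t in range(1, T):
            scores = (best[:, None] + bigram_log_prior
                      + np.log(prob_sequence[t] + 1e-12)[None, :])
            backptr[t] = scores.argmax(axis=0)
            best = scores.max(axis=0)
        # Trace back the most probable word sequence.
        path = [int(best.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(backptr[t][path[-1]]))
        return [VOCAB[i] for i in reversed(path)]

    # Simulated inputs: 16 neural features per word attempt, 3 attempts.
    weights = rng.normal(size=(16, len(VOCAB)))
    bias = rng.normal(size=len(VOCAB))
    windows = rng.normal(size=(3, 16))
    probs = [decode_word_probs(w, weights, bias) for w in windows]
    uniform_prior = np.full((len(VOCAB), len(VOCAB)), np.log(1.0 / len(VOCAB)))
    print(decode_sentence(probs, uniform_prior))

The design point the sketch illustrates matches the article's description: per-attempt word probabilities from the neural decoder are not used alone, but are combined with a language-model prior over word sequences before the most probable sentence is selected.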
