Decoding Movement and Speech From the Brain of a Tetraplegic Person
upstart writes:
Decoding Movement and Speech from the Brain of a Tetraplegic Person - Technology Org:
Every year, the lives of hundreds of thousands of people are severely disrupted when they lose the ability to move or speak as a result of spinal cord injury, stroke, or neurological disease.
At Caltech, neuroscientists in the laboratory of Richard Andersen, James G. Boswell Professor of Neuroscience and Leadership Chair and Director of the Tianqiao & Chrissy Chen Brain-Machine Interface Center, are studying how the brain encodes movements and speech, with the goal of restoring these functions to individuals who have lost them.
Brain-machine interfaces (BMIs) are devices that record brain signals and interpret them to issue commands that operate external assistive devices, such as computers or robotic limbs. Thus, an individual can control such machinery just with their thoughts.
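To give a rough sense of what "interpreting brain signals" can mean computationally, here is a minimal toy sketch in Python of a linear decoder that maps recorded firing rates to an intended movement command. This is purely illustrative and not the method used in the research described; the data are synthetic, and the array sizes and variable names (firing_rates, velocity, weights) are assumptions for the example.

```python
import numpy as np

# Toy illustration only: a simple linear decoder fit on synthetic data.
# Real BMIs use implanted electrode arrays and far more sophisticated
# decoding pipelines; dimensions and names here are hypothetical.

rng = np.random.default_rng(0)

n_samples = 500   # time bins of recorded activity (assumed)
n_neurons = 96    # recording channels (assumed)
n_outputs = 2     # intended 2-D cursor/limb velocity (assumed)

# Synthetic "ground truth": firing rates linearly related to intended
# movement velocity, plus noise.
true_weights = rng.normal(size=(n_neurons, n_outputs))
firing_rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
velocity = firing_rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, n_outputs))

# "Training": fit a linear map from firing rates to intended velocity
# by least squares, a common baseline decoding approach.
weights, *_ = np.linalg.lstsq(firing_rates, velocity, rcond=None)

# "Decoding": convert a new window of neural activity into a movement
# command that could drive an external device such as a robotic limb.
new_activity = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
command = new_activity @ weights
print("decoded velocity command:", command)
```

In practice the decoder would be trained on activity recorded while the participant imagines or attempts movements, and its output would be streamed continuously to the assistive device.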
For example, in 2015, the Andersen team and colleagues worked with a tetraplegic participant to implant recording electrodes into a part of the brain that forms intentions to move. The BMI enabled the participant to direct a robotic limb to reach out and grasp a cup, just by thinking about those actions.
Read more of this story at SoylentNews.