Interacting with the physical environment and manipulating objects is an essential part of daily life. This
ability is lost in upper-limb amputees as well as patients with spinal cord injury, stroke, ALS and other
movement disorders. These people know what they want to do as well as how they would do it if their arms
were functional. If such knowledge is decoded and sent to a prosthetic arm (or to the patient's own arm fitted
with functional electrical stimulation), the lost motor function could be restored. The decoding is unlikely to be
perfect; however, the brain can adapt to an imperfect decoder using real-time feedback. Several groups,
including ours, have recently demonstrated that this can be achieved, at least in principle. However, as is
often the case in science, the initial work has been done in idealized conditions and its applicability to
real-world usage scenarios remains an open question. The goal of this project is to bring movement-control
brain-machine interfaces (BMIs) closer to helping the people who need them, and at the same time to exploit
the rich datasets we collect to advance our understanding of sensorimotor control and learning. This
will be accomplished by creating hybrid BMIs that combine information from multiple sources with
modern algorithms from machine learning and automatic control.
RELEVANCE (See instructions):
Being able to interact with the physical environment and manipulate objects is an essential part of daily life.
Brain-machine interfaces are one way to restore this ability to patients who have lost it. The proposed
project will bring brain-machine interfaces closer to helping patients in real-world object manipulation tasks.