Andrew Schwartz, PhD, McGowan Institute for Regenerative Medicine faculty member and Professor of Neurobiology at the University of Pittsburgh, leads a team of neuroscientists who have significantly advanced brain-machine interface (BMI) technology to the point where severely paralyzed people who cannot contract a single arm or leg muscle can now independently compose and send e-mails and operate a TV in their homes, using only their thoughts to execute these actions.
Thanks to the rapid pace of BMI research, these and other individuals may one day be able to feed themselves with a robotic arm and hand that moves according to their mental commands.
“Our work has shown how important the learning process is when using brain-controlled devices,” says Dr. Schwartz. “By permitting the subject to adaptively recode the generated neural activity, the overall performance of the device is dramatically increased.”
“Furthermore, as we have progressed in this work, it has become apparent that the basic idea of 'intention' during learning is very important and can be addressed by the direct observation of the neuronal transformations taking place during this fundamental processing,” Dr. Schwartz says.
At the University of Pittsburgh, scientists recently succeeded in developing the technology that allows a rhesus macaque monkey to mentally control a robotic arm to feed itself pieces of fruit. The robotic arm's fast and smooth movements were triggered by electrical signals that were generated in the monkey's brain when the animal thought about an action.
In previous studies, this lab developed the technology to tap a macaque monkey's motor cortical activity, making it possible for the animal to use its thoughts to direct a robotic arm toward food targets presented in 3D space.
In Dr. Schwartz’s latest studies, macaque monkeys not only mentally guided a robotic arm to pieces of food but also opened and closed the robotic arm's hand, or gripper, to retrieve them. Just by thinking about picking up and bringing the fruit to its mouth, the animal fed itself.
The monkey's own arm and hand did not move while it manipulated the two-finger gripper at the end of the robotic arm. The animal used its own sight for feedback about the accuracy of the robotic arm's actions as it mentally moved the gripper to within one-half centimeter of a piece of fruit.
"The monkey developed a great deal of skill using this physical device," says Meel Velliste, PhD. "We are in the process of extending this type of control to a more sophisticated wrist and hand for the performance of dexterous tasks."
Velliste and the other members of the Pittsburgh research team point out that imparting skill and dexterity to these devices will help amputees and paralyzed patients to perform everyday tasks.
As the animal thought about moving, neurons in its motor cortex emitted electrical signals that were recorded by tiny electrodes the scientists had implanted there. A computer decoding algorithm translated those signals into the movements of the robotic arm and gripper.
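The article does not spell out which decoding algorithm the team used, but one classic approach to translating motor-cortex activity into movement commands is a population-vector decoder: each neuron is modeled as firing fastest for its own "preferred" movement direction, and the intended direction is recovered by summing those preferred directions weighted by how strongly each neuron fires. The sketch below is purely illustrative; the neuron counts, tuning model, and baseline rates are assumptions, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 100

# Assumed model: each neuron has a random "preferred direction" in 3D
# and is cosine-tuned, firing fastest when the intended movement
# points along that direction.
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

BASELINE = 20.0    # assumed baseline firing rate (spikes/s)
MODULATION = 10.0  # assumed depth of direction tuning

def firing_rates(intended_direction):
    """Simulate cosine-tuned firing rates for an intended movement."""
    d = intended_direction / np.linalg.norm(intended_direction)
    return BASELINE + MODULATION * preferred @ d

def decode(rates):
    """Population vector: sum each neuron's preferred direction,
    weighted by its firing-rate change from baseline."""
    weights = rates - BASELINE
    v = weights @ preferred
    return v / np.linalg.norm(v)

intended = np.array([1.0, 0.0, 0.0])   # e.g. "reach to the right"
decoded = decode(firing_rates(intended))
# With enough neurons, the decoded vector points close to the
# intended direction, which a controller could use to drive the arm.
```

In a closed-loop system like the one described here, this decode step would run continuously, with the monkey watching the arm and adjusting its neural activity to correct errors, which is the learning process Dr. Schwartz emphasizes.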
BMI technology is already being tested in humans, with promising results, says Dr. Schwartz.
"The range of patients could be stroke patients, could be patients locked in with ALS, who can't move anything except for their eyes, it could be amputees-- you could use an artificial limb, it could be spinal cord injured patients that are paralyzed," he said.