Brain Machine Interface Allows Amputees to Control Bionic Hand

A research team from the University of Houston has created an algorithm that allowed a man to grasp a bottle and other objects with a prosthetic hand, powered only by his thoughts.

The technique, demonstrated with a 56-year-old man whose right hand had been amputated, uses non-invasive brain monitoring, capturing brain activity to determine what parts of the brain are involved in grasping an object. With that information, researchers created a computer program, or brain-machine interface (BMI), that harnessed the subject’s intentions and allowed him to successfully grasp objects, including a water bottle and a credit card. Using a high-tech bionic hand fitted to his residual limb, the subject grasped the selected objects 80 percent of the time.

Previous studies involving either surgically implanted electrodes or myoelectric control, which relies upon electrical signals from muscles in the arm, have shown similar success rates, according to the researchers.

Jose Luis Contreras-Vidal, a neuroscientist and engineer at UH, said the non-invasive method offers several advantages: It avoids the risks of surgically implanting electrodes by measuring brain activity via scalp electroencephalogram, or EEG. And myoelectric systems aren’t an option for all people, because they require that neural activity from muscles relevant to hand grasping remain intact.

The results of the study were published March 30 in Frontiers in Neuroscience, in the Neuroprosthetics section.

Contreras-Vidal, Hugh Roy and Lillie Cranz Cullen Distinguished Professor of electrical and computer engineering at UH, was lead author of the paper, along with graduate students Harshavardhan Ashok Agashe, Andrew Young Paek and Yuhang Zhang.

The work, funded by the National Science Foundation, demonstrates for the first time EEG-based BMI control of a multi-fingered prosthetic hand for grasping by an amputee. It also could lead to the development of better prosthetics, Contreras-Vidal said.

Beyond demonstrating that prosthetic control is possible using non-invasive EEG, researchers said the study offers a new understanding of the neuroscience of grasping and will be applicable to rehabilitation for other types of injuries, including stroke and spinal cord injury.

The study subjects – five able-bodied, right-handed men and women, all in their 20s, as well as the amputee – were tested using a 64-channel active EEG, with electrodes attached to the scalp to capture brain activity. Contreras-Vidal said brain activity was recorded in multiple areas, including the motor cortex and areas known to be used in action observation and decision-making, and occurred between 50 milliseconds and 90 milliseconds before the hand began to grasp.
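As a minimal sketch of the windowing described above, the snippet below extracts the EEG segment 50–90 milliseconds before movement onset from a multi-channel recording. The sampling rate, the synthetic data, and the function name are illustrative assumptions; only the channel count (64) and the window timing come from the study.

```python
import numpy as np

FS = 1000            # sampling rate in Hz (assumed for illustration)
N_CHANNELS = 64      # 64-channel active EEG, as in the study

def pre_movement_window(eeg, onset_sample, start_ms=90, end_ms=50, fs=FS):
    """Return the EEG slice from start_ms to end_ms before movement onset."""
    start = onset_sample - int(start_ms * fs / 1000)
    end = onset_sample - int(end_ms * fs / 1000)
    return eeg[:, start:end]

rng = np.random.default_rng(0)
eeg = rng.standard_normal((N_CHANNELS, 5 * FS))   # 5 s of synthetic EEG
window = pre_movement_window(eeg, onset_sample=2000)
print(window.shape)  # (64, 40): 40 ms of pre-movement data across 64 channels
```

In a real pipeline, movement onset would come from synchronized motion capture or the prosthetic hand's sensors rather than being chosen by hand.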

That provided evidence that the brain predicted the movement, rather than reflecting it, he said.

“Current upper limb neuroprosthetics restore some degree of functional ability, but fail to approach the ease of use and dexterity of the natural hand, particularly for grasping movements,” the researchers wrote, noting that work with invasive cortical electrodes has been shown to allow some hand control but not at the level necessary for all daily activities.

New UH research has demonstrated that an amputee can grasp with a bionic hand, powered only by his thoughts. Image adapted from the University of Houston press release.

“Further, the inherent risks associated with surgery required to implant electrodes, along with the long-term stability of recorded signals, is of concern. … Here we show that it is feasible to extract detailed information on intended grasping movements to various objects in a natural, intuitive manner, from a plurality of scalp EEG signals.”

Until now, this was thought to be possible only with brain signals acquired invasively inside or on the surface of the brain.

Researchers first recorded brain activity and hand movement in the able-bodied volunteers as they picked up five objects, each chosen to illustrate a different type of grasp: a soda can, a compact disc, a credit card, a small coin and a screwdriver. The recorded data were used to create decoders of neural activity into motor signals, which successfully reconstructed the grasping movements.
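The decoding step above can be sketched as a regression problem: fit a linear map from EEG features to hand-kinematics targets, then reconstruct movement from new brain activity. The ridge-regression form, the feature dimensions, and the synthetic data below are illustrative assumptions, not the study's published decoder.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy dimensions: e.g. 64 EEG channels as features, 5 joint angles as targets.
n_trials, n_features, n_joints = 200, 64, 5
true_W = rng.standard_normal((n_features, n_joints))
X = rng.standard_normal((n_trials, n_features))                   # EEG features
Y = X @ true_W + 0.1 * rng.standard_normal((n_trials, n_joints))  # kinematics

# Closed-form ridge regression: the penalty guards against noisy,
# highly correlated channels, which are common in scalp EEG.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

Y_hat = X @ W
corr = np.corrcoef(Y.ravel(), Y_hat.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```

On this synthetic data the reconstruction correlation is close to 1; with real EEG the fit is far noisier, which is one reason the reported grasp success rate was 80 percent rather than 100.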

They then fitted the amputee subject with a computer-controlled neuroprosthetic hand and told him to observe and imagine himself controlling the hand as it moved and grasped the objects.

The subject’s EEG data, along with information about prosthetic hand movements gleaned from the able-bodied volunteers, were used to build the algorithm.

Contreras-Vidal said additional practice, along with refining the algorithm, could increase the success rate to 100 percent.

About this neuroprosthetics research

Contact: Jeannie Kever – University of Houston
Source: University of Houston press release
Image Source: The image is adapted from the University of Houston press release
Original Research: Abstract for “Global cortical activity predicts shape of hand during grasping” by Harshavardhan A. Agashe, Andrew Y. Paek, Yuhang Zhang, and Jose L. Contreras-Vidal in Frontiers in Neuroscience. Published online March 30 2015 doi:10.3389/fnins.2015.00121
