Thursday, April 02, 2015

Noninvasive Brain Machine Interface Demonstrated for Bionic Hand


A team of researchers from the University of Houston (UH) has developed an algorithm that enabled a man whose right hand had been amputated to grasp objects using a bionic hand controlled by his thoughts. While we've seen similar accomplishments in recent years, the new technique is non-invasive, capturing brain activity via a scalp EEG.

Research developments in recent years have given amputees much cause for hope with various thought-controlled prosthetic devices. Some have relied on surgically implanted electrodes, while others make use of electrical signals from muscles (known as myoelectric control).

But Jose Luis Contreras-Vidal, a neuroscientist and engineer at UH, points out that such methods have their disadvantages. Surgery, particularly neurosurgery, is a risky business, while myoelectric systems require that the electrical activity of the muscles associated with the missing limb still be intact.

In an effort to avoid these problems, the UH researchers attached the electrodes of a 64-channel active EEG (electroencephalogram) to the scalps of five able-bodied, right-handed men and women in their 20s. These volunteers were then tasked with picking up five different objects – a soda can, a compact disc, a credit card, a small coin and a screwdriver – each intended to elicit a different type of grasp. The collected data was then used to create software that decodes neural activity into motor signals reconstructing the grasping movements.
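To illustrate the general shape of such a decoder (this is not the team's actual software), the sketch below maps short windows of past EEG samples to continuous grasp kinematics with a ridge-regression fit. The channel count matches the study, but the sampling rate, window length, number of kinematic outputs and the synthetic data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 64    # scalp EEG channels, as in the study
fs = 100           # assumed sampling rate, Hz
n_samples = 5000   # length of the synthetic recording
n_lags = 10        # past samples per channel used as decoder features
n_outputs = 5      # assumed kinematic outputs (e.g. finger joint angles)

# Synthetic stand-ins for low-pass-filtered EEG and recorded grasp kinematics.
eeg = rng.standard_normal((n_samples, n_channels))
kinematics = rng.standard_normal((n_samples, n_outputs))

def lagged_features(x, n_lags):
    """Stack the current and previous n_lags - 1 samples of every channel."""
    n, c = x.shape
    feats = np.zeros((n - n_lags + 1, c * n_lags))
    for k in range(n_lags):
        feats[:, k * c:(k + 1) * c] = x[n_lags - 1 - k : n - k]
    return feats

X = lagged_features(eeg, n_lags)   # (time, channels * lags)
Y = kinematics[n_lags - 1:]        # align targets with the feature windows

# Ridge regression in closed form: W = (X'X + alpha*I)^-1 X'Y
alpha = 1.0
W = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

decoded = X @ W                    # decoded grasp kinematics per time step
print(decoded.shape)               # (4991, 5)
```

In an online setting, the same feature extraction would be applied to each incoming EEG window and the decoded kinematics streamed to the prosthetic hand's controller.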

The scalp EEG was then fitted to a 56-year-old man whose right hand had been amputated, with a bionic hand attached to his residual limb. After being told to observe and visualize himself controlling the hand, he was able to grasp the various objects using his thoughts with an 80 percent success rate.

Contreras-Vidal says the EEG signals were recorded 50 to 90 milliseconds before the bionic hand began to grasp, indicating that the brain activity predicted the movement rather than merely reflecting it.
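One simple way to quantify such a lead is to cross-correlate the decoded signal with the measured movement and take the lag with the strongest match. The sketch below does this on synthetic traces with a built-in 70 ms lead; the signals, sampling rate and lead value are invented for illustration and do not reproduce the study's analysis.

```python
import numpy as np

fs = 1000                                  # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
movement = np.sin(2 * np.pi * 2 * t)       # stand-in for measured grasp aperture
lead_ms = 70                               # build a 70 ms lead into the "decoded" trace
eeg_decode = np.roll(movement, -int(lead_ms * fs / 1000))

# Cross-correlate and find the lag (in samples) with the strongest match.
xcorr = np.correlate(eeg_decode - eeg_decode.mean(),
                     movement - movement.mean(), mode="full")
lags = np.arange(-len(t) + 1, len(t))
best_lag_samples = lags[np.argmax(xcorr)]
print(f"estimated lead: {-best_lag_samples * 1000 / fs:.0f} ms")  # -> 70 ms
```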

The researchers claim this is the first time grasping has been demonstrated using EEG-based brain-machine interface (BMI) control of a multi-fingered prosthetic hand, with Contreras-Vidal saying it could lead to the development of better prosthetic devices.
