Controlling an Exoskeleton with a Brain Computer Interface

Scientists at Korea University in Korea and TU Berlin in Germany have developed a brain-computer control interface for a lower limb exoskeleton that works by decoding specific signals from within the user’s brain.

Using an electroencephalogram (EEG) cap, the system allows users to move forwards, turn left and right, sit and stand simply by staring at one of five flickering light emitting diodes (LEDs).

The results are published today (Tuesday 18th August) in the Journal of Neural Engineering.

Each of the five LEDs flickers at a different frequency, and when the user focusses their attention on a specific LED this frequency is reflected within the EEG readout. This signal is identified and used to control the exoskeleton.
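As a rough illustration of that mechanism (not the authors’ actual pipeline): a steady-state response at the attended LED’s flicker rate shows up as a peak in the EEG power spectrum, so a minimal detector can simply compare spectral power at each candidate frequency. The sampling rate, LED frequencies, and synthetic signal below are assumptions made for the sketch.

```python
import numpy as np

def detect_flicker_frequency(eeg, fs, candidate_freqs):
    """Pick the candidate LED frequency with the strongest power in the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # For each candidate, read off power at the nearest FFT bin
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic demo: a 13 Hz SSVEP buried in noise (all parameters hypothetical)
fs = 250                              # assumed EEG sampling rate
t = np.arange(0, 4, 1.0 / fs)         # 4 seconds of data
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 13 * t) + 2.0 * rng.standard_normal(t.size)
led_freqs = [9, 11, 13, 15, 17]       # hypothetical flicker frequencies
print(detect_flicker_frequency(eeg, fs, led_freqs))  # prints 13
```

A real system, as the paper’s abstract notes, uses canonical correlation analysis rather than a single-channel spectral peak, but the principle is the same: each LED tags the EEG with its own frequency signature.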

A key problem has been separating these precise brain signals from those associated with other brain activity, and the highly artificial signals generated by the exoskeleton.

“Exoskeletons create lots of electrical ‘noise’,” explains Klaus-Robert Müller, an author on the paper. “The EEG signal gets buried under all this noise – but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal.”

Although the paper reports tests on healthy individuals, the system has the potential to aid sick or disabled people.

“People with amyotrophic lateral sclerosis (ALS) [motor neuron disease], or high spinal cord injuries face difficulties communicating or using their limbs,” continues Müller. “Decoding what they intend from their brain signals could offer a means to communicate and walk again.”

Image: a person using the exoskeleton (adapted from the SciNews video).

The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market.

It took volunteers only a few minutes to be trained to operate the system. Because of the flickering LEDs, they were carefully screened for epilepsy before taking part in the research. The researchers are now working to reduce the ‘visual fatigue’ associated with longer-term use of such systems.



“We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system – despite the highly challenging artefacts from the exoskeleton itself,” concludes Müller.

About this neurology and technology research

Funding: The research was supported by grants from the National Institute of Mental Health (RO1 MH 041083, F31 MH 100889).

Source: IOP Publishing
Image Source: The image is adapted from the SciNews video
Video Source: The video is available at the SciNews YouTube page
Original Research: Abstract for “A lower limb exoskeleton control system based on steady state visual evoked potentials” by No-Sang Kwak, Klaus-Robert Müller and Seong-Whan Lee in Journal of Neural Engineering. Published online August 18 2015 doi:10.1088/1741-2560/12/5/056009


Abstract

A lower limb exoskeleton control system based on steady state visual evoked potentials

Objective. We have developed an asynchronous brain–machine interface (BMI)-based lower limb exoskeleton control system based on steady-state visual evoked potentials (SSVEPs). Approach. By decoding electroencephalography signals in real-time, users are able to walk forward, turn right, turn left, sit, and stand while wearing the exoskeleton. SSVEP stimulation is implemented with a visual stimulation unit, consisting of five light emitting diodes fixed to the exoskeleton. A canonical correlation analysis (CCA) method for the extraction of frequency information associated with the SSVEP was used in combination with k-nearest neighbors. Main results. Overall, 11 healthy subjects participated in the experiment to evaluate performance. To achieve the best classification, CCA was first calibrated in an offline experiment. In the subsequent online experiment, our results exhibit accuracies of 91.3 ± 5.73%, a response time of 3.28 ± 1.82 s, an information transfer rate of 32.9 ± 9.13 bits/min, and a completion time of 1100 ± 154.92 s for the experimental parcour studied. Significance. The ability to achieve such high quality BMI control indicates that an SSVEP-based lower limb exoskeleton for gait assistance is becoming feasible.
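The CCA step described in the abstract can be sketched as follows: for each candidate LED frequency, correlate the multichannel EEG against sine/cosine reference signals at that frequency and its harmonics, then pick the frequency with the largest canonical correlation. This is a generic SSVEP-CCA sketch on synthetic data, not the authors’ code; the sampling rate, channel count, noise level, and frequencies are all assumptions.

```python
import numpy as np

def cca_max_corr(X, Y):
    """Largest canonical correlation between the columns of X and Y."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    # Singular values of Qx^T Qy are the canonical correlations
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_references(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine reference set at a flicker frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def classify_ssvep(eeg, fs, candidate_freqs):
    """Return the candidate frequency whose references best match the EEG."""
    scores = [cca_max_corr(eeg, ssvep_references(f, fs, eeg.shape[0]))
              for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

# Synthetic 3-channel EEG with an 11 Hz SSVEP (assumed parameters)
fs, n = 250, 1000
t = np.arange(n) / fs
rng = np.random.default_rng(1)
eeg = np.column_stack([np.sin(2 * np.pi * 11 * t + p) for p in (0.0, 0.4, 0.8)])
eeg = eeg + 1.5 * rng.standard_normal(eeg.shape)
led_freqs = [9, 11, 13, 15, 17]
print(classify_ssvep(eeg, fs, led_freqs))
```

The paper additionally reports a k-nearest-neighbors stage and an offline calibration step; the sketch above covers only the frequency-scoring core of the approach.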

