The Machine as Extension of the Body

Summary: A new study discusses the key challenges in fusing neuroscience and robotics.

Source: TUM

Combining neuroscience and robotics research has yielded impressive results in the rehabilitation of paraplegic patients. A research team led by Prof. Gordon Cheng from the Technical University of Munich (TUM) was able to show that exoskeleton training not only helped patients to walk, but also stimulated their healing process. With these findings in mind, Prof. Cheng wants to take the fusion of robotics and neuroscience to the next level.

Prof. Cheng, by training paraplegic patients with an exoskeleton in your remarkable study for the “Walk Again” project, you found that the patients regained a certain degree of control over the movement of their legs. Back then, this came as a complete surprise to you …

… and it somehow still is. Even though we had this breakthrough four years ago, it was only the beginning. To my regret, none of these patients is walking around freely and unaided yet. We have only touched the tip of the iceberg. To develop better medical devices, we need to dig deeper into understanding how the brain works and how to translate this into robotics.

In your paper published in Science Robotics this month, you and your colleague Prof. Nicolelis, a leading expert in neuroscience and in particular in the area of human-machine interfaces, argue that some key challenges in the fusion of neuroscience and robotics need to be overcome in order to take the next steps. One of them is to “close the loop between the brain and the machine” – what do you mean by that?

The idea behind this is that the coupling between the brain and the machine should work in such a way that the brain thinks of the machine as an extension of the body. Let’s take driving as an example. While driving a car, you don’t think about your movements, do you? But we still don’t know how this really works. My theory is that the brain somehow adapts to the car as if it were part of the body. With this general idea in mind, it would be great to have an exoskeleton that the brain would embrace in the same way.

How could this be achieved in practice?

The exoskeleton we have been using in our research so far is really just a big chunk of metal and thus rather cumbersome for the wearer. I want to develop a “soft” exoskeleton – something you can wear like a piece of clothing that can both sense the user’s movement intentions and provide instantaneous feedback. Integrating this with recent advances in brain-machine interfaces, which allow real-time measurement of brain responses, enables the seamless adaptation of such exoskeletons to the needs of individual users.

Given recent technological advances and a better understanding of how to decode the user’s momentary brain activity, the time is ripe to integrate them into more human-centered or, better, “brain-centered” solutions.
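To make the closed-loop idea concrete, here is a minimal, purely illustrative Python sketch of such a sense-decode-act-adapt cycle. Everything in it – the class, the signal source, the adaptation rule – is a hypothetical placeholder, not the system or the algorithms described in the study.

```python
# Illustrative sketch of a closed-loop "brain-centered" exoskeleton controller.
# All names, signals, and update rules are hypothetical placeholders.

import time


class BrainMachineLoop:
    """Repeatedly read brain activity, decode movement intent,
    drive the exoskeleton, and adapt the decoder from feedback."""

    def __init__(self, decoder_gain: float = 1.0, learning_rate: float = 0.01):
        self.decoder_gain = decoder_gain    # maps neural signal to intended velocity
        self.learning_rate = learning_rate  # how fast the decoder adapts per cycle

    def read_brain_signal(self) -> float:
        # Placeholder for a real-time brain-machine interface reading;
        # here we simply return a dummy constant.
        return 0.5

    def decode_intent(self, signal: float) -> float:
        # Translate the measured signal into a desired limb velocity.
        return self.decoder_gain * signal

    def actuate(self, velocity: float) -> float:
        # Command the (soft) exoskeleton and return the velocity it achieved.
        # A real system would also return tactile/proprioceptive feedback.
        return 0.9 * velocity  # assume slight under-actuation

    def adapt(self, intended: float, achieved: float) -> None:
        # Close the loop: nudge the decoder so that what the user intends
        # and what the exoskeleton does converge over time.
        error = intended - achieved
        self.decoder_gain += self.learning_rate * error

    def run(self, cycles: int = 100, period_s: float = 0.01) -> None:
        for _ in range(cycles):
            signal = self.read_brain_signal()
            intended = self.decode_intent(signal)
            achieved = self.actuate(intended)
            self.adapt(intended, achieved)
            time.sleep(period_s)


if __name__ == "__main__":
    BrainMachineLoop().run(cycles=10)
```

The point of the sketch is only the loop structure: sensing, decoding, actuation, and adaptation run continuously, so the device keeps adjusting to the individual user rather than executing a fixed program.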

What other pieces are still missing? You talked about providing a “more realistic functional model” for both disciplines.

We have to facilitate the transfer between the disciplines through new developments, for example robots that come closer to human behaviour and to the construction of the human body, and thus lower the threshold for using robots in neuroscience. This is why we need more realistic functional models, meaning robots that are able to mimic human characteristics. Let’s take the example of a humanoid robot actuated with artificial muscles.

Such a natural construction, mimicking muscles instead of traditional motorized actuation, would provide neuroscientists with a more realistic model for their studies. We see this as a win-win situation that facilitates better cooperation between neuroscience and robotics in the future.

Image caption: The coupling between the brain and the machine should work in such a way that the brain thinks of the machine as an extension of the body. The image is in the public domain.

You are not alone in the mission of overcoming these challenges. In your Elite Graduate Program in Neuroengineering, the first and only program of its kind in Germany to combine experimental and theoretical neuroscience with in-depth training in engineering, you are bringing together the best students in the field.

As described above, combining the two disciplines of robotics and neuroscience is a tough exercise, and that is one of the main reasons why I created this master’s program in Munich. To me, it is important to teach the students to think more broadly and across disciplines in order to find previously unimagined solutions. This is why lecturers from various fields, for example from hospitals or the sports department, teach our students. We need to create a new community and a new culture in the field of engineering. From my standpoint, education is the key factor.

About this neuroscience and robotics research news

Source: TUM
Contact: Christine Lehner – TUM
Image: The image is in the public domain

Original Research: The study will appear in Science Robotics
