Robots that Perceive the World Like Humans

Perceive first, act afterwards. The architecture of most of today’s robots is underpinned by this control strategy. The eSMCs project has set itself the aim of changing that paradigm and generating more dynamic computational models in which action is not a mere consequence of perception but an integral part of the perception process. The goal is to improve robot behavior by means of perception models closer to human ones, and philosophers at the University of the Basque Country (UPV/EHU) are contributing by applying those human models to the perception systems of robots.
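
To make the contrast concrete, here is a minimal Python sketch of the two strategies. It is not the project’s code; the toy world and every name in it are illustrative assumptions. The first controller builds a complete internal snapshot before acting; the second treats each movement as a probe that is itself part of perceiving.

    # Illustrative toy only: a 1-D world where higher sensor values mean
    # "closer to the target". Neither controller is taken from eSMCs.

    class ToyRobot:
        def __init__(self, world):
            self.world = world          # sensor value at each position
            self.pos = 0

        def read_sensor(self):
            return self.world[self.pos]

        def move(self, step):
            self.pos = max(0, min(len(self.world) - 1, self.pos + step))

    def sense_plan_act(robot):
        """Perceive first, act afterwards: snapshot, internal model, plan."""
        snapshot = list(robot.world)            # 1. perceive everything at once
        plan = snapshot.index(max(snapshot))    # 2. reason over the model
        robot.pos = plan                        # 3. execute the finished plan
        return robot.pos

    def active_perception(robot):
        """Action as part of perception: each step is a question to the world."""
        for _ in range(len(robot.world)):
            here = robot.read_sensor()
            robot.move(+1)                      # probe one step forward
            if robot.read_sensor() < here:      # the probe itself revealed the peak
                robot.move(-1)
                break
        return robot.pos

    world = [0, 1, 3, 7, 4, 2, 1]
    print(sense_plan_act(ToyRobot(world)))      # -> 3, from a finished plan
    print(active_perception(ToyRobot(world)))   # -> 3, discovered by probing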

“The way science understands the mind, when it comes to building a robot or looking at the brain, is that you take a photo, which is then processed as if the mind were a computer, and pattern recognition is carried out. There are various types of algorithms and techniques for identifying an object, scenes, etc. However, organic perception, that of human beings, is much more active. The eye, for example, carries out a whole host of saccadic movements (small, rapid ocular movements) that we do not see. Seeing is establishing and recognizing objects through this visual action, knowing how the relationship and sensation of my body changes with respect to movement,” explains Xabier Barandiaran, a PhD-holder in Philosophy and researcher at IAS-Research (UPV/EHU), which, under the leadership of Ikerbasque researcher Ezequiel di Paolo, is part of the European project eSMCs (Extending Sensorimotor Contingencies to Cognition).

Until now, the belief has been that sensations were processed, perception was created, and this in turn led to reasoning and action. As Barandiaran sees it, action is an integral part of perception: “Our basic idea is that when we perceive, what is there is active exploration, a particular co-ordination with the surroundings, like a kind of invisible dance that makes vision possible.”

The eSMCs project aims to apply this idea to the computer models used in robots, improve their behavior and thus understand the nature of the animal and human mind. For this purpose, the researchers are working on sensorimotor contingencies: regular relationships existing between actions and changes in the sensory variations associated with these actions.
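
In code, a sensorimotor contingency can be pictured as a lawful relation between a motor command and the sensory change it produces. The short Python sketch below is our own illustration, not the project’s model: shifting a one-dimensional “retina” one step to the right shifts the sensed patch one step to the left, and it is this regularity, rather than the raw values themselves, that carries the perceptual information.

    # Toy scene and "retina"; all values are invented for illustration.
    image = [0, 0, 5, 9, 5, 0, 0, 0]

    def sense(gaze, width=3):
        """Read the patch of the scene currently under the retina."""
        return image[gaze:gaze + width]

    gaze = 1
    before = sense(gaze)
    gaze += 1                        # action: a small "saccade" to the right
    after = sense(gaze)

    # The contingency: moving the gaze right shifts the sensed patch left.
    print(before, "->", after)       # [0, 5, 9] -> [5, 9, 5]
    assert before[1:] == after[:-1]  # the lawful action-sensation relation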

An example of this kind of contingency is when you drink water and speak at the same time, almost without realizing it. Interaction with the surroundings has taken place “without any need to internally represent that this is a glass and then compute needs and plan an action,” explains Barandiaran. “Seeing the glass draws one’s attention; it is coordinated with thirst, while the presence of the water itself on the table is enough for me to coordinate the visual-motor cycle that ends up with the glass at my lips.” The same thing happens in the robots of the eSMCs project: “they are moving the whole time, they don’t stop to think; they think about the act using the body and the surroundings,” he adds.

The researchers in the eSMCs project maintain that actions play a key role not only in perception, but also in the development of more complex cognitive capacities. That is why they believe that sensorimotor contingencies can be used to specify habits, intentions, tendencies and mental structures, thus providing the robot with a more complex, fluid behavior.

So one of the experiments involves a robot simulation (developed by Thomas Buhrmann, who is also a member of this team at the UPV/EHU) in which an agent has to discriminate between what we could call an acne pimple and a bite or lump on the skin. “The acne has a tip, the bite doesn’t. Just as people do, our agent stays with the tip and recognizes the acne, and when it goes on to touch the lump, it ignores it. What we are seeking to model and explain is that moment of perception that is built with the active exploration of the skin, when you feel ‘ah! I’ve found the acne pimple’ and you go on sliding your finger across it,” says Barandiaran. The model tries to identify what kind of relationship is established between the movement and sensation cycles and the neurodynamic patterns that are simulated in the robot’s “mini brain”.
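
The discrimination can be sketched in a few lines of Python. This is a hedged illustration under our own assumptions (the actual simulation uses a neurodynamic agent, not a fixed threshold; the skin profiles and the sharpness measure below are invented):

    def feel(profile, threshold=2.0):
        """Slide a 'finger' across a skin-height profile, probing point by point."""
        for i in range(1, len(profile) - 1):
            # A tip is a point much higher than both of its neighbours.
            sharpness = 2 * profile[i] - profile[i - 1] - profile[i + 1]
            if sharpness > threshold:
                return f"acne pimple at position {i}"   # stay with the tip
        return "bite or lump (no tip found)"            # keep sliding, ignore it

    pimple = [0, 0, 1, 5, 1, 0, 0]   # narrow peak: it has a tip
    lump   = [0, 1, 2, 3, 3, 2, 1]   # broad swelling: no tip

    print(feel(pimple))   # -> acne pimple at position 3
    print(feel(lump))     # -> bite or lump (no tip found)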

Another robot, Puppy, a robot dog built at the Artificial Intelligence Laboratory of the University of Zurich, is capable of adapting to and “feeling” the texture of the terrain on which it is moving (slippery, viscous, rough, etc.) by exploring the sensorimotor contingencies that arise while walking.
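
As a final hedged sketch (Puppy’s real controller is not described in this article; the slip and vibration figures below are invented), this kind of terrain “feeling” can be pictured as classifying the ground from what strides do rather than from what a camera sees:

    import random

    def stride_feedback(terrain):
        """Toy model of what one stride 'feels like' on each terrain."""
        if terrain == "slippery":
            return {"slip": random.uniform(0.6, 0.9), "vibration": 0.1}
        if terrain == "rough":
            return {"slip": random.uniform(0.0, 0.2),
                    "vibration": random.uniform(0.5, 0.9)}
        return {"slip": 0.05, "vibration": 0.1}    # firm ground

    def feel_terrain(terrain, strides=10):
        """Walk, accumulate action-sensation regularities, label the ground."""
        feedback = [stride_feedback(terrain) for _ in range(strides)]
        slip = sum(f["slip"] for f in feedback) / strides
        vibration = sum(f["vibration"] for f in feedback) / strides
        if slip > 0.4:
            return "feels slippery"
        if vibration > 0.4:
            return "feels rough"
        return "feels firm"

    for ground in ("slippery", "rough", "firm"):
        print(ground, "->", feel_terrain(ground))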

Notes about this robotics research

Contact: Komunikazio Bulegoa (Communications Office) – University of the Basque Country
Xabier E. Barandiaran – Xabier Barandiaran website
Source: University of the Basque Country news release
Image Source: The robot image shows Roboy, also developed by the Artificial Intelligence Laboratory at the Department of Informatics, University of Zurich.
Original Research: Research papers and descriptions of the robotics research experiments are available at these sites:
Extending Sensorimotor Contingencies to Cognition
IAS Research Centre for Life, Mind and Society
Xabier E. Barandiaran
