A paralyzed person is expected to deliver the ceremonial first kick at this year's World Cup, thanks to a new brain-controlled exoskeleton with a sense of touch designed by researchers at TUM.
People tend to find voice-based AI assistants such as Siri or Alexa more competent and emotionally engaging when they exhibit social cues.
A new computer system can recognize hand poses and track multiple people in real time. Researchers say that by detecting the nuances of non-verbal communication, robots will be better able to perceive what the humans around them are doing.
Combining deep learning algorithms with robotic engineering, researchers have developed a new robot that integrates vision and touch.
A robotic system that can detect the emotional state of young learners and improve the learning experience is one step closer to entering the classroom, researchers report.
People who were touched by conversational humanoid robots reported positive emotional states and were more likely to comply with requests made by the robot.