Unlocking the Language of Emotion Through Body Movement

Summary: Researchers unlocked the language of human emotion expressed through body movements. By employing a blend of computing, psychology, and the arts, they curated a human movement dataset to enhance AI’s understanding of emotions via body language.

This dataset, annotated using Laban Movement Analysis (LMA), can vastly improve human-machine interaction. It not only bridges communication gaps with assistive robots but also offers potential tools for psychiatric professionals.

Key Facts:

  1. The study introduced a paradigm for emotion understanding via motor element analysis.
  2. The team used 1,600 human video clips, annotating each with Laban Movement Analysis (LMA).
  3. This research, supported by the National Science Foundation and Amazon Research Awards, can bolster safety and improve communication between humans and robots.

Source: Penn State

An individual may bring their hands to their face when feeling sad or jump into the air when feeling happy. Human body movements convey emotions and play a crucial role in everyday communication, according to a team led by Penn State researchers.

Combining computing, psychology and performing arts, the researchers developed an annotated human movement dataset that may improve the ability of artificial intelligence to recognize the emotions expressed through body language.


The work — led by James Wang, distinguished professor in the College of Information Sciences and Technology (IST), and carried out primarily by Chenyan Wu, a graduating doctoral student in Wang’s group — was published on Oct. 13 in the print edition of Patterns and featured on the journal’s cover.

“People often move using specific motor patterns to convey emotions and those body movements carry important information about a person’s emotions or mental state,” Wang said. “By describing specific movements common to humans using their foundational patterns, known as motor elements, we can establish the relationship between these motor elements and bodily expressed emotion.”

According to Wang, augmenting machines’ understanding of bodily expressed emotion may help enhance communication between assistive robots and children or elderly users; provide psychiatric professionals with quantitative diagnostic and prognostic assistance; and bolster safety by preventing mishaps in human-machine interactions.

“In this work, we introduced a novel paradigm for bodily expressed emotion understanding that incorporates motor element analysis,” Wang said. “Our approach leverages deep neural networks — a type of artificial intelligence — to recognize motor elements, which are subsequently used as intermediate features for emotion recognition.”

The team created a dataset of the way body movements indicate emotion — body motor elements — using 1,600 human video clips. Each video clip was annotated using Laban Movement Analysis (LMA), a method and language for describing, visualizing, interpreting and documenting human movement.
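To make the annotation scheme concrete, here is a minimal sketch of what one labeled clip in such a dataset could look like. The element names and field layout are illustrative assumptions, not the paper's actual schema; LMA organizes movement into categories such as Body, Effort, Shape, and Space.

```python
# Hypothetical record for one annotated video clip (schema and element
# names are illustrative, not taken from the published dataset).
clip_annotation = {
    "clip_id": "clip_0001",
    "emotion": "happy",                  # bodily expressed emotion label
    "lma_elements": {                    # binary presence of motor elements
        "jump": 1,
        "spreading": 1,
        "head_drop": 0,
        "arms_to_upper_body": 0,
    },
}

# Collect the motor elements marked present in this clip.
present = sorted(k for k, v in clip_annotation["lma_elements"].items() if v)
print(present)  # the elements an annotator observed in the clip
```

A record like this lets a model learn the intermediate LMA labels and the final emotion label from the same clip, which is the pairing the dataset is built around.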

Wu then designed a dual-branch, dual-task movement analysis network capable of using the labeled dataset to produce predictions for both bodily expressed emotion and LMA labels for new images or videos.
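The idea of a dual-branch, dual-task network — a shared trunk feeding two output heads, one for LMA motor elements and one for emotion — can be sketched as a toy forward pass. All dimensions, weights, and activation choices below are assumptions for illustration; the published model is a deep video network, not this two-layer example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a pose-feature vector per clip, a shared hidden layer,
# multi-label LMA element outputs, and categorical emotion outputs.
D_IN, D_HIDDEN, N_LMA, N_EMOTION = 64, 32, 10, 6

W_shared = rng.normal(scale=0.1, size=(D_IN, D_HIDDEN))    # shared trunk
W_lma = rng.normal(scale=0.1, size=(D_HIDDEN, N_LMA))      # LMA branch
W_emotion = rng.normal(scale=0.1, size=(D_HIDDEN, N_EMOTION))  # emotion branch

def forward(x):
    """One forward pass producing both LMA and emotion predictions."""
    h = np.maximum(0.0, x @ W_shared)                # shared movement features
    lma_scores = 1.0 / (1.0 + np.exp(-(h @ W_lma)))  # independent element probs
    logits = h @ W_emotion
    logits -= logits.max()                           # numerical stability
    emotion_probs = np.exp(logits) / np.exp(logits).sum()
    return lma_scores, emotion_probs

x = rng.normal(size=D_IN)   # stand-in for features extracted from one clip
lma, emo = forward(x)
```

Training both heads jointly means the easier-to-learn LMA labels can shape the shared features that the emotion head also relies on — the relationship Wu describes in the next quote.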

“Emotion and LMA element labels are related, and the LMA labels are easier for deep neural networks to learn,” Wu said.

According to Wang, LMA makes it possible to study motor elements and emotions together while simultaneously creating a “high-precision” dataset that supports effective learning of human movement and emotional expression.

“Incorporating LMA features has effectively enhanced body-expressed emotion understanding,” Wang said. “Extensive experiments using real-world video data revealed that our approach significantly outperformed baselines that considered only rudimentary body movement, showing promise for further advancements in the future.”

Other co-authors are Dolzodmaa Davaasuren, doctoral candidate in the College of IST; Tal Shafir, Emili Sagol Creative Arts Therapies Research Center, University of Haifa, Israel; and Rachelle Tsachor, School of Theatre and Music, University of Illinois at Chicago.  

Funding: The National Science Foundation and Amazon Research Awards supported this work.

About this neuroscience and emotion research news

Author: Adrienne Berard
Source: Penn State
Contact: Adrienne Berard – Penn State
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Bodily expressed emotion understanding through integrating Laban movement analysis” by James Wang et al. Patterns


Abstract

Bodily expressed emotion understanding through integrating Laban movement analysis

Body movements carry important information about a person’s emotions or mental state and are essential in everyday communication.

Enhancing machines’ ability to understand emotions expressed through body language can improve communication between assistive robots and children or elderly users, provide psychiatric professionals with quantitative diagnostic and prognostic assistance, and bolster safety by preventing mishaps in human-machine interactions.

This study develops a high-quality human motor element dataset based on the Laban movement analysis movement coding system and utilizes that to jointly learn about motor elements and emotions.

Our long-term ambition is to integrate knowledge from computing, psychology, and performing arts to enable automated understanding and analysis of emotion and mental state through body language.

This work serves as a launchpad for further research into recognizing emotions through the analysis of human movement.
