
Teen Emotions Deciphered by AI and Headcams

Summary: Researchers developed a method using wearable headcams and AI to analyze teenagers’ facial expressions, revealing subtle emotional nuances. This technology has shown potential in identifying emotions such as worry and happiness, even when they are masked.

The findings suggest that this approach could significantly enhance understanding and communication between teens and their parents, potentially serving as a valuable tool in therapy sessions to address mental health issues. By recording real interactions, the project has captured authentic emotional expressions during activities like card games, providing a new way to support positive family dynamics.

Key Facts:

  1. The study involved 110 families and focused on capturing genuine emotional expressions from teens and parents during everyday interactions using headcams.
  2. AI software was used to decode and quantify emotional nuances, predicting human judgments and providing detailed insights into mixed emotional states.
  3. This technology could soon be integrated into family therapy sessions, offering a novel approach to improving parent-teen relationships and addressing the rising mental health concerns among adolescents.

Source: King’s College London

As part of a joint study between King’s College London and Manchester Met, wearable headcams worn during real interactions, combined with face-decoding technology, were used to read teens’ facial expressions, potentially uncovering hidden feelings and offering insights into relationships.

Recordings of the adolescents’ facial expressions were run through new AI software to detect and understand intricate details of emotions across minute time scales.

The project is outlined in a paper published in the journal Frontiers in Child and Adolescent Psychiatry and shows the technology is helping psychologists work with teens and their parents to foster better mutual understanding and communication.

For example, the algorithms can pinpoint that someone is 20% worried or 5% happy and could therefore identify when teenagers are masking their true feelings.
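To make such fractional readings concrete, here is a minimal sketch in Python. It assumes a hypothetical per-frame emotion classifier (simulated below with random probabilities, not the study’s actual software) and smooths its output over a one-second window to produce readings like “20% worried”:

```python
import numpy as np

# Hypothetical per-frame emotion probabilities (rows: video frames at 25 fps,
# columns: emotion classes). A real pipeline would get these from a
# facial-expression model; they are simulated here purely for illustration.
rng = np.random.default_rng(0)
EMOTIONS = ["happy", "worried", "neutral"]
frame_probs = rng.dirichlet(alpha=[1.0, 2.0, 5.0], size=250)  # ~10 s of video

def rolling_emotion_profile(probs: np.ndarray, window: int = 25) -> np.ndarray:
    """Average per-frame class probabilities over a sliding window
    (25 frames = 1 s at 25 fps), giving smoothed fractional readings."""
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(probs[:, i], kernel, mode="valid")
         for i in range(probs.shape[1])]
    )

profile = rolling_emotion_profile(frame_probs)
for name, value in zip(EMOTIONS, profile[100]):
    print(f"{name}: {value:.0%}")  # e.g. "worried: 24%"
```

Smoothing over short windows is one plausible way to surface fleeting or mixed expressions that a single frame, or a human observer, might miss.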

With the use of artificial intelligence, human judgements can be predicted based on the software’s readings—meaning computers can provide unique information on the timing and mixed presentation of emotions.
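As an illustration of how human judgements might be predicted from automated readings, the sketch below fits a simple regression from facial-coding features to human-coded ratings. All names and data here are hypothetical; the article does not describe the study’s actual features or models:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical training data: one row per video clip, with automated
# facial-coding features (e.g., average action-unit intensities) as inputs
# and human-coded emotion ratings as the target. Shapes are illustrative.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 17))                     # 200 clips x 17 features
true_w = rng.normal(size=17)
y = X @ true_w + rng.normal(scale=0.5, size=200)   # simulated human ratings

# Fit a regularised linear model mapping automated readings to human
# judgements, and estimate how well it generalises with cross-validation.
model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```

A linear baseline like this is a common starting point for validating automated readings against human raters before moving to more expressive models.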

The protocols could soon be used in therapy sessions, helping to reduce mental health problems by promoting understanding and positive parent-adolescent interactions.

The project involved 110 families and explored how well computers could capture authentic human emotions in youths aged 14–16 and their parents during everyday interactions. The participants’ expressions were captured on footage recorded from headcams worn during card game sessions designed to elicit emotional responses.

The protocol for using the cams at home, and the process of “coding” teen and parent emotions, was co-designed with young people and communities through interactive workshops using theater, immersive film, and a mobile research van.

Mental health is a rapidly growing issue for young people, with the proportion identified as having a mental health problem rising from 12% in 2017 to 20%, or one in five, in 2023.

“With the current mental health crisis in adolescence, it is crucial we understand potential sources of resilience for young people. Human interaction is highly complex and multi-faceted. Our facial expressions serve as critical non-verbal social cues, communicating our emotions and intentions and supporting our social interactions.

“But a huge amount of individual variability exists in our expressions, much of which is outside our conscious control. Understanding how these variations are understood and responded to by parents may provide critical information to support relationships,” says Dr. Nicky Wright, psychology lecturer at Manchester Met and lead researcher.

“One of the most exciting things about this project is the potential to use headcam footage in family therapy sessions. For example, families could be asked to film themselves doing a task, either at home or in the session.

“The therapist could then review the task with the family, picking out positive moments in the interaction,” says Dr. Tom Jewell, lecturer in mental health nursing at King’s College London and senior author of the paper.

The research team plan to explore the use of automated facial coding as a tool for families, and for those who support them, to improve communication and relationships between parents and teenagers, and to better understand how these interactions relate to mood and mental health disorders.

About this AI, neurotech, and emotion research news

Author: Nicky Wright
Source: King’s College London
Contact: Nicky Wright – King’s College London
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Through each other’s eyes: initial results and protocol for the co-design of an observational measure of adolescent-parent interaction using first-person perspective” by Nicky Wright et al. Frontiers in Child and Adolescent Psychiatry


Abstract

Through each other’s eyes: initial results and protocol for the co-design of an observational measure of adolescent-parent interaction using first-person perspective

Background: Current observational methods to understand adolescent-parent interaction are limited in terms of ecological and content validity.

We outline initial results and a protocol for future work from a programme of work to: (1) establish a new method for data capture of adolescent-parent interaction at home using wearable cameras; and (2) develop a new, relevant, and comprehensive observational micro-coding scheme.

In Part 1, we report our completed preliminary work, comprising an initial scoping review and public engagement work. In Part 2, we present a protocol for the development of the new measure.

Methods: Part 1—We searched PubMed for existing observational measures of adolescent-parent interaction for the scoping review. We also undertook public engagement work utilising a mobile research van, taken to multiple locations around Bristol, UK, to engage with a variety of populations through interactive methods.

Part 2—Our protocol describes plans for: (1) A systematic review of the psychometric properties of observational measures of adolescent-parent interaction; (2) Focussed public engagement workshops; (3) Harmonisation of information from existing coding schemes and literature with information from public engagement with adolescents and parents; (4) A pilot study to assess the acceptability and feasibility of the method; (5) Development of a coding scheme in consultation with expert and lay panels, and through real-life application to recorded videos from a pilot sample.

Results: Scoping review: we identified 21 adolescent-parent observational schemes, of which eight used micro-coding and 13 used global coding schemes. The majority of micro-coding schemes were not developed specifically for adolescents. Most studies used conflict or problem-solving tasks, which may not adequately capture positive adolescent-parent interactions.

The mobile van event gathered views from 234 young people and/or parents. Families were positive about taking part in research using headcams. “Trust” and “understanding” were most frequently reported as important adolescent-parent relationship constructs.

Conclusions: This work represents the first attempt to truly co-design a method to assess parenting in adolescence. We hope to develop an observational measure using novel technological methods that can be used across a range of research and therapeutic settings.
