AI Reveals How Infants Learn to Interact with Environment

Summary: Researchers used artificial intelligence (AI) to study how infants transition from random movements to purposeful actions. Tracking infant movements in a baby-mobile experiment, the team showed that AI models such as 2D-CapsNet could accurately classify these movements and identify significant changes in foot movements as infants learned to interact with their environment.

The study revealed that infants explore more after losing control of the mobile, suggesting a desire to reconnect with their surroundings. This research highlights the potential of AI to analyze early infant behavior and improve understanding of motor development and learning.

Key Facts:

  • The best-performing AI model classified infant foot movements with 86% accuracy, the highest of any body part.
  • Infants explored more after losing control of the mobile, seeking reconnection.
  • AI offers new insights into early motor development and infant learning.

Source: FAU

Recent advances in computing and artificial intelligence, along with insights into infant learning, suggest that machine and deep learning techniques can help us study how infants transition from random exploratory movements to purposeful actions.

Most research has focused on babies’ spontaneous movements, distinguishing between fidgety and non-fidgety behaviors.

While early movements may seem chaotic, they reveal meaningful patterns as infants interact with their environment. However, we still lack understanding of how infants intentionally engage with their surroundings and the principles guiding their goal-directed actions.

Looking at how AI classification accuracy changes for each infant gives researchers a new way to understand when and how they start to engage with the world. Credit: Neuroscience News

Using the baby-mobile experiment, a paradigm employed in developmental research since the late 1960s, Florida Atlantic University researchers and collaborators investigated how infants begin to act purposefully.

The baby-mobile experiment uses a colorful mobile gently tethered to an infant’s foot. When the baby kicks, the mobile moves, linking their actions to what they see. This setup helps researchers understand how infants control their movements and discover their ability to influence their surroundings.

In this new work, researchers tested whether AI tools could pick up on complex changes in patterns of infant movement. Infant movements, tracked using a Vicon 3D motion capture system, were classified into different types – from spontaneous actions to reactions when the mobile moves.

By applying various AI techniques, researchers examined which methods best captured the nuances of infant behavior across different situations and how movements evolved over time.

Results of the study, published in Scientific Reports, underscore that AI is a valuable tool for understanding early infant development and interaction. Both machine and deep learning methods accurately classified five-second clips of 3D infant movements as belonging to different stages of the experiment.
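In practice, this kind of pipeline amounts to slicing each joint's motion-capture trajectory into short clips, labeling every clip with the experimental stage it came from, and asking a classifier to recover those labels. The sketch below illustrates the idea with scikit-learn's k-nearest-neighbors classifier, one of the simpler methods the team tested. The 100 Hz sampling rate, the foot_xyz_by_stage variable, and the flattened-coordinate features are illustrative assumptions, not details from the study, whose actual descriptors are described in the abstract below.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

FS = 100          # assumed Vicon sampling rate (Hz); not stated in the article
CLIP_SECONDS = 5  # the five-second clips described above
CLIP_LEN = FS * CLIP_SECONDS

def make_clips(trajectory, stage_label):
    """Slice one joint's (T, 3) trajectory into non-overlapping 5 s clips."""
    n = trajectory.shape[0] // CLIP_LEN
    clips = trajectory[: n * CLIP_LEN].reshape(n, CLIP_LEN, 3)
    return clips, np.full(n, stage_label)

# foot_xyz_by_stage: hypothetical dict mapping stage id -> (T, 3) position array
X, y = [], []
for stage, traj in foot_xyz_by_stage.items():
    clips, labels = make_clips(traj, stage)
    X.append(clips.reshape(len(clips), -1))  # flatten each clip into a feature vector
    y.append(labels)
X, y = np.concatenate(X), np.concatenate(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("stage classification accuracy:", clf.score(X_te, y_te))
```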

Among these methods, the deep learning model, 2D-CapsNet, performed the best. Importantly, for all the methods tested, foot movements were classified with the highest accuracy, indicating that, compared with other parts of the body, the movement patterns of the feet changed most dramatically across the stages of the experiment.

“This finding is significant because the AI systems were not told anything about the experiment or which part of the infant’s body was connected to the mobile.

“What this shows is that the feet – as end effectors – are the most affected by the interaction with the mobile,” said Scott Kelso, Ph.D., co-author and Glenwood and Martha Creech Eminent Scholar in Science at the Center for Complex Systems and Brain Sciences within FAU’s Charles E. Schmidt College of Science. 

“In other words, the way infants connect with their environment has the biggest impact at the points of contact with the world. Here, this was ‘feet first.’”

The 2D-CapsNet model achieved an accuracy of 86% when analyzing foot movements and was able to capture detailed relationships between different body parts during movement. Across all methods tested, foot movements consistently showed the highest accuracy rates – about 20% higher than movements of the hands, knees, or the whole body.

“We found that infants explored more after being disconnected from the mobile than they did before they had the chance to control it. It seems that losing the ability to control the mobile made them more eager to interact with the world to find a means of reconnecting,” said Aliza Sloan, Ph.D., co-author and a postdoctoral research scientist in FAU’s Center for Complex Systems and Brain Sciences.

“However, some infants showed movement patterns during this disconnected phase that contained hints of their earlier interactions with the mobile. This suggests that only certain infants understood their relationship with the mobile well enough to maintain those movement patterns, expecting that they would still produce a response from the mobile even after being disconnected.”

The researchers say that if the accuracy of infants’ movements remains high during the disconnection, it might indicate that the infants learned something during their earlier interactions. However, different types of movements might mean different things in terms of what the infants discovered.

“It’s important to note that studying infants is more challenging than studying adults because infants can’t communicate verbally,” said Nancy Aaron Jones, Ph.D., co-author, professor in FAU’s Department of Psychology, director of the FAU WAVES Lab, and a member of the Center for Complex Systems and Brain Sciences within the Charles E. Schmidt College of Science.

“Adults can follow instructions and explain their actions, while infants cannot. That’s where AI can help. AI can help researchers analyze subtle changes in infant movements, and even their stillness, to give us insights into how they think and learn, even before they can speak. Their movements can also help us make sense of the vast degree of individual variation that occurs as infants develop.”

Looking at how AI classification accuracy changes for each infant gives researchers a new way to understand when and how they start to engage with the world.

“While past AI methods mainly focused on classifying spontaneous movements linked to clinical outcomes, combining theory-based experiments with AI will help us create better assessments of infant behavior that are relevant to their specific contexts,” said Kelso. “This can improve how we identify risks, diagnose and treat disorders.”

Study co-authors are first author Massoud Khodadadzadeh, Ph.D., formerly at Ulster University in Derry, Northern Ireland, and now at the University of Bedfordshire, United Kingdom; and Damien Coyle, Ph.D., at the University of Bath, United Kingdom.

Funding: The research was supported by Tier 2 High Performance Computing resources provided by the Northern Ireland High-Performance Computing facility, funded by the U.K. Engineering and Physical Sciences Research Council; a U.K. Research and Innovation Turing AI Fellowship (2021-2025), funded by the Engineering and Physical Sciences Research Council; a Vice Chancellor’s Research Scholarship; the Institute for Research in Applicable Computing at the University of Bedfordshire; the FAU Foundation (Eminent Scholar in Science); and the U.S. National Institutes of Health.

About this AI and neurodevelopment research news

Author: Gisele Galoustian
Source: FAU
Contact: Gisele Galoustian – FAU
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Artificial intelligence detects awareness of functional relation with the environment in 3 month old babies” by Scott Kelso et al. Scientific Reports


Abstract

Artificial intelligence detects awareness of functional relation with the environment in 3 month old babies

A recent experiment probed how purposeful action emerges in early life by manipulating infants’ functional connection to an object in the environment (i.e., tethering an infant’s foot to a colorful mobile).

Vicon motion capture data from multiple infant joints were used here to create Histograms of Joint Displacements (HJDs) to generate pose-based descriptors for 3D infant spatial trajectories.
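The abstract does not spell out the HJD construction, but the idea can be sketched roughly as follows: for each joint, take the frame-to-frame displacement magnitudes and bin them into a fixed-length histogram, producing a pose-based descriptor for each movement snippet. In this minimal sketch, the bin count and displacement range are illustrative choices, not the paper's values.

```python
import numpy as np

def hjd_descriptor(clip_xyz, n_bins=20, max_disp=50.0):
    """Histogram of Joint Displacements (HJD) for one movement clip.

    clip_xyz: (frames, joints, 3) array of 3D joint positions.
    n_bins and max_disp are illustrative assumptions, not the paper's values.
    """
    # frame-to-frame displacement magnitude for every joint
    disp = np.linalg.norm(np.diff(clip_xyz, axis=0), axis=-1)  # (frames-1, joints)
    hists = [
        np.histogram(disp[:, j], bins=n_bins, range=(0.0, max_disp), density=True)[0]
        for j in range(disp.shape[1])
    ]
    return np.concatenate(hists)  # one fixed-length descriptor per clip
```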

Using HJDs as inputs, machine and deep learning systems were tasked with classifying the experimental state from which snippets of movement data were sampled. The architectures tested included k-Nearest Neighbour (kNN), Linear Discriminant Analysis (LDA), Fully connected network (FCNet), 1D-Convolutional Neural Network (1D-Conv), 1D-Capsule Network (1D-CapsNet), 2D-Conv and 2D-CapsNet.
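To give a sense of what one of the listed architectures might look like, here is a minimal 1D-convolutional stage classifier in PyTorch. The layer sizes, kernel widths, and number of stages are assumptions made for illustration; the paper's actual 1D-Conv configuration is not given here.

```python
import torch
import torch.nn as nn

class Conv1DStageClassifier(nn.Module):
    """Illustrative 1D-CNN in the spirit of the paper's 1D-Conv baseline.

    All layer sizes are assumptions; the published architecture may differ.
    Input shape: (batch, channels, time), with joint coordinates as channels.
    """
    def __init__(self, in_channels: int, n_stages: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time to a single feature vector
        )
        self.head = nn.Linear(64, n_stages)

    def forward(self, x):
        return self.head(self.features(x).squeeze(-1))

# Example: 3 coordinate channels for one joint, 4 hypothetical experimental stages,
# and a batch of 5-second clips at an assumed 100 Hz sampling rate.
model = Conv1DStageClassifier(in_channels=3, n_stages=4)
logits = model(torch.randn(8, 3, 500))
```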

Sliding window scenarios were used for temporal analysis to search for topological changes in infant movement related to functional context. kNN and LDA achieved higher classification accuracy with single joint features, while deep learning approaches, particularly 2D-CapsNet, achieved higher accuracy on full-body features.
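A sliding-window analysis of this kind can be sketched as follows: overlapping windows are cut from a continuous recording, each window is classified, and the sequence of predictions is inspected for changes over time. The window length and hop size below are free parameters; the values used in the study are not given in this summary.

```python
import numpy as np

def sliding_windows(signal, win_len, hop):
    """Yield (start_frame, window) pairs of overlapping windows from a (T, ...) signal."""
    for start in range(0, signal.shape[0] - win_len + 1, hop):
        yield start, signal[start : start + win_len]

# Hypothetical use with a fitted classifier clf and the hjd_descriptor sketch above,
# applied to session_xyz, a (T, joints, 3) recording of one experimental session:
# times, preds = zip(*[
#     (start, clf.predict(hjd_descriptor(win)[None, :])[0])
#     for start, win in sliding_windows(session_xyz, win_len=500, hop=100)
# ])
```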

For each AI architecture tested, measures of foot activity displayed the most distinct and coherent pattern alterations across different experimental stages (reflected in the highest classification accuracy rate), indicating that interaction with the world impacts the infant behaviour most at the site of organism~world connection.
