AI “Mind Control” Can Stop Animal Behaviors in a Split Second

Summary: Researchers developed an advanced AI system named YORU that can identify specific animal behaviors with over 90% accuracy across multiple species. By combining this high-speed recognition with optogenetics, the team demonstrated the ability to shut down specific brain circuits in real time using targeted light.

This breakthrough allowed scientists to silence a fruit fly’s “love song” mid-performance, proving that the system can isolate and control an individual’s neural activity within a social group. Ultimately, the tool is designed to help researchers worldwide map how specific brain cells drive complex social interactions in ants, mice, and fish.

Key Facts

  • Superior Speed and Accuracy: YORU recognizes entire behaviors from single video frames rather than tracking body parts, making it 30% faster than previous tools and maintaining 90–98% accuracy even when animals overlap.
  • Precision Optogenetic Targeting: Unlike older methods that illuminated entire chambers, this system uses an AI-driven light source to target individual animals, allowing for the manipulation of one subject’s neurons without affecting its neighbors.
  • Cross-Species Versatility: The system is “plug-and-play” across diverse species, having already successfully analyzed food-sharing in ants, zebrafish orientation, and mouse grooming with minimal training data required.

Source: Nagoya University

A male fruit fly in a laboratory chamber extends his wings and vibrates them to produce his species’ version of a love song. A female fly stays nearby listening. Suddenly, a green light flashes across the chamber for a fraction of a second. The male’s song cuts off mid-note and his wings fold. The female, not impressed by the interrupted serenade, walks away. The culprit? An AI system that watched the male begin his courtship dance and shut down his song-producing brain cells.

Developed by scientists at Nagoya University and their collaborators from Osaka University and Tohoku University, the AI can watch and recognize animal behaviors and control the specific brain circuits that drive them.

[Image: two flies under green light]
Precision Optogenetics: By combining high-speed behavioral recognition with light-sensitive proteins, researchers can now isolate and silence the neurons of a single individual within a group without disturbing its neighbors. Credit: Neuroscience News

Published in Science Advances, the study presents an advanced AI system that can identify which animal performs a behavior in a group and selectively target only that animal’s brain cells during social interactions. 

YORU (Your Optimal Recognition Utility) recognizes different behaviors across species with over 90% accuracy, including food-sharing between ants, social orientation in zebrafish, and grooming in mice. The real breakthrough, however, came with fruit flies: the research team combined YORU with brain-control technology to shut off song-producing neurons during courtship, which reduced male mating success.

Traditional behavior analysis tracks individual body parts frame by frame, similar to motion-capture technology in video games. This approach struggles when multiple animals interact or overlap. Additionally, scientists needed faster tools for real-time experiments where split-second timing is critical.

“Instead of tracking body points over time, YORU recognizes entire behaviors from their appearance in a single video frame. It spotted behaviors in flies, ants, and zebrafish with 90–98% accuracy and ran 30% faster than competing tools,” said Hayato Yamanouchi, co-first author from Nagoya University’s Graduate School of Science.
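To make the contrast concrete, here is a minimal Python sketch of the object-based idea. The names and labels are hypothetical, not YORU's actual code: the point is that instead of returning body-part coordinates to be tracked over time, the detector returns the behavior itself as a labeled object in each frame.

```python
# Illustrative sketch only -- not YORU's published API. Names and labels
# are hypothetical. The idea: a whole posture is one detectable object,
# so a single frame is enough to classify the behavior.
from dataclasses import dataclass

@dataclass
class BehaviorObject:
    label: str         # the behavior itself, e.g. "wing_extension"
    box: tuple         # (x, y, width, height) bounding box in camera pixels
    confidence: float  # detector confidence score, 0 to 1

def detect_behaviors(frame) -> list[BehaviorObject]:
    """Stand-in for a trained object detector (e.g. a YOLO-style network).

    A pose-estimation pipeline would return nose/wing/tail keypoints and
    infer the behavior from their motion across many frames; an
    object-based detector labels the behavior directly from one frame.
    """
    # Dummy output for illustration; a real detector runs inference here.
    return [BehaviorObject("wing_extension", (120, 80, 40, 30), 0.96)]

frame = None  # placeholder for one camera frame (e.g. a NumPy image array)
for det in detect_behaviors(frame):
    print(f"{det.label} at {det.box} (confidence {det.confidence:.2f})")
```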

Senior author Azusa Kamikouchi explained that the real breakthrough comes from combining YORU with optogenetics, a technique that uses light to control genetically engineered neurons. “We can silence fly courtship neurons the instant YORU detects wing extension. In a separate experiment, we used targeted light that followed individual flies and blocked just one fly’s hearing neurons while others moved freely nearby.”

This individual-focused control solves a major challenge: previous methods could only illuminate entire chambers, which affected all animals at the same time and made it impossible to study an individual’s role during social interactions. 
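One way the targeted light can follow a single individual: a minimal sketch, under assumed names (not the published implementation), of a common approach in which the light source, such as a projector or a galvo-steered laser, is calibrated against the camera once, so a detected bounding-box center in camera pixels can be mapped to the coordinates the light should illuminate.

```python
import numpy as np

# Assumed 2x3 affine calibration mapping camera pixels to the light
# source's coordinate frame (projector pixels or galvo mirror angles).
# In practice it would be estimated once from known calibration points.
CAMERA_TO_LIGHT = np.array([[0.5, 0.0, 10.0],
                            [0.0, 0.5, -5.0]])

def aim_light_at(box):
    """Convert a detected bounding box (x, y, w, h) in camera pixels
    into the position the light source should illuminate."""
    x, y, w, h = box
    center = np.array([x + w / 2, y + h / 2, 1.0])  # homogeneous coordinates
    return CAMERA_TO_LIGHT @ center

# Only the animal performing the behavior gets lit; neighbors stay dark.
target_box = (120, 80, 40, 30)  # e.g. the singing male's bounding box
print("steer light to:", aim_light_at(target_box))
```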
 
How the brain control technology works

Step 1: Genetic engineering  
The scientists genetically modify the animals so that special light-sensitive proteins (called “opsins”) are expressed in specific neurons in their brains. Depending on the type of opsin, these proteins can turn neurons on or off.

Step 2: Detection and response 
  • YORU captures the animal’s behavior in real time with a camera
  • When YORU’s AI detects the target behavior, it instantly sends an electrical signal to a light source
  • The light automatically turns on and shines on the target animal (see the sketch after Step 3)

Step 3: Light controls the brain 
  • The light hits the target animal and reaches the genetically modified neurons
  • The light-sensitive proteins respond by opening ion channels in the membranes of the target neurons
  • This blocks or activates those specific neurons, changing the animal’s brain activity
  • The behavior changes as a result
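Here is a minimal sketch of one pass through the Step 2–3 loop. This is assumed, simplified code, not the published implementation: the detector from the earlier sketch is stubbed, and the hardware trigger is replaced with a print where a real rig would raise a TTL pulse on a digital output line.

```python
import time

def detect_behaviors(frame):
    """Stand-in for the single-frame behavior detector (see earlier sketch)."""
    return [("wing_extension", 0.96)]  # (behavior label, confidence)

def trigger_light(duration_s):
    """Stub for the hardware trigger. A real rig would raise a digital
    output (TTL pulse) here to switch the LED or laser on briefly."""
    print(f"LIGHT ON for {duration_s * 1000:.0f} ms")

def closed_loop_step(frame, target="wing_extension", threshold=0.9):
    """One iteration: detect, then fire the light if the target behavior
    is seen. The loop latency sets how 'real time' the feedback is."""
    t0 = time.perf_counter()
    for label, confidence in detect_behaviors(frame):
        if label == target and confidence >= threshold:
            # Light the opsin-expressing neurons the moment the behavior starts.
            trigger_light(duration_s=0.1)
    return (time.perf_counter() - t0) * 1000  # elapsed milliseconds

# In an experiment this would run on every frame the camera delivers.
print(f"loop latency: {closed_loop_step(frame=None):.2f} ms")
```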
 
YORU works across species, can be trained to recognize new behaviors with minimal training data, and requires no programming skills to use. The Nagoya team made YORU available online for scientists worldwide studying how brains control social interactions. 

Key Questions Answered:

Q: How does YORU differ from traditional animal tracking software?

A: Traditional software typically uses “body part tracking,” where the AI identifies specific points (like a nose, tail, or wing) and tracks their movement frame-by-frame. This often fails when animals huddle together or overlap. YORU treats the animal’s entire silhouette as a “behavior object.” By recognizing the overall shape of a specific posture in a single frame, it is 30% faster and remains accurate even in crowded social groups.

Q: Why is “individual-focused control” considered such a breakthrough?

A: In the past, if scientists wanted to use light to trigger neurons (optogenetics), they had to flash the entire laboratory chamber. This meant every animal in the group was affected simultaneously, making it impossible to see how one specific individual’s actions influenced the rest of the group. YORU’s speed allows it to aim a spotlight at a single fly or ant the exact millisecond it starts a behavior, leaving its neighbors completely undisturbed.

Q: Can this technology be used on animals other than fruit flies?

A: Yes, one of YORU’s strongest features is its cross-species versatility. The researchers proved it works with 90–98% accuracy across vastly different body types, including:
  • Ants: Tracking food-sharing interactions.
  • Zebrafish: Monitoring social orientation and swimming patterns.
  • Mice: Identifying specific grooming habits.
Because it requires no programming skills and very little training data, it is designed to be a universal tool for biologists worldwide.

Editorial Notes:

  • This article was edited by a Neuroscience News editor.
  • Journal paper reviewed in full.
  • Additional context added by our staff.

About this AI and neuroscience research news

Author: Merle Naidoo
Source: Nagoya University
Contact: Merle Naidoo – Nagoya University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“YORU: animal behavior detection with object-based approach for real-time closed-loop feedback” by Azusa Kamikouchi et al. Science Advances


Abstract

YORU: animal behavior detection with object-based approach for real-time closed-loop feedback

The advent of deep learning methodologies for animal behavior analysis has revolutionized neuroethology studies. However, the analysis of social behaviors, characterized by dynamic interactions among multiple individuals, continues to represent a major challenge.

In this study, we present “YORU” (your optimal recognition utility), a behavior detection approach leveraging an object detection deep learning algorithm. Unlike conventional approaches, YORU directly identifies behaviors as “behavior objects” based on the animal’s shape, enabling robust and accurate detection.

YORU successfully classified several types of social behaviors in species ranging from vertebrates to insects. Furthermore, YORU enables real-time behavior analysis and closed-loop feedback.

In addition, we achieved real-time delivery of photostimulation feedback to specific individuals during social behaviors, even when multiple individuals are close together.

This system overcomes the challenges posed by conventional pose estimation methods and presents an alternative approach for behavioral analysis.
