Summary: A new AI algorithm can independently discover and categorize an animal’s behavior by analyzing patterns of body movements.
Source: Carnegie Mellon University
To Eric Yttri, assistant professor of biological sciences and Neuroscience Institute faculty at Carnegie Mellon University, the best way to understand the brain is to watch how organisms interact with the world.
“Behavior drives everything we do,” Yttri said.
As a behavioral neuroscientist, Yttri studies what happens in the brain when animals walk, eat, sniff or perform any other action. This kind of research could help answer questions about neurological diseases and disorders such as Parkinson’s disease or stroke. But identifying and predicting animal behavior is extremely difficult.
Now, a new unsupervised machine learning algorithm developed by Yttri and Alex Hsu, a biological sciences Ph.D. candidate in his lab, makes studying behavior much easier and more accurate. The researchers published a paper on the new tool, B-SOiD (Behavioral segmentation of open field in DeepLabCut), in “Nature Communications.”
Previously, the standard method to capture animal behavior was to track very simple actions, like whether a trained mouse pressed a lever or whether an animal was eating food or not. Alternatively, the experimenter could spend hours and hours manually identifying behavior, usually frame by frame on a video, a process prone to human error and bias.
Hsu realized he could let an unsupervised learning algorithm do the time-consuming work. B-SOiD discovers behaviors by identifying patterns in the position of an animal’s body. The algorithm works with computer vision software and can tell researchers what behavior is happening at every frame in a video.
“It uses an equation to consistently determine when a behavior starts,” Hsu explained. “Once you reach that threshold, the behavior is identified, every time. A human experimenter might toggle between two frames or several categories, try to decide where behavior begins and become fatigued over time.”
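The consistency Hsu describes comes from applying a fixed rule rather than frame-by-frame human judgment. A minimal sketch of that idea in Python (the function name, score values and threshold are illustrative, not taken from the actual B-SOiD code):

```python
def behavior_onsets(scores, threshold):
    """Return the frame indices where a behavior starts, defined as
    the score crossing the threshold from below. The same input always
    yields the same segmentation, unlike a human annotator toggling
    between adjacent frames."""
    onsets = []
    above = False
    for i, score in enumerate(scores):
        if score >= threshold and not above:
            onsets.append(i)  # behavior begins at this frame
        above = score >= threshold
    return onsets

# Per-frame scores for one candidate behavior (made-up numbers):
print(behavior_onsets([0.1, 0.4, 0.9, 0.8, 0.2, 0.7], 0.5))  # [2, 5]
```

Because the rule is deterministic, two runs over the same video (or two labs running the same data) mark identical start frames, which is the point of Hsu's "once you reach that threshold, the behavior is identified, every time."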
Yttri said B-SOiD provides a huge improvement and opens up several avenues for new research.
“It removes user bias and, more importantly, removes the time cost and arduous work,” he said. “We can accurately process hours of data in a matter of minutes.”
Additionally, B-SOiD is very user friendly and openly available to any researcher. Yttri’s lab and their collaborators have used the new algorithm in many important areas, including work to better understand chronic pain, obsessive-compulsive disorder and more.
Collaborators have even begun to use B-SOiD to study human movement in Parkinson’s disease.
“We are beginning to see if this can be used as part of an objective test by a doctor to show how far a patient’s disease has progressed. The hope is that a patient anywhere in the world would be diagnosed with one standardized metric,” Yttri said.
This is a breakthrough in how scientists can study natural behavior and how it changes, replacing the overly simplistic or subjective measures that predominate in neuroscience and ethology.
Funding: This work was funded by grants from the Whitehall Foundation and the Ewing Marion Kauffman Foundation. Yttri and his collaborators made the code and paper open source, made possible through the article processing charge fund from the CMU Libraries.
B-SOiD, an open-source unsupervised algorithm for identification and fast prediction of behaviors
Studying naturalistic animal behavior remains a difficult objective. Recent machine learning advances have enabled limb localization; however, extracting behaviors requires ascertaining the spatiotemporal patterns of these positions. To provide a link from poses to actions and their kinematics, we developed B-SOiD – an open-source, unsupervised algorithm that identifies behavior without user bias.
By training a machine classifier on pose pattern statistics clustered using new methods, our approach achieves greatly improved processing speed and the ability to generalize across subjects or labs. Using a frameshift alignment paradigm, B-SOiD overcomes previous temporal resolution barriers.
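The frameshift idea can be illustrated in miniature: if the clustering stage operates on frames subsampled at a coarse rate, running one pass per starting offset and interleaving the predictions recovers a label for every frame of the original video. A minimal sketch, with a stand-in `classify` function (the real pipeline clusters pose statistics and trains a machine classifier, which this toy example does not reproduce):

```python
def frameshift_labels(frames, window, classify):
    """Frameshift alignment (sketch): `classify` only sees frames
    subsampled every `window` frames. Running it once per offset and
    interleaving the results yields one label per original frame,
    overcoming the temporal-resolution limit of the subsampled pass."""
    labels = [None] * len(frames)
    for offset in range(window):
        sub = frames[offset::window]          # coarse pass at this offset
        for j, label in enumerate(classify(sub)):
            labels[offset + j * window] = label  # interleave back
    return labels

# Stand-in classifier: labels each pose feature by sign (illustrative only).
classify = lambda sub: [1 if x > 0 else 0 for x in sub]
print(frameshift_labels([0.2, -0.5, 0.7, -0.1, 0.3, 0.9], 3, classify))
```

Each coarse pass is cheap, and `window` passes together cost roughly the same as one full-rate pass, which is consistent with the speed and temporal-resolution claims above.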
Using only a single, off-the-shelf camera, B-SOiD provides categories of sub-action for trained behaviors and kinematic measures of individual limb trajectories in any animal model.
These behavioral and kinematic measures are difficult but critical to obtain, particularly in the study of rodent and other models of pain, OCD, and movement disorders.