Title: SLEAP: A deep learning system for multi-animal pose tracking
Abstract

The desire to understand how the brain generates and patterns behavior has driven rapid methodological innovation in tools to quantify natural animal behavior. While advances in deep learning and computer vision have enabled markerless pose estimation in individual animals, extending these to multiple animals presents unique challenges for studies of social behaviors or animals in their natural environments. Here we present Social LEAP Estimates Animal Poses (SLEAP), a machine learning system for multi-animal pose tracking. This system enables versatile workflows for data labeling, model training and inference on previously unseen data. SLEAP features an accessible graphical user interface, a standardized data model, a reproducible configuration system, over 30 model architectures, two approaches to part grouping and two approaches to identity tracking. We applied SLEAP to seven datasets across flies, bees, mice and gerbils to systematically evaluate each approach and architecture, and we compare it with other existing approaches. SLEAP achieves greater accuracy and speeds of more than 800 frames per second, with latencies of less than 3.5 ms at full 1,024 × 1,024 image resolution. This makes SLEAP usable for real-time applications, which we demonstrate by controlling the behavior of one animal on the basis of the tracking and detection of social interactions with another animal.
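As a concrete illustration of the workflow the abstract describes, the following is a minimal sketch of batch inference with SLEAP's high-level Python API. The model paths and video filename are placeholders, and the function names, taken from the published documentation, may differ between SLEAP versions.

# Minimal sketch of batch inference with SLEAP's high-level Python API.
# Assumes a trained top-down model pair (centroid + centered-instance models)
# exported by SLEAP; paths and filenames are placeholders, and the exact
# function names (from the published docs) may differ between versions.
import sleap

# Load trained models and a video to run inference on.
predictor = sleap.load_model([
    "models/centroid_model",
    "models/centered_instance_model",
])
video = sleap.load_video("session_01.mp4")

# Run pose estimation; the result is a Labels object with one labeled frame
# per video frame and one instance per detected animal.
labels = predictor.predict(video)

for lf in labels.labeled_frames:
    for instance in lf.instances:
        # instance.numpy() -> (n_nodes, 2) array of body-part (x, y) locations
        print(lf.frame_idx, instance.numpy())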

 
Award ID(s):
1754476 1734030
NSF-PAR ID:
10366517
Author(s) / Creator(s):
Publisher / Repository:
Nature Publishing Group
Date Published:
Journal Name:
Nature Methods
Volume:
19
Issue:
4
ISSN:
1548-7091
Page Range / eLocation ID:
p. 486-495
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The ability to automate the analysis of video for monitoring animals and insects is of great interest for behavioral science and ecology. In particular, honeybees play a crucial role in agriculture as natural pollinators. However, recent studies have shown that phenomena such as colony collapse disorder are causing the loss of many colonies. Because a large number of interacting factors may explain these events, a multi-faceted analysis of the bees in their environment is required. Our work focuses on developing tools to help model and understand their behavior as individuals, in relation to the health and performance of the colony. In this paper, we report the development of a new system for the detection, localization and tracking of honeybee body parts from video recorded on the entrance ramp of the colony. The proposed system builds on recent advances in convolutional neural networks (CNNs) for human pose estimation and evaluates their suitability for detecting honeybee pose, as shown in Figure 1. This opens the door for novel animal behavior analysis systems that take advantage of precise detection and tracking of the insect pose.
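For illustration only, the sketch below shows the general heatmap-based approach borrowed from human pose estimation: a small fully convolutional network outputs one confidence map per body part, and part locations are read out as the peaks of those maps. The architecture, channel sizes and the five assumed body parts are illustrative choices, not the network described in the paper.

# Illustrative heatmap-based part detector in the style of human pose
# estimation (part confidence maps). Architecture, channel sizes and the
# five body parts are assumptions for illustration, not the paper's model.
import torch
import torch.nn as nn

N_PARTS = 5  # e.g. head, thorax, abdomen, left antenna, right antenna (assumed)

class PartHeatmapNet(nn.Module):
    def __init__(self, n_parts: int = N_PARTS):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        # 1x1 head producing one confidence map per body part.
        self.head = nn.Conv2d(64, n_parts, 1)

    def forward(self, x):
        return self.head(self.backbone(x))

def peak_coordinates(heatmaps: torch.Tensor) -> torch.Tensor:
    """Read out part locations as the argmax of each confidence map."""
    b, p, h, w = heatmaps.shape
    flat = heatmaps.view(b, p, -1).argmax(dim=-1)
    rows = torch.div(flat, w, rounding_mode="floor")
    return torch.stack((rows, flat % w), dim=-1)  # (batch, parts, [row, col])

if __name__ == "__main__":
    net = PartHeatmapNet()
    frames = torch.rand(1, 3, 256, 256)        # placeholder frame batch
    coords = peak_coordinates(net(frames))     # coordinates at heatmap resolution
    print(coords.shape)                        # -> torch.Size([1, 5, 2])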
  2. Quantitative descriptions of animal behavior are essential to study the neural substrates of cognitive and emotional processes. Analyses of naturalistic behaviors are often performed by hand or with expensive, inflexible commercial software. Recently, machine learning methods for markerless pose estimation have enabled automated tracking of freely moving animals, including in labs with limited coding expertise. However, classifying specific behaviors based on pose data requires additional computational analyses and remains a significant challenge for many groups. We developed BehaviorDEPOT (DEcoding behavior based on POsitional Tracking), a simple, flexible software program that can detect behavior from video time series and can analyze the results of experimental assays. BehaviorDEPOT calculates kinematic and postural statistics from keypoint tracking data and creates heuristics that reliably detect behaviors. It requires no programming experience and is applicable to a wide range of behaviors and experimental designs. We provide several hard-coded heuristics. Our freezing-detection heuristic achieves above 90% accuracy in videos of mice and rats, including those wearing tethered head-mounts. BehaviorDEPOT also helps researchers develop their own heuristics and incorporate them into the software's graphical interface. Behavioral data are stored framewise for easy alignment with neural data. We demonstrate the immediate utility and flexibility of BehaviorDEPOT using popular assays, including fear conditioning, decision-making in a T-maze, open field, elevated plus maze, and novel object exploration.
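BehaviorDEPOT itself is distributed as a MATLAB application; purely to illustrate the kind of velocity-threshold heuristic described above, the following Python sketch flags freezing bouts from keypoint tracking data. The speed threshold, smoothing window and minimum bout duration are assumptions, not BehaviorDEPOT's parameters.

# Illustrative freezing-detection heuristic from keypoint tracking data.
# This is NOT BehaviorDEPOT's implementation (which is a MATLAB application);
# thresholds, smoothing window and keypoint layout are assumptions.
import numpy as np

def detect_freezing(keypoints: np.ndarray,
                    fps: float = 30.0,
                    speed_thresh: float = 0.5,    # assumed units: pixels/frame
                    min_duration_s: float = 1.0,
                    smooth_frames: int = 5) -> np.ndarray:
    """Return a boolean array marking frames classified as freezing.

    keypoints: (n_frames, n_keypoints, 2) array of (x, y) positions.
    """
    # Mean speed of all keypoints between consecutive frames.
    disp = np.linalg.norm(np.diff(keypoints, axis=0), axis=-1)  # (n_frames-1, n_kp)
    speed = disp.mean(axis=1)
    # Light smoothing to suppress tracking jitter.
    kernel = np.ones(smooth_frames) / smooth_frames
    speed = np.convolve(speed, kernel, mode="same")

    below = np.concatenate([[False], speed < speed_thresh])  # align to frames
    # Keep only bouts of immobility longer than the minimum duration.
    min_len = int(min_duration_s * fps)
    freezing = np.zeros_like(below)
    start = None
    for i, b in enumerate(np.append(below, False)):
        if b and start is None:
            start = i
        elif not b and start is not None:
            if i - start >= min_len:
                freezing[start:i] = True
            start = None
    return freezing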
  3. Abstract

    Significant advances in computational ethology have allowed the quantification of behaviour in unprecedented detail. Tracking animals in social groups, however, remains challenging as most existing methods can either capture pose or robustly retain individual identity over time but not both.

    To capture finely resolved behaviours while maintaining individual identity, we built NAPS (NAPS is ArUco Plus SLEAP), a hybrid tracking framework that combines state‐of‐the‐art, deep learning‐based methods for pose estimation (SLEAP) with unique markers for identity persistence (ArUco). We show that this framework allows the exploration of the social dynamics of the common eastern bumblebee (Bombus impatiens).

    We provide a stand‐alone Python package for implementing this framework along with detailed documentation to allow for easy utilization and expansion. We show that NAPS can scale to long timescale experiments at a high frame rate and that it enables the investigation of detailed behavioural variation within individuals in a group.

    Expanding the toolkit for capturing the constituent behaviours of social groups is essential for understanding the structure and dynamics of social networks. NAPS provides a key tool for capturing these behaviours and can provide critical data for understanding how individual variation influences collective dynamics.
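As an illustration of this hybrid idea, the sketch below matches ArUco tags detected with OpenCV's aruco module to the nearest pose instance in each frame. It assumes the opencv-contrib-python >= 4.7 ArucoDetector API; the anchor-keypoint matching rule and distance cutoff are illustrative assumptions rather than NAPS's actual implementation (see the NAPS package for that).

# Sketch of the NAPS idea: assign persistent identities by matching detected
# ArUco tags to the nearest pose instance in each frame. Uses OpenCV's aruco
# module (opencv-contrib-python >= 4.7 API assumed); the matching rule and
# distance cutoff are illustrative assumptions, not NAPS code.
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def assign_identities(frame_bgr: np.ndarray,
                      instance_centroids: np.ndarray,
                      max_dist: float = 40.0) -> dict:
    """Map pose-instance index -> ArUco tag id for one frame.

    instance_centroids: (n_instances, 2) array, e.g. the thorax keypoint of
    each pose instance (the choice of anchor point is an assumption).
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    assignments = {}
    if ids is None:
        return assignments
    for tag_corners, tag_id in zip(corners, ids.flatten()):
        tag_center = tag_corners.reshape(-1, 2).mean(axis=0)
        dists = np.linalg.norm(instance_centroids - tag_center, axis=1)
        nearest = int(np.argmin(dists))
        if dists[nearest] <= max_dist:   # ignore tags far from any pose
            assignments[nearest] = int(tag_id)
    return assignments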

     
  4. Abstract

    Quantifying movement and demographic events of free‐ranging animals is fundamental to studying their ecology, evolution and conservation. Technological advances have led to an explosion in sensor‐based methods for remotely observing these phenomena. This transition to big data creates new challenges for data management, analysis and collaboration.

    We present the Movebank ecosystem of tools used by thousands of researchers to collect, manage, share, visualize, analyse and archive their animal tracking and other animal-borne sensor data. Users add sensor data through file uploads or live data streams and further organize and quality-control it within the Movebank system. All data are harmonized to a common data model and vocabulary. The public can discover, view and download data to which they have been granted access through the website, the Animal Tracker mobile app or an API. Advanced analysis tools are available through the EnvDATA System, the MoveApps platform and a variety of user-developed applications. Data owners can share studies with select users or the public, with options for embargoes, licenses and formal archiving in a data repository.
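    As a hypothetical example of programmatic access, the sketch below pulls location events for one study through Movebank's documented "direct-read" REST endpoint. The URL, parameter names and CSV response format are assumptions based on the public API documentation and may change; access also requires a Movebank account and permission from the data owner.

# Hypothetical sketch of pulling tracking data via Movebank's documented
# "direct-read" REST endpoint. The URL, parameter names and CSV response
# format are assumptions based on the public API docs and may change;
# access requires a Movebank account and permission from the data owner.
import io
import requests
import pandas as pd

BASE_URL = "https://www.movebank.org/movebank/service/direct-read"

def fetch_events(study_id: int, username: str, password: str) -> pd.DataFrame:
    """Download location events for one study as a DataFrame (CSV assumed)."""
    params = {"entity_type": "event", "study_id": study_id}
    resp = requests.get(BASE_URL, params=params, auth=(username, password))
    resp.raise_for_status()
    return pd.read_csv(io.StringIO(resp.text))

# Example (placeholder credentials and study id):
# events = fetch_events(123456, "my_user", "my_password")
# print(events.columns)  # e.g. timestamp, location_lat, location_long, ...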

    Movebank is used by over 3,100 data owners globally, who manage over 6 billion animal location and sensor measurements across more than 6,500 studies, with thousands of active tags sending over 3 million new data records daily. These data underlie >700 published papers and reports. We present a case study demonstrating the use of Movebank to assess life‐history events and demography, and engage with citizen scientists to identify mortalities and causes of death for a migratory bird.

    A growing number of researchers, government agencies and conservation organizations use Movebank to manage research and conservation projects and to meet legislative requirements. The combination of historic and new data with collaboration tools enables broad comparative analyses and data acquisition and mapping efforts. Movebank offers an integrated system for real‐time monitoring of animals at a global scale and represents a digital museum of animal movement and behaviour. Resources and coordination across countries and organizations are needed to ensure that these data, including those that cannot be made public, remain accessible to future generations.

     
  5. Abstract

    Background

    Repetitive action, resistance to environmental change and fine motor disruptions are hallmarks of autism spectrum disorder (ASD) and other neurodevelopmental disorders, and they vary considerably from individual to individual. In animal models, conventional behavioral phenotyping captures such fine-scale variation incompletely. Here we observed male and female C57BL/6J mice to methodically catalog adaptive movement over multiple days and examined two rodent models of developmental disorders against this dynamic baseline. Specifically, we investigated the behavioral consequences of a cerebellum-specific deletion of Tsc1 and a whole-brain knockout of Cntnap2 in mice. Both mutations are found in clinical populations and have been associated with ASD.

    Methods

    We used advances in computer vision and deep learning, namely a generalized form of high-dimensional statistical analysis, to develop a framework for characterizing mouse movement on multiple timescales using a single popular behavioral assay, the open-field test. The pipeline takes virtual markers from pose estimation to find behavior clusters and generate wavelet signatures of behavior classes. We measured spatial and temporal habituation to a new environment across minutes and days, different types of self-grooming, locomotion and gait.
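    As a rough illustration of the keypoints-to-wavelets-to-clusters idea (not the authors' pipeline), the sketch below computes Morlet wavelet power for each keypoint coordinate and clusters the per-frame spectral features into candidate behavior classes. The wavelet implementation, frequency band and cluster count are assumptions.

# Illustrative sketch of a keypoints -> wavelet signatures -> behavior
# clusters pipeline. The Morlet implementation, frequency band and cluster
# count are assumptions, not the authors' method.
import numpy as np
from sklearn.cluster import KMeans

def morlet_power(signal: np.ndarray, freqs: np.ndarray, fps: float, w0: float = 5.0):
    """Morlet wavelet power for each frequency; returns (n_freqs, n_frames)."""
    n = len(signal)
    t = (np.arange(n) - n // 2) / fps
    power = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        sigma = w0 / (2 * np.pi * f)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
        wavelet /= np.sqrt(np.abs(wavelet).sum())
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

def behavior_clusters(keypoints: np.ndarray, fps: float = 80.0, n_clusters: int = 20):
    """keypoints: (n_frames, n_keypoints, 2). Returns one cluster label per frame."""
    freqs = np.linspace(0.5, fps / 4, 15)           # assumed frequency band
    flat = keypoints.reshape(len(keypoints), -1)    # (n_frames, n_keypoints * 2)
    # Stack per-coordinate wavelet power into a per-frame feature vector.
    feats = np.concatenate(
        [morlet_power(flat[:, j], freqs, fps) for j in range(flat.shape[1])]
    )
    feats = np.log1p(feats).T                       # (n_frames, n_features)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats)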

    Results

    Both Cntnap2 knockouts and L7-Tsc1 mutants showed forelimb lag during gait. L7-Tsc1 mutants and Cntnap2 knockouts showed complex defects in multi-day adaptation, lacking the tendency of wild-type mice to spend progressively more time in corners of the arena. In L7-Tsc1 mutant mice, failure to adapt took the form of maintained ambling, turning and locomotion, and an overall decrease in grooming. However, adaptation in these traits was similar between wild-type mice and Cntnap2 knockouts. L7-Tsc1 mutant and Cntnap2 knockout mouse models showed different patterns of behavioral state occupancy.

    Limitations

    Genetic risk factors for autism are numerous, and we tested only two. Our pipeline was applied only under conditions of free behavior. Testing under task or social conditions would reveal more information about behavioral dynamics and variability.

    Conclusions

    Our automated pipeline for deep phenotyping successfully captures model-specific deviations in adaptation and movement, as well as differences in the detailed structure of behavioral dynamics. The reported deficits indicate that deep phenotyping can capture a robust set of ASD-related traits that may be considered for implementation in clinical settings as quantitative diagnostic criteria.

     