TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields

  1. Tristan Walter (corresponding author)
  2. Iain D Couzin (corresponding author)

  Max Planck Institute of Animal Behavior, Germany

Abstract

Automated visual tracking of animals is rapidly becoming an indispensable tool for the study of behavior. It offers a quantitative methodology by which organisms' sensing and decision-making can be studied in a wide range of ecological contexts. Despite this, existing solutions tend to be challenging to deploy in practice, especially for long and/or high-resolution video streams. Here, we present TRex, a fast and easy-to-use solution for tracking a large number of individuals simultaneously using background subtraction, with real-time (60 Hz) tracking performance for up to approximately 256 individuals. TRex also estimates 2D visual fields, body outlines, and the head/rear orientation of bilateral animals, in both open- and closed-loop contexts. Additionally, it offers highly accurate, deep-learning-based visual identification of up to approximately 100 unmarked individuals, where it is between 2.5 and 46.7 times faster, and requires 2-10 times less memory, than comparable software (with the relative advantage increasing for more organisms and longer videos), and it provides interactive data exploration within an intuitive, platform-independent graphical user interface.
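The abstract names background subtraction as the basis of TRex's detection step. As a rough illustration of that general technique only (not TRex's actual implementation, which is described in the main text), the following Python sketch uses OpenCV to build a median background, segment moving animals against it, and report per-frame blob centroids; the video path, threshold, and minimum blob area are placeholder values.

    # Illustrative sketch of background-subtraction-based detection, the general
    # technique named in the abstract. This is NOT TRex's implementation; the
    # video path, threshold, and minimum blob area are arbitrary placeholders.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("example_video.mp4")   # placeholder path

    # Estimate a static background as the per-pixel median of a frame sample.
    frames = []
    for _ in range(100):
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    background = np.median(np.stack(frames), axis=0).astype(np.uint8)

    cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pixels that differ sufficiently from the background are foreground.
        diff = cv2.absdiff(gray, background)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        # Connected components correspond to candidate animals ("blobs").
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
        for i in range(1, n):                      # label 0 is the background
            if stats[i, cv2.CC_STAT_AREA] > 50:    # discard tiny noise blobs
                x, y = centroids[i]
                print(f"blob at ({x:.1f}, {y:.1f})")

A real tracker must additionally associate such blobs across frames to maintain consistent identities over time, which is where TRex's markerless, deep-learning-based visual identification comes in.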

Data availability

Video data used in the evaluation of TRex has been deposited in the MPG Open Access Data Repository (Edmond), under the Creative Commons BY 4.0 license, at https://dx.doi.org/10.17617/3.4y. Most raw videos have been trimmed, since the original files are each up to 200 GB in size. Pre-processed versions (in PV format) are included, so that all steps after conversion can be reproduced directly (conversion speeds do not change with video length, so proportional results are reproducible as well). Full raw videos are available upon reasonable request.

All analysis scripts, the scripts used to process the original videos, and the source code/pre-compiled binaries (linux-64) that were used are archived in this repository. Most intermediate data (PV videos, log files, tracking data, etc.) are included, and the binaries together with the scripts can be used to regenerate all intermediate steps automatically. The application source code is freely available at https://github.com/mooch443/trex.

Videos 11, 12 and 13 are part of idtracker.ai's example videos: https://drive.google.com/file/d/1pAR6oJjrEn7jf_OU2yMdyT2UJZMTNoKC/view?usp=sharing (10_zebrafish.tar.gz) [Francisco Romero, 2018, Examples for idtracker.ai, Online, Accessed 23-Oct-2020].
Video 7 (video_example_100fish_1min.avi): https://drive.google.com/file/d/1Tl64CHrQoc05PDElHvYGzjqtybQc4g37/view?usp=sharing [Francisco Romero, 2018, Examples for idtracker.ai, Online, Accessed 23-Oct-2020].
V1 from Appendix 12: https://drive.google.com/drive/folders/1Nir2fzgxofz-fcojEiG_JCNXsGQXj_9k [Francisco Romero, 2018, Examples for idtracker.ai, Online, Accessed 09-Feb-2021].
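Since the repository bundles tracking data intended for re-analysis, a minimal sketch of how exported per-individual trajectories could be inspected with NumPy is given below. TRex can export trajectories as NPZ files, but the file name and array keys used here ("X", "Y") are assumptions made for illustration; the archived analysis scripts document the actual field names.

    # Minimal sketch for inspecting exported tracking data. The file name and the
    # array keys ("X", "Y") are illustrative assumptions, not guaranteed to match
    # TRex's actual export schema; see the archived analysis scripts for details.
    import numpy as np

    data = np.load("fish0.npz")          # hypothetical per-individual export
    print(sorted(data.files))            # list the arrays actually present

    x, y = data["X"], data["Y"]          # assumed per-frame centroid coordinates
    valid = ~np.isnan(x)                 # frames where the individual was tracked
    step = np.hypot(np.diff(x[valid]), np.diff(y[valid]))
    print(f"tracked frames: {valid.sum()}, median step length: {np.median(step):.2f}")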


Article and author information

Author details

  1. Tristan Walter

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    For correspondence
    a@trex.run
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-8604-7229
  2. Iain D Couzin

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    For correspondence
    icouzin@ab.mpg.de
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-8556-4558

Funding

Division of Integrative Organismal Systems (IOS-1355061)

  • Iain D Couzin

Office of Naval Research (N00014-19-1-2556)

  • Iain D Couzin

Deutsche Forschungsgemeinschaft (EXC 2117-422037984)

  • Iain D Couzin

Max-Planck-Gesellschaft

  • Iain D Couzin

Struktur- und Innovationsfonds für die Forschung of the State of Baden-Württemberg

  • Iain D Couzin

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: We herewith confirm that the care and use of animals described in this work is covered by the protocols 35-9185.81/G-17/162, 35-9185.81/G-17/88 and 35-9185.81/G-16/116 granted by the Regional Council of the State of Baden-Württemberg, Freiburg, Germany, to the Max Planck Institute of Animal Behavior in accordance with the German Animal Welfare Act (TierSchG) and the Regulation for the Protection of Animals Used for Experimental or Other Scientific Purposes (Animal Welfare Regulation Governing Experimental Animals - TierSchVersV).

Copyright

© 2021, Walter & Couzin

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 14,880 views
  • 1,390 downloads
  • 173 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Tristan Walter, Iain D Couzin (2021) TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields. eLife 10:e64000. https://doi.org/10.7554/eLife.64000
