DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning

  1. Jacob M Graving (corresponding author)
  2. Daniel Chae
  3. Hemal Naik
  4. Liang Li
  5. Benjamin Koger
  6. Blair R Costelloe
  7. Iain D Couzin
  1. Max Planck Institute of Animal Behavior, Germany
  2. Princeton University, United States

Abstract

Quantitative behavioral measurements are important for answering questions across scientific disciplines, from neuroscience to ecology. State-of-the-art deep-learning methods offer major advances in data quality and detail by allowing researchers to automatically estimate locations of an animal's body parts directly from images or videos. However, currently-available animal pose estimation methods have limitations in speed and robustness. Here we introduce a new easy-to-use software toolkit, DeepPoseKit, that addresses these problems using an efficient multi-scale deep-learning model, called Stacked DenseNet, and a fast GPU-based peak-detection algorithm for estimating keypoint locations with subpixel precision. These advances improve processing speed >2× with no loss in accuracy compared to currently-available methods. We demonstrate the versatility of our methods with multiple challenging animal pose estimation tasks in laboratory and field settings, including groups of interacting individuals. Our work reduces barriers to using advanced tools for measuring behavior and has broad applicability across the behavioral sciences.
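The speed gains described above come in part from detecting peaks in the model's output confidence maps and refining each peak to subpixel precision. As a rough, self-contained illustration of that general idea (not DeepPoseKit's actual GPU implementation), the NumPy sketch below locates a keypoint by integer argmax and then refines it with a quadratic fit through the neighbouring samples; the function name and the parabolic-fit choice are our own for illustration.

```python
import numpy as np

def subpixel_peak(confidence_map):
    """Locate the maximum of a 2-D confidence map with subpixel precision.

    Integer argmax followed by a 1-D quadratic (parabolic) refinement along
    each axis. A generic illustration of subpixel peak detection, not the
    toolkit's actual GPU-based implementation.
    """
    h, w = confidence_map.shape
    y, x = np.unravel_index(np.argmax(confidence_map), (h, w))

    def parabolic_offset(left, center, right):
        # Vertex offset of the parabola through three neighbouring samples.
        denom = left - 2.0 * center + right
        return 0.0 if denom == 0 else 0.5 * (left - right) / denom

    dx = parabolic_offset(confidence_map[y, x - 1], confidence_map[y, x],
                          confidence_map[y, x + 1]) if 0 < x < w - 1 else 0.0
    dy = parabolic_offset(confidence_map[y - 1, x], confidence_map[y, x],
                          confidence_map[y + 1, x]) if 0 < y < h - 1 else 0.0
    return x + dx, y + dy, float(confidence_map[y, x])
```

For example, a Gaussian confidence map whose true peak lies between pixel centers will return fractional coordinates close to the true location rather than snapping to the nearest integer pixel.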

Data availability

Data used and generated for experiments and model comparisons are included in the supporting files. Posture datasets can be found at https://github.com/jgraving/deepposekit-data. The code for DeepPoseKit is publicly available at https://github.com/jgraving/deepposekit/. Installation instructions are provided in the README file: https://github.com/jgraving/deepposekit/blob/master/README.md#installation. Example Jupyter notebooks showing how to use the code are provided here: https://github.com/jgraving/deepposekit/tree/master/examples
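Before opening the example notebooks, the following minimal sketch outlines the typical training-and-prediction workflow, adapted from the repository's README and examples. The class and function names (DataGenerator, TrainingGenerator, StackedDenseNet, load_model) reflect the DeepPoseKit API at the time of writing, but the file paths are placeholders and exact arguments may differ between versions, so treat this as orientation rather than a copy-paste recipe.

```python
import numpy as np

from deepposekit.io import DataGenerator, TrainingGenerator
from deepposekit.models import StackedDenseNet, load_model

# Load an annotated posture dataset (placeholder filename) and wrap it in a
# generator that handles batching and confidence-map targets for training.
data_generator = DataGenerator('annotation_set.h5')
train_generator = TrainingGenerator(data_generator)

# Build and train the Stacked DenseNet model described in the paper;
# the batch size and epoch count here are illustrative, not recommendations.
model = StackedDenseNet(train_generator)
model.fit(batch_size=16, epochs=100)
model.save('stacked_densenet.h5')

# Reload the trained model and predict keypoints for a batch of new images
# (a NumPy array shaped [n_images, height, width, channels]).
model = load_model('stacked_densenet.h5')
new_images = np.zeros((10, 192, 192, 1), dtype=np.uint8)  # placeholder batch
predictions = model.predict(new_images)  # one (x, y, confidence) row per keypoint
```

The example notebooks linked above cover the full workflow in more detail, including annotation, data augmentation, and processing video frames.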


Article and author information

Author details

  1. Jacob M Graving

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    For correspondence
    jgraving@gmail.com
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0002-5826-467X
  2. Daniel Chae

    Department of Computer Science, Princeton University, Princeton, United States
    Competing interests
    No competing interests declared.
  3. Hemal Naik

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    Competing interests
    No competing interests declared.
  4. Liang Li

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    Competing interests
    No competing interests declared.
  5. Benjamin Koger

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    Competing interests
    No competing interests declared.
  6. Blair R Costelloe

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    Competing interests
    No competing interests declared.
  7. Iain D Couzin

    Department of Collective Behaviour, Max Planck Institute of Animal Behavior, Konstanz, Germany
    Competing interests
    Iain D Couzin, Reviewing editor, eLife.

Funding

National Science Foundation (IOS-1355061)

  • Iain D Couzin

Horizon 2020 Framework Programme (Marie Skłodowska-Curie grant agreement No. 748549)

  • Blair R Costelloe

Nvidia (GPU Grant)

  • Blair R Costelloe

Office of Naval Research (N00014-09-1-1074)

  • Iain D Couzin

Office of Naval Research (N00014-14-1-0635)

  • Iain D Couzin

Army Research Office (W911NG-11-1-0385)

  • Iain D Couzin

Army Research Office (W911NF14-1-0431)

  • Iain D Couzin

Deutsche Forschungsgemeinschaft (DFG Centre of Excellence 2117)

  • Iain D Couzin

University of Konstanz (Zukunftskolleg Investment Grant)

  • Blair R Costelloe

Struktur- und Innovationsfonds für die Forschung of the State of Baden-Württemberg

  • Iain D Couzin

Max Planck Society

  • Iain D Couzin

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All procedures for collecting the zebra (E. grevyi) dataset were reviewed and approved by Ethikrat, the independent Ethics Council of the Max Planck Society. The zebra dataset was collected with the permission of Kenya's National Commission for Science, Technology and Innovation (NACOSTI/P/17/59088/15489 and NACOSTI/P/18/59088/21567) using drones operated by B.R.C. with the permission of the Kenya Civil Aviation Authority (authorization numbers: KCAA/OPS/2117/4 Vol. 2 (80), KCAA/OPS/2117/4 Vol. 2 (81), KCAA/OPS/2117/5 (86) and KCAA/OPS/2117/5 (87); RPAS Operator Certificate numbers: RPA/TP/0005 and RPA/TP/000-0009).

Copyright

© 2019, Graving et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 25,239 views
  • 2,452 downloads
  • 420 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Jacob M Graving
  2. Daniel Chae
  3. Hemal Naik
  4. Liang Li
  5. Benjamin Koger
  6. Blair R Costelloe
  7. Iain D Couzin
(2019)
DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning
eLife 8:e47994.
https://doi.org/10.7554/eLife.47994
