DeepPoseKit, a software toolkit for fast and robust animal pose estimation using deep learning
Abstract
Quantitative behavioral measurements are important for answering questions across scientific disciplines, from neuroscience to ecology. State-of-the-art deep-learning methods offer major advances in data quality and detail by allowing researchers to automatically estimate locations of an animal's body parts directly from images or videos. However, currently-available animal pose estimation methods have limitations in speed and robustness. Here we introduce a new easy-to-use software toolkit, DeepPoseKit, that addresses these problems using an efficient multi-scale deep-learning model, called Stacked DenseNet, and a fast GPU-based peak-detection algorithm for estimating keypoint locations with subpixel precision. These advances improve processing speed >2× with no loss in accuracy compared to currently-available methods. We demonstrate the versatility of our methods with multiple challenging animal pose estimation tasks in laboratory and field settings, including groups of interacting individuals. Our work reduces barriers to using advanced tools for measuring behavior and has broad applicability across the behavioral sciences.
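The abstract's subpixel peak detection can be illustrated with a common refinement technique: find the integer argmax of a keypoint confidence map, then fit a parabola through the neighboring values along each axis to recover a fractional offset. This is a minimal NumPy sketch of that general idea, not DeepPoseKit's actual implementation (the toolkit performs this step on the GPU, and its exact algorithm may differ); the function name and the synthetic Gaussian confidence map are illustrative assumptions.

```python
import numpy as np

def subpixel_peak(confmap):
    """Locate the maximum of a 2D confidence map with subpixel precision
    using 1D quadratic (parabolic) interpolation around the integer argmax.
    Illustrative sketch only -- not DeepPoseKit's GPU implementation."""
    r, c = np.unravel_index(np.argmax(confmap), confmap.shape)
    y, x = float(r), float(c)
    # Refine along the vertical axis (skip if the peak is on the border)
    if 0 < r < confmap.shape[0] - 1:
        top, mid, bot = confmap[r - 1, c], confmap[r, c], confmap[r + 1, c]
        denom = top - 2 * mid + bot
        if denom != 0:
            y += 0.5 * (top - bot) / denom
    # Refine along the horizontal axis
    if 0 < c < confmap.shape[1] - 1:
        left, mid, right = confmap[r, c - 1], confmap[r, c], confmap[r, c + 1]
        denom = left - 2 * mid + right
        if denom != 0:
            x += 0.5 * (left - right) / denom
    return x, y

# Synthetic confidence map: Gaussian blob centered at a non-integer location
yy, xx = np.mgrid[0:32, 0:32]
true_x, true_y = 13.4, 18.7
confmap = np.exp(-((xx - true_x) ** 2 + (yy - true_y) ** 2) / (2 * 2.0 ** 2))

x, y = subpixel_peak(confmap)
print(x, y)  # recovers the peak to well under a pixel of error
```

The integer argmax alone would be off by up to half a pixel; the parabolic fit recovers most of that error, which matters when keypoint maps are produced at reduced resolution.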
Data availability
Data used and generated for experiments and model comparisons are included in the supporting files. Posture datasets can be found at: https://github.com/jgraving/deepposekit-data. The code for DeepPoseKit is publicly available at: https://github.com/jgraving/deepposekit/. Installation instructions are provided in the README file: https://github.com/jgraving/deepposekit/blob/master/README.md#installation. Example Jupyter notebooks demonstrating how to use the code are provided here: https://github.com/jgraving/deepposekit/tree/master/examples
- Fast animal pose estimation using deep neural networks: http://arks.princeton.edu/ark:/88435/dsp01pz50gz79z
Article and author information
Author details
Funding
National Science Foundation (IOS-1355061)
- Iain D Couzin
Horizon 2020 Framework Programme (Marie Sklodowska-Curie grant agreement No. 748549)
- Blair R Costelloe
Nvidia (GPU Grant)
- Blair R Costelloe
Office of Naval Research (N00014-09-1-1074)
- Iain D Couzin
Office of Naval Research (N00014-14-1-0635)
- Iain D Couzin
Army Research Office (W911NG-11-1-0385)
- Iain D Couzin
Army Research Office (W911NF14-1-0431)
- Iain D Couzin
Deutsche Forschungsgemeinschaft (DFG Centre of Excellence 2117)
- Iain D Couzin
University of Konstanz (Zukunftskolleg Investment Grant)
- Blair R Costelloe
Struktur- und Innovationsfonds für die Forschung of the State of Baden-Württemberg
- Iain D Couzin
Max Planck Society
- Iain D Couzin
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: All procedures for collecting the zebra (E. grevyi) dataset were reviewed and approved by Ethikrat, the independent Ethics Council of the Max Planck Society. The zebra dataset was collected with the permission of Kenya's National Commission for Science, Technology and Innovation (NACOSTI/P/17/59088/15489 and NACOSTI/P/18/59088/21567) using drones operated by B.R.C. with the permission of the Kenya Civil Aviation Authority (authorization numbers: KCAA/OPS/2117/4 Vol. 2 (80), KCAA/OPS/2117/4 Vol. 2 (81), KCAA/OPS/2117/5 (86) and KCAA/OPS/2117/5 (87); RPAS Operator Certificate numbers: RPA/TP/0005 and RPA/TP/000-0009).
Copyright
© 2019, Graving et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 25,926 views
- 2,456 downloads
- 448 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.
Citations by DOI
- 448 citations for umbrella DOI https://doi.org/10.7554/eLife.47994