Selfee, self-supervised features extraction of animal behaviors

  1. Yinjun Jia (corresponding author)
  2. Shuaishuai Li
  3. Xuan Guo
  4. Bo Lei
  5. Junqiang Hu
  6. Xiao-Hong Xu
  7. Wei Zhang (corresponding author)
  1. Tsinghua University, China
  2. Chinese Academy of Sciences, China

Abstract

Fast and accurate characterization of animal behaviors is crucial for neuroscience research. Deep learning models are efficiently used in laboratories for behavior analysis. However, an end-to-end unsupervised neural network that extracts comprehensive and discriminative features directly from social behavior video frames for annotation and analysis has not yet been achieved. Here, we report a self-supervised feature extraction (Selfee) convolutional neural network with multiple downstream applications that processes video frames of animal behavior in an end-to-end way. Visualization and classification of the extracted features (Meta-representations) validate that Selfee processes animal behaviors in a way similar to human perception. We demonstrate that Meta-representations can be efficiently used to detect anomalous behaviors that are indiscernible to human observation and to hint at in-depth analysis. Furthermore, time-series analyses of Meta-representations reveal the temporal dynamics of animal behaviors. In conclusion, we present a self-supervised learning approach that extracts comprehensive and discriminative features directly from raw video recordings of animal behaviors, and we demonstrate its potential usage for various downstream applications.
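
As a rough illustration of this kind of pipeline, the sketch below maps a batch of frames to feature vectors with a generic convolutional backbone and projects them to two dimensions with t-SNE for visualization. It is a minimal sketch under assumed choices (a ResNet-50 backbone, 224×224 inputs, random dummy frames, a 2048-dimensional feature space); it is not the published Selfee architecture or training procedure, which is available in the authors' repository linked below.

```python
# Illustrative sketch only: a generic CNN encoder mapping video frames to
# per-frame feature vectors ("Meta-representations"), followed by t-SNE
# visualization. Architecture, input size, and feature dimension are
# assumptions, not the published Selfee network.
import torch
import torchvision.models as models
from sklearn.manifold import TSNE

# Backbone with its classification head removed, used purely as a feature extractor.
backbone = models.resnet50(weights=None)
backbone.fc = torch.nn.Identity()          # output: 2048-d feature per frame
backbone.eval()

# Dummy batch of behavior frames (batch, channels, height, width);
# real inputs would come from preprocessed video recordings.
frames = torch.rand(64, 3, 224, 224)

with torch.no_grad():
    meta_representations = backbone(frames).numpy()   # shape: (64, 2048)

# Project the features to 2-D, as one might do to check whether
# behaviorally similar frames cluster together.
embedding = TSNE(n_components=2, perplexity=30, init="pca").fit_transform(
    meta_representations
)
print(embedding.shape)   # (64, 2)
```

The same feature vectors could, in principle, feed downstream classifiers, anomaly detectors, or time-series analyses; the block above only shows the extraction and visualization step.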

Data availability

Major data used in this study, including pretrained weights, were uploaded to Dryad and can be accessed via https://doi.org/10.5061/dryad.brv15dvb8. With the uploaded dataset and pretrained weights, our experiments can be replicated. However, because of its size and our limited internet service resources, we are currently unable to share the full training dataset. The full dataset is about 400 GB, which is hard to upload to a public server and would be difficult for other users to download. The training dataset is available from the corresponding author upon reasonable request (wei_zhang@mail.tsinghua.edu.cn), after which we can discuss how to transfer the data. No project proposal is needed as long as the dataset is not used for any commercial purpose. Our Python scripts can be accessed on GitHub: https://github.com/EBGU/Selfee. Other software used in our project includes ImageJ (https://imagej.net/software/fiji/) and GraphPad Prism (https://www.graphpad.com/). All data used to plot graphs and charts in the manuscript can be fully accessed on Dryad (DOI: 10.5061/dryad.brv15dvb8).
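
For replication, the pretrained weights would typically be loaded into the network before extracting features from new recordings. The snippet below is a hedged sketch assuming a PyTorch checkpoint and a ResNet-50-style backbone; the file name selfee_pretrained.pth and the checkpoint layout are hypothetical placeholders, and the authoritative loading code is in the GitHub repository above.

```python
# Illustrative sketch only: loading pretrained weights downloaded from Dryad
# into a PyTorch backbone. The file name and checkpoint layout are
# hypothetical; consult the Selfee GitHub repository for the actual procedure.
import torch
import torchvision.models as models

checkpoint = torch.load("selfee_pretrained.pth", map_location="cpu")

# Some checkpoints wrap the weights as {"state_dict": ...}; handle both layouts.
state_dict = checkpoint["state_dict"] if "state_dict" in checkpoint else checkpoint

model = models.resnet50(weights=None)
model.fc = torch.nn.Identity()

# strict=False tolerates naming differences between the checkpoint and this
# assumed backbone; inspect the reported keys before trusting the result.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
model.eval()
```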


Article and author information

Author details

  1. Yinjun Jia

    IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
    For correspondence
    jyj20@mails.tsinghua.edu.cn
    Competing interests
    The authors declare that no competing interests exist.
  2. Shuaishuai Li

    Institute of Neuroscience, Chinese Academy of Sciences, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  3. Xuan Guo

    IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  4. Bo Lei

    IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  5. Junqiang Hu

    IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  6. Xiao-Hong Xu

    Institute of Neuroscience, Chinese Academy of Sciences, Beijing, China
    Competing interests
    The authors declare that no competing interests exist.
  7. Wei Zhang

    IDG/McGovern Institute for Brain Research, Tsinghua University, Beijing, China
    For correspondence
    wei_zhang@mail.tsinghua.edu.cn
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-0512-3096

Funding

National Natural Science Foundation of China (32022029)

  • Wei Zhang

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All mating experiments were approved by the Animal Care and Use Committee of the Institute of Neuroscience, CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China (IACUC No. NA-016-2016). All studies and experimental protocols of CIS and OFT were approved by the Institutional Animal Care and Use Committee (IACUC) at Tsinghua University (No. 19-ZY1). Experiments were performed following the principles outlined in the Guide for the Care and Use of Laboratory Animals of Tsinghua University.

Copyright

© 2022, Jia et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 3,842 views
  • 742 downloads
  • 14 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Yinjun Jia, Shuaishuai Li, Xuan Guo, Bo Lei, Junqiang Hu, Xiao-Hong Xu, Wei Zhang (2022) Selfee, self-supervised features extraction of animal behaviors. eLife 11:e76218. https://doi.org/10.7554/eLife.76218


Further reading

    1. Neuroscience
    Mina Mišić, Noah Lee ... Herta Flor
    Research Article

    Chronic back pain (CBP) is a global health concern with significant societal and economic burden. While various predictors of back pain chronicity have been proposed, including demographic and psychosocial factors, neuroimaging studies have pointed to brain characteristics as predictors of CBP. However, large-scale, multisite validation of these predictors is currently lacking. In two independent longitudinal studies, we examined white matter diffusion imaging data and pain characteristics in patients with subacute back pain (SBP) over 6- and 12-month periods. Diffusion data from individuals with CBP and healthy controls (HC) were analyzed for comparison. Whole-brain tract-based spatial statistics analyses revealed that a cluster in the right superior longitudinal fasciculus (SLF) tract had larger fractional anisotropy (FA) values in patients who recovered (SBPr) compared to those with persistent pain (SBPp), and predicted changes in pain severity. The SLF FA values accurately classified patients at baseline and follow-up in a third publicly available dataset (Area under the Receiver Operating Curve ~0.70). Notably, patients who recovered had FA values larger than those of HC suggesting a potential role of SLF integrity in resilience to CBP. Structural connectivity-based models also classified SBPp and SBPr patients from the three data sets (validation accuracy 67%). Our results validate the right SLF as a robust predictor of CBP development, with potential for clinical translation. Cognitive and behavioral processes dependent on the right SLF, such as proprioception and visuospatial attention, should be analyzed in subacute stages as they could prove important for back pain chronicity.

    1. Neuroscience
    Lian Hollander-Cohen, Omer Cohen ... Berta Levavi-Sivan
    Research Article

    Life histories of oviparous species dictate high metabolic investment in the process of gonadal development leading to ovulation. In vertebrates, these two distinct processes are controlled by the gonadotropins follicle-stimulating hormone (FSH) and luteinizing hormone (LH), respectively. While it was suggested that a common secretagogue, gonadotropin-releasing hormone (GnRH), oversees both functions, the generation of loss-of-function fish challenged this view. Here, we reveal that the satiety hormone cholecystokinin (CCK) is the primary regulator of this axis in zebrafish. We found that FSH cells express a CCK receptor, and our findings demonstrate that mutating this receptor results in a severe hindrance to ovarian development. Additionally, it causes a complete shutdown of both gonadotropins secretion. Using in-vivo and ex-vivo calcium imaging of gonadotrophs, we show that GnRH predominantly activates LH cells, whereas FSH cells respond to CCK stimulation, designating CCK as the bona fide FSH secretagogue. These findings indicate that the control of gametogenesis in fish was placed under different neural circuits, that are gated by CCK.