Analysis of ultrasonic vocalizations from mice using computer vision and machine learning

  1. Antonio H O Fonseca (corresponding author)
  2. Gustavo M Santana (corresponding author)
  3. Gabriela M Bosque Ortiz (corresponding author)
  4. Sérgio Bampi (corresponding author)
  5. Marcelo O Dietrich (corresponding author)
  1. Yale University, United States
  2. Yale University School of Medicine, United States
  3. Universidade Federal do Rio Grande do Sul, Brazil

Abstract

Mice emit ultrasonic vocalizations (USVs) that communicate socially relevant information. To detect and classify these USVs, here we describe VocalMat. VocalMat is a software tool that uses image-processing and differential geometry approaches to detect USVs in audio files, eliminating the need for user-defined parameters. VocalMat also uses computer vision and machine learning methods to classify USVs into distinct categories. In a dataset of >4,000 USVs emitted by mice, VocalMat detected over 98% of manually labeled USVs and accurately classified ~86% of the USVs into one of eleven categories. We then used dimensionality reduction tools to analyze the probability distribution of USV classification among different experimental groups, providing a robust method to quantify and qualify the vocal repertoire of mice. Thus, VocalMat makes it possible to perform automated, accurate, and quantitative analysis of USVs without the need for user inputs, opening the opportunity for detailed and high-throughput analysis of this behavior.
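As a rough illustration of the detection approach summarized above (treating the spectrogram as an image and segmenting high-intensity regions into candidate USVs), the sketch below shows how such a step might look in Python. The library choices (scipy, scikit-image), the function name detect_candidate_usvs, and all thresholds are illustrative assumptions; this is not VocalMat's actual pipeline, whose detection works without user-defined parameters and whose classification relies on a trained machine learning model.

```python
# Minimal sketch (illustrative only, not VocalMat's implementation):
# view the spectrogram as an image, threshold high-intensity pixels,
# and keep connected components as candidate USVs.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def detect_candidate_usvs(wav_path, min_freq_khz=45.0, min_duration_s=0.005):
    fs, audio = wavfile.read(wav_path)           # assumes a mono, high-sample-rate recording
    f, t, sxx = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
    img = 10 * np.log10(sxx + 1e-12)             # power in dB, treated as an image

    keep = f >= min_freq_khz * 1e3               # ignore the audible band
    img, f = img[keep], f[keep]

    mask = img > threshold_otsu(img)             # global intensity threshold (illustrative choice)
    candidates = []
    for region in regionprops(label(mask)):
        rows, cols = region.coords[:, 0], region.coords[:, 1]
        duration = t[cols.max()] - t[cols.min()]
        if duration >= min_duration_s:           # discard very short blobs, likely noise
            candidates.append({
                "t_start": t[cols.min()], "t_end": t[cols.max()],
                "f_min_hz": f[rows.min()], "f_max_hz": f[rows.max()],
            })
    return candidates
```

A practical detector would also need contrast normalization and noise rejection; the point of VocalMat is that these decisions are made without user-tuned thresholds, with classification handled downstream by machine learning.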

Data availability

All data and code used in this work are publicly available at the links below:
https://osf.io/bk2uj/
https://www.dietrich-lab.org/vocalmat
This information is also provided in the manuscript in section 4.12 (Code and data availability).


Article and author information

Author details

  1. Antonio H O Fonseca

    Comparative Medicine, Yale University, New Haven, United States
    For correspondence
    antonio.fonseca@yale.edu
    Competing interests
    The authors declare that no competing interests exist.
  2. Gustavo M Santana

    Comparative Medicine, Yale University School of Medicine, New Haven, United States
    For correspondence
    gustavo.santana@yale.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-1897-1625
  3. Gabriela M Bosque Ortiz

    Comparative Medicine, Yale University School of Medicine, New Haven, United States
    For correspondence
    gabriela.bosque@yale.edu
    Competing interests
    The authors declare that no competing interests exist.
  4. Sérgio Bampi

    Computer Sciences, Universidade Federal do Rio Grande do Sul, Porto Alegre, Brazil
    For correspondence
    bampi@inf.ufrgs.br
    Competing interests
    The authors declare that no competing interests exist.
  5. Marcelo O Dietrich

    Comparative Medicine, Yale University School of Medicine, New Haven, United States
    For correspondence
    marcelo.dietrich@yale.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9781-2221

Funding

National Institute of Diabetes and Digestive and Kidney Diseases

  • Marcelo O Dietrich

Howard Hughes Medical Institute (Gilliam Fellowship)

  • Gabriela M Bosque Ortiz
  • Marcelo O Dietrich

Brain and Behavior Research Foundation

  • Marcelo O Dietrich

Whitehall Foundation

  • Marcelo O Dietrich

Charles H. Hood Foundation

  • Marcelo O Dietrich

Foundation for Prader-Willi Research

  • Marcelo O Dietrich

Reginald and Michiko Spector Award in Neuroscience

  • Marcelo O Dietrich

Conselho Nacional de Desenvolvimento Científico e Tecnológico

  • Sérgio Bampi
  • Marcelo O Dietrich

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

  • Antonio H O Fonseca
  • Gustavo M Santana
  • Sérgio Bampi
  • Marcelo O Dietrich

Yale Center for Clinical Investigation Scholar Award

  • Marcelo O Dietrich

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Tali Kimchi, Weizmann Institute of Science, Israel

Ethics

Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. The protocol was reviewed and approved by the Yale University Institutional Animal Care and Use Committee (IACUC). All of the animals were handled according to the approved IACUC protocol (#2018-20042) of the Yale University School of Medicine.

Version history

  1. Received: May 21, 2020
  2. Accepted: March 30, 2021
  3. Accepted Manuscript published: March 31, 2021 (version 1)
  4. Version of Record published: April 20, 2021 (version 2)

Copyright

© 2021, Fonseca et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 6,570 views
  • 584 downloads
  • 58 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.


Cite this article

Antonio H O Fonseca, Gustavo M Santana, Gabriela M Bosque Ortiz, Sérgio Bampi, Marcelo O Dietrich (2021) Analysis of ultrasonic vocalizations from mice using computer vision and machine learning. eLife 10:e59161. https://doi.org/10.7554/eLife.59161

