Rodent ultrasonic vocal interaction resolved with millimeter precision using hybrid beamforming

  1. Max L Sterling
  2. Ruben Teunisse
  3. Bernhard Englitz (corresponding author)
  1. Radboud University Nijmegen, Netherlands

Abstract

Ultrasonic vocalizations (USVs) fulfill an important role in communication and navigation in many species. Because of their social and affective significance, rodent USVs are increasingly used as a behavioral measure in neurodevelopmental and neurolinguistic research. Reliably attributing USVs to their emitter during close interactions has emerged as a difficult but crucial challenge. If addressed, all subsequent analyses gain substantial confidence. We present a hybrid ultrasonic tracking system, HyVL, that synergistically integrates a high-resolution acoustic camera with high-quality ultrasonic microphones. HyVL is the first to achieve millimeter precision (~3.4–4.8 mm, 91% assigned) in localizing USVs, ~3× better than other systems, approaching the physical limits (mouse snout ~10 mm). We analyze mouse courtship interactions and demonstrate that males and females vocalize in starkly different relative spatial positions, and that the fraction of female vocalizations has likely been overestimated previously due to imprecise localization. Further, we find that when two male mice interact with one female, one of the males takes a dominant role in the interaction, both in terms of vocalization rate and of location relative to the female. HyVL substantially improves the precision with which social communication between rodents can be studied. It is also affordable, open-source, easy to set up and integrate with existing setups, and reduces the required number of experiments and animals.
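For readers unfamiliar with the principle behind acoustic-camera localization, the sketch below illustrates grid-based delay-and-sum beamforming, the general technique on which such systems rely. It is a minimal, hypothetical example and not the HyVL implementation: the microphone layout, sample rate, grid spacing, and signal are placeholder assumptions.

```python
# Minimal, illustrative delay-and-sum beamforming sketch (not the HyVL code;
# microphone layout, sample rate, and signal below are hypothetical placeholders).
import numpy as np

def delay_and_sum_map(signals, mic_xy, grid_xy, fs, c=343.0):
    """Return beamformed power for each candidate source position.

    signals : (n_mics, n_samples) time-aligned recordings of one USV snippet
    mic_xy  : (n_mics, 2) microphone positions [m]
    grid_xy : (n_points, 2) candidate source positions [m]
    fs      : sample rate [Hz]; c : speed of sound in air [m/s]
    """
    n_mics, n_samples = signals.shape
    power = np.zeros(len(grid_xy))
    for i, src in enumerate(grid_xy):
        # Propagation delay from the candidate source to each microphone,
        # expressed relative to the nearest microphone.
        dists = np.linalg.norm(mic_xy - src, axis=1)
        shifts = np.round((dists - dists.min()) / c * fs).astype(int)
        # Advance each channel by its relative delay so the wavefronts align,
        # then sum; a true source location yields coherent (high-power) summation.
        aligned = np.zeros(n_samples)
        for m in range(n_mics):
            aligned[:n_samples - shifts[m]] += signals[m, shifts[m]:]
        power[i] = np.mean(aligned ** 2)
    return power

# Hypothetical usage: 4 microphones at the corners of a 30 x 30 cm arena,
# candidate grid at 5 mm spacing, placeholder noise standing in for a USV.
fs = 250_000
mic_xy = np.array([[0, 0], [0.3, 0], [0, 0.3], [0.3, 0.3]])
gx, gy = np.meshgrid(np.arange(0, 0.3, 0.005), np.arange(0, 0.3, 0.005))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
signals = np.random.randn(4, 4096)
est_xy = grid_xy[np.argmax(delay_and_sum_map(signals, mic_xy, grid_xy, fs))]
```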

Data availability

All code necessary to implement the HyVL system has been deposited at https://github.com/benglitz/HyVL and https://doi.org/10.34973/7kgc-ta72. All data have been made available at https://doi.org/10.34973/7kgc-ta72.

Article and author information

Author details

  1. Max L Sterling

    Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
    Competing interests
    The authors declare that no competing interests exist.
  2. Ruben Teunisse

    Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
    Competing interests
    The authors declare that no competing interests exist.
  3. Bernhard Englitz

    Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands
    For correspondence
    englitz@science.ru.nl
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9106-0356

Funding

DCN Internal Grant, Noldus IT

  • Bernhard Englitz

NWO VIDI grant (016.VIDI.189.052)

  • Bernhard Englitz

Technology Hotel Grant, ZonMW (40-43500-98-4141)

  • Bernhard Englitz

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All animal care and experimental procedures were conducted according to the guidelines of the Animal Welfare Body of the Central Animal Facility at Radboud University. The protocol was approved by the Dutch National Committee CCD (permit number: 2017-0041).

Copyright

© 2023, Sterling et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Max L Sterling, Ruben Teunisse, Bernhard Englitz (2023) Rodent ultrasonic vocal interaction resolved with millimeter precision using hybrid beamforming. eLife 12:e86126. https://doi.org/10.7554/eLife.86126

