Integration of locomotion and auditory signals in the mouse inferior colliculus

  1. Yoonsun Yang
  2. Joonyeol Lee
  3. Gunsoo Kim (corresponding author)
  1. Institute for Basic Science, Republic of Korea

Abstract

The inferior colliculus (IC) is the major midbrain auditory integration center, where virtually all ascending auditory inputs converge. Although the IC has been extensively studied for sound processing, little is known about the neural activity of the IC in moving subjects, as frequently occurs under natural hearing conditions. Here, by recording neural activity in walking mice, we show that the activity of IC neurons is strongly modulated by locomotion, even in the absence of sound stimuli. Similar modulation was also found in hearing-impaired mice, demonstrating that IC neurons receive non-auditory, locomotion-related neural signals. Sound-evoked activity was attenuated during locomotion, and this attenuation increased frequency selectivity across the neuronal population while maintaining preferred frequencies. Our results suggest that during behavior, integrating movement-related and auditory information is an essential aspect of sound processing in the IC.

Data availability

All data generated or analysed during this study are included in the manuscript and supporting files. Source data files have been provided for Figures 1 through 4.

Article and author information

Author details

  1. Yoonsun Yang

    Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon, Republic of Korea
    Competing interests
    The authors declare that no competing interests exist.
  2. Joonyeol Lee

    Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon, Republic of Korea
    Competing interests
    The authors declare that no competing interests exist.
  3. Gunsoo Kim

    Center for Neuroscience Imaging Research, Institute for Basic Science, Suwon, Republic of Korea
    For correspondence
    kgunsoo@skku.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9318-8329

Funding

Institute for Basic Science (IBS-R015-D1)

  • Gunsoo Kim
  • Joonyeol Lee

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All animals were handled according to the protocol (SKKUIACUC2018-02-09-1) approved by the Institutional Animal Care and Use Committee (IACUC) of Sungkyunkwan University. Surgeries were performed under isoflurane or ketamine/xylazine anesthesia, and every effort was made to minimize suffering.

Copyright

© 2020, Yang et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 3,762 views
  • 517 downloads
  • 47 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Yoonsun Yang
  2. Joonyeol Lee
  3. Gunsoo Kim
(2020)
Integration of locomotion and auditory signals in the mouse inferior colliculus
eLife 9:e52228.
https://doi.org/10.7554/eLife.52228

