The sifting of visual information in the superior colliculus

  1. Kyu Hyun Lee
  2. Alvita Tran
  3. Zeynep Turan
  4. Markus Meister (corresponding author)
  1. California Institute of Technology, United States

Abstract

Much of the early visual system is devoted to sifting the visual scene for the few bits of behaviorally relevant information. In the visual cortex of mammals, a hierarchical system of brain areas leads eventually to the selective encoding of important features, like faces and objects. Here we report that a similar process occurs in the other major visual pathway, the superior colliculus. We investigated the visual response properties of collicular neurons in the awake mouse with large-scale electrophysiology. Compared to the superficial collicular layers, neuronal responses in the deeper layers become more selective for behaviorally relevant stimuli; more invariant to the location of stimuli in the visual field; and more suppressed by repeated occurrence of a stimulus in the same location. This memory of familiar stimuli persists in the complete absence of the visual cortex. Models of these neural computations lead to specific predictions for the neural circuitry of the superior colliculus.

Data availability

The data used in the manuscript, together with the analysis code, are available on CaltechDATA under accession number 1401 (doi:10.22002/D1.1401). The code for generating Figure 2 (fig2.m) is included.
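As a minimal sketch of how to locate the dataset programmatically, the DOI above can be turned into a standard resolver URL; the exact layout of the CaltechDATA record behind that redirect is not specified here and is an assumption:

```python
# Build the standard DOI resolver URL for the dataset cited above.
# Opening this URL (e.g., in a browser or with an HTTP client) redirects
# to the CaltechDATA record hosting the data and the fig2.m script.
doi = "10.22002/D1.1401"
resolver_url = f"https://doi.org/{doi}"
print(resolver_url)  # → https://doi.org/10.22002/D1.1401
```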


Article and author information

Author details

  1. Kyu Hyun Lee

    Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States
    Competing interests
    No competing interests declared.
  2. Alvita Tran

    Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States
    Competing interests
    No competing interests declared.
  3. Zeynep Turan

    Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States
    Competing interests
    No competing interests declared.
  4. Markus Meister

    Division of Biology and Biological Engineering, California Institute of Technology, Pasadena, United States
    For correspondence
    meister@caltech.edu
    Competing interests
    Markus Meister is a Reviewing Editor at eLife.
    ORCID iD: 0000-0003-2136-6506

Funding

Simons Foundation (543015SPI)

  • Markus Meister

National Science Foundation (Graduate Research Fellowship)

  • Alvita Tran

National Institutes of Health (1R01NS111477)

  • Markus Meister

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: This study was performed according to approved institutional animal care and use committee (IACUC) protocols (#1656) of Caltech. All surgery was performed under isoflurane anesthesia and every effort was made to minimize suffering.

Copyright

© 2020, Lee et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 8,225 views
  • 1,188 downloads
  • 94 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Kyu Hyun Lee
  2. Alvita Tran
  3. Zeynep Turan
  4. Markus Meister
(2020)
The sifting of visual information in the superior colliculus
eLife 9:e50678.
https://doi.org/10.7554/eLife.50678

