Hierarchical temporal prediction captures motion processing along the visual pathway

  1. Yosef Singer (corresponding author)
  2. Luke CL Taylor
  3. Ben DB Willmore
  4. Andrew J King (corresponding author)
  5. Nicol S Harper (corresponding author)
  1. University of Oxford, United Kingdom

Abstract

Visual neurons respond selectively to features that become increasingly complex from the eyes to the cortex. Retinal neurons prefer flashing spots of light, primary visual cortical (V1) neurons prefer moving bars, and those in higher cortical areas favor complex features like moving textures. Previously, we showed that V1 simple cell tuning can be accounted for by a basic model implementing temporal prediction: representing features that predict future sensory input from past input (Singer et al., 2018). Here we show that hierarchical application of temporal prediction can capture how tuning properties change across at least two levels of the visual system. This suggests that the brain does not efficiently represent all incoming information; instead, it selectively represents sensory inputs that help in predicting the future. When applied hierarchically, temporal prediction extracts time-varying features that depend on increasingly high-level statistics of the sensory input.
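To make the core idea concrete, the sketch below shows hierarchical temporal prediction in Python, the language of the study's code. Each layer is a small network trained to predict the next time step of its input from the preceding few time steps, and each higher layer is trained in the same way on the hidden activations of the layer below. The architecture, hyperparameters, and toy data are illustrative assumptions, not the authors' published implementation (see the repositories listed under Data availability).

```python
# Minimal, illustrative sketch of hierarchical temporal prediction.
# Not the authors' implementation: layer sizes, data, and training are toy choices.
import torch
import torch.nn as nn

class TemporalPredictionLayer(nn.Module):
    """Predicts the next time step of its input from the k preceding steps."""
    def __init__(self, input_dim, hidden_dim, k=5):
        super().__init__()
        self.k = k
        self.encoder = nn.Linear(k * input_dim, hidden_dim)  # past -> features
        self.decoder = nn.Linear(hidden_dim, input_dim)      # features -> predicted future

    def forward(self, x):
        # x: (time, input_dim); build windows of k consecutive past steps.
        past = torch.stack([x[t:t + self.k].reshape(-1)
                            for t in range(len(x) - self.k)])
        hidden = torch.relu(self.encoder(past))  # time-varying feature responses
        return hidden, self.decoder(hidden)      # features and predicted x[t+k]

torch.manual_seed(0)
movie = torch.randn(1000, 64)  # placeholder "movie": 1000 flattened 8x8 patches

inputs = movie
for level in range(2):  # two hierarchical levels, trained greedily
    layer = TemporalPredictionLayer(inputs.shape[1], hidden_dim=32)
    opt = torch.optim.Adam(layer.parameters(), lr=1e-3)
    for step in range(200):
        hidden, pred = layer(inputs)
        loss = ((pred - inputs[layer.k:]) ** 2).mean()  # future-prediction error
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"level {level}: final prediction loss {loss.item():.4f}")
    inputs = hidden.detach()  # the next level predicts the future of these features
```

This sketch keeps only the core objective of predicting the future from the past; the published models additionally use regularization and natural movie input, which shape the learned receptive fields compared against neural data.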

Data availability

All custom code used in this study was implemented in Python. The code for the models and analyses shown in Figures 1-8 and associated sections can be found at https://bitbucket.org/ox-ang/hierarchical_temporal_prediction/src/master/. The V1 neural response data (Ringach et al., 2002) used for comparison with the temporal prediction model in Figure 6 came from http://ringachlab.net/ ("Data & Code", "Orientation tuning in Macaque V1"). The V1 image response data used to test the models included in Figure 9 were downloaded with permission from https://github.com/sacadena/Cadena2019PlosCB (Cadena et al., 2019). The V1 movie response data used to test these models were collected in the Laboratory of Dario Ringach at UCLA and downloaded from https://crcns.org/data-sets/vc/pvc-1 (Nauhaus and Ringach, 2007; Ringach and Nauhaus, 2009). The code for the models and analyses shown in Figure 9 and the associated section can be found at https://github.com/webstorms/StackTP and https://github.com/webstorms/NeuralPred. The movies used for training the models in Figure 9 are available at https://figshare.com/articles/dataset/Natural_movies/24265498.

Article and author information

Author details

  1. Yosef Singer

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    yosef.singer@stcatz.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-4480-0574
  2. Luke CL Taylor

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
  3. Ben DB Willmore

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2969-7572
  4. Andrew J King

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    andrew.king@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5180-7179
  5. Nicol S Harper

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    nicol.harper@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7851-4840

Funding

Wellcome Trust (WT108369/Z/2015/Z)

  • Ben DB Willmore
  • Andrew J King
  • Nicol S Harper

University of Oxford Clarendon Fund

  • Yosef Singer
  • Luke CL Taylor

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Stephanie E Palmer, University of Chicago, United States

Version history

  1. Preprint posted: March 13, 2019
  2. Received: October 14, 2019
  3. Accepted: October 4, 2023
  4. Accepted Manuscript published: October 16, 2023 (version 1)
  5. Version of Record published: November 7, 2023 (version 2)

Copyright

© 2023, Singer et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

  1. Yosef Singer
  2. Luke CL Taylor
  3. Ben DB Willmore
  4. Andrew J King
  5. Nicol S Harper
(2023)
Hierarchical temporal prediction captures motion processing along the visual pathway
eLife 12:e52599.
https://doi.org/10.7554/eLife.52599

