Hierarchical temporal prediction captures motion processing along the visual pathway

  1. Yosef Singer (corresponding author)
  2. Luke CL Taylor
  3. Ben DB Willmore
  4. Andrew J King (corresponding author)
  5. Nicol S Harper (corresponding author)

  Affiliation: University of Oxford, United Kingdom

Abstract

Visual neurons respond selectively to features that become increasingly complex from the eyes to the cortex. Retinal neurons prefer flashing spots of light, primary visual cortical (V1) neurons prefer moving bars, and those in higher cortical areas favor complex features like moving textures. Previously, we showed that V1 simple cell tuning can be accounted for by a basic model implementing temporal prediction - representing features that predict future sensory input from past input (Singer et al., 2018). Here we show that hierarchical application of temporal prediction can capture how tuning properties change across at least two levels of the visual system. This suggests that the brain does not efficiently represent all incoming information; instead, it selectively represents sensory inputs that help in predicting the future. When applied hierarchically, temporal prediction extracts time-varying features that depend on increasingly high-level statistics of the sensory input.
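As a rough illustration of the idea (this is not the authors' released code; the two-stage depth, dense linear layers, five-frame window, and layer sizes are assumptions chosen for brevity), each stage of a hierarchical temporal prediction model can be trained to predict its next input frame from a window of past frames, with its hidden units then serving as the next stage's input:

    # Minimal sketch of hierarchical temporal prediction (illustrative only):
    # each stage learns to predict its next input frame from a short window
    # of past frames, and its hidden units become the next stage's input.
    import torch
    import torch.nn as nn

    class TemporalPredictionLayer(nn.Module):
        """One stage: encode `window` past frames, predict the next frame."""
        def __init__(self, input_dim, hidden_dim, window):
            super().__init__()
            self.window = window
            self.encoder = nn.Linear(input_dim * window, hidden_dim)
            self.decoder = nn.Linear(hidden_dim, input_dim)

        def forward(self, past):
            # past: (batch, window, input_dim)
            hidden = torch.relu(self.encoder(past.flatten(start_dim=1)))
            return hidden, self.decoder(hidden)

    def train_layer(layer, frames, epochs=5, lr=1e-3):
        """Fit one stage by minimising next-frame mean-squared prediction error."""
        opt = torch.optim.Adam(layer.parameters(), lr=lr)
        w = layer.window
        for _ in range(epochs):
            for t in range(w, frames.shape[1]):
                hidden, pred = layer(frames[:, t - w:t])
                loss = nn.functional.mse_loss(pred, frames[:, t])
                opt.zero_grad()
                loss.backward()
                opt.step()
        return layer

    def hidden_sequence(layer, frames):
        """Run a trained stage over time; its codes feed the next stage."""
        w = layer.window
        with torch.no_grad():
            codes = [layer(frames[:, t - w:t])[0] for t in range(w, frames.shape[1])]
        return torch.stack(codes, dim=1)

    # Toy usage: a random "movie" of flattened 8x8 patches, two stacked stages.
    movie = torch.randn(4, 100, 64)          # (batch, time, pixels)
    stage1 = train_layer(TemporalPredictionLayer(64, 32, window=5), movie)
    codes = hidden_sequence(stage1, movie)   # time course of learned features
    stage2 = train_layer(TemporalPredictionLayer(32, 16, window=5), codes)

Under this scheme, the first stage's hidden units learn temporally predictive low-level features, and the second stage, receiving only those features, can capture predictive structure in higher-order statistics of the input.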

Data availability

All custom code used in this study was implemented in Python. The code for the models and analyses shown in Figures 1-8 and associated sections can be found at https://bitbucket.org/ox-ang/hierarchical_temporal_prediction/src/master/. The V1 neural response data (Ringach et al., 2002) used for comparison with the temporal prediction model in Figure 6 came from http://ringachlab.net/ ("Data & Code", "Orientation tuning in Macaque V1"). The V1 image response data used to test the models included in Figure 9 were downloaded with permission from https://github.com/sacadena/Cadena2019PlosCB (Cadena et al., 2019). The V1 movie response data used to test these models were collected in the Laboratory of Dario Ringach at UCLA and downloaded from https://crcns.org/data-sets/vc/pvc-1 (Nauhaus and Ringach, 2007; Ringach and Nauhaus, 2009). The code for the models and analyses shown in Figure 9 and the associated section can be found at https://github.com/webstorms/StackTP and https://github.com/webstorms/NeuralPred. The movies used for training the models in Figure 9 are available at https://figshare.com/articles/dataset/Natural_movies/24265498.

Article and author information

Author details

  1. Yosef Singer

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    yosef.singer@stcatz.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-4480-0574
  2. Luke CL Taylor

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
  3. Ben DB Willmore

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-2969-7572
  4. Andrew J King

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    andrew.king@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-5180-7179
  5. Nicol S Harper

    Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
    For correspondence
    nicol.harper@dpag.ox.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7851-4840

Funding

Wellcome Trust (WT108369/Z/2015/Z)

  • Ben DB Willmore
  • Andrew J King
  • Nicol S Harper

University of Oxford Clarendon Fund

  • Yosef Singer
  • Luke CL Taylor

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2023, Singer et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 1,155 views
  • 167 downloads
  • 11 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

Singer Y, Taylor LCL, Willmore BDB, King AJ, Harper NS (2023) Hierarchical temporal prediction captures motion processing along the visual pathway. eLife 12:e52599. https://doi.org/10.7554/eLife.52599

Further reading

    1. Neuroscience
    Damian Koevoet, Laura Van Zantwijk ... Christoph Strauch
    Research Article

    What determines where we move our eyes? We recently showed that pupil size, a well-established marker of effort, also reflects the effort associated with making a saccade (‘saccade costs’). Here, we demonstrate that saccade costs critically drive saccade selection: when choosing between any two saccade directions, participants consistently preferred the less costly direction. Strikingly, this principle held even during search in natural scenes in two additional experiments. When cognitive demand was increased experimentally through an auditory counting task, participants made fewer saccades and especially cut costly directions. This suggests that the eye-movement system and other cognitive operations consume similar resources that are flexibly allocated between them as cognitive demand changes. Together, we argue that eye-movement behavior is tuned to adaptively minimize saccade-inherent effort.

    1. Evolutionary Biology
    2. Neuroscience
    Jenny Chen, Phoebe R Richardson ... Hopi E Hoekstra
    Research Article

    Genetic variation is known to contribute to variation in animal social behavior, but the molecular mechanisms that lead to behavioral differences are still not fully understood. Here, we investigate the cellular evolution of the hypothalamic preoptic area (POA), a brain region that plays a critical role in social behavior, across two sister species of deer mice (Peromyscus maniculatus and P. polionotus) with divergent social systems. These two species exhibit large differences in mating and parental care behavior, both between species and between sexes. Using single-nucleus RNA-sequencing, we build a cellular atlas of the POA for males and females of both Peromyscus species. We identify four cell types that are differentially abundant across species, two of which may account for species differences in parental care behavior based on the known functions of these cell types. Our data further implicate two sex-biased cell types as important for the evolution of sex-specific behavior. Finally, we show a remarkable reduction of sex-biased gene expression in P. polionotus, a monogamous species that also exhibits reduced sexual dimorphism in parental care behavior. Our POA atlas is a powerful resource to investigate how molecular neuronal traits may be evolving to give rise to innate differences in social behavior across animal species.