Using the past to estimate sensory uncertainty

  1. Ulrik Beierholm (corresponding author)
  2. Tim Rohe
  3. Ambra Ferrari
  4. Oliver Stegle
  5. Uta Noppeney
  1. Durham University, United Kingdom
  2. Friedrich Alexander University Erlangen-Nuernberg, Germany
  3. University of Birmingham, United Kingdom
  4. DKFZ, Germany

Abstract

To form a more reliable percept of the environment, the brain needs to estimate its own sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. We evaluated this assumption in four psychophysical experiments in which human observers localized auditory signals that were presented synchronously with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, either continuously or with intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory uncertainty estimates that combine information from past and current signals, consistent with an optimal Bayesian learner that can be approximated by exponential discounting. Our results challenge leading models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experiences with new incoming sensory signals.
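The abstract combines two computations: reliability-weighted audiovisual fusion, and a sensory uncertainty estimate that discounts past evidence exponentially rather than relying on the current stimulus alone. The sketch below illustrates the idea in Python; it is a minimal toy, not the authors' fitted model, and the discount factor, noise levels, and forced-fusion rule are hypothetical values chosen only for illustration.

    import numpy as np

    def discounted_variance(inst_var, gamma=0.9):
        """Exponentially discounted estimate of the current sensory variance.

        inst_var : per-trial instantaneous estimates of visual noise variance
                   (hypothetical; e.g. the sample variance of a visual dot cloud).
        gamma    : assumed discount factor; gamma -> 0 recovers the purely
                   instantaneous estimate assumed by classical models.
        """
        idx = np.arange(len(inst_var))
        weights = gamma ** (idx[-1] - idx)      # most recent trial gets weight 1
        return np.sum(weights * inst_var) / np.sum(weights)

    def fused_location(x_aud, x_vis, var_aud, var_vis):
        """Reliability-weighted (forced-fusion) estimate of the auditory location."""
        w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_aud)
        return w_vis * x_vis + (1.0 - w_vis) * x_aud

    # Toy run: visual noise jumps from low to high halfway through a block.
    rng = np.random.default_rng(0)
    true_var = np.concatenate([np.full(20, 2.0 ** 2), np.full(20, 8.0 ** 2)])
    inst_var = true_var * rng.chisquare(10, size=40) / 10   # noisy per-trial variance samples

    for n in range(1, 41):
        var_vis_hat = discounted_variance(inst_var[:n], gamma=0.9)
        estimate = fused_location(x_aud=5.0, x_vis=0.0,
                                  var_aud=6.0 ** 2, var_vis=var_vis_hat)
        # Right after the jump the discounted variance lags the true noise level,
        # so the auditory report is still pulled too strongly toward the visual signal.

With gamma near 0 the variance estimate tracks only the current trial, which is the instantaneous assumption the paper challenges; with gamma closer to 1 it averages over a longer history and therefore lags behind sudden changes in visual noise, the behavioural signature the experiments test for.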

Data availability

The raw human behavioral data, the computational model predictions, and the code for the computational modelling and analysis scripts are available in an OSF repository: https://osf.io/gt4jb/

Article and author information

Author details

  1. Ulrik Beierholm

    Psychology Department, Durham University, Durham, United Kingdom
    For correspondence
    ulrik.beierholm@durham.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-7296-7996
  2. Tim Rohe

    Institute of Psychology, Friedrich Alexander University Erlangen-Nuernberg, Erlangen, Germany
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9713-3712
  3. Ambra Ferrari

    Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-1946-3884
  4. Oliver Stegle

    German Cancer Research Center (DKFZ), Heidelberg, Germany
    Competing interests
    The authors declare that no competing interests exist.
  5. Uta Noppeney

    Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.

Funding

H2020 European Research Council (ERC-multsens, 309349)

  • Uta Noppeney

Max Planck Society

  • Tim Rohe
  • Uta Noppeney

Deutsche Forschungsgemeinschaft (DFG RO 5587/1-1)

  • Tim Rohe

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: All volunteers participated in the study after giving written informed consent. The study was approved by the human research review committee of the University of Tuebingen (approval number 432 2007 BO1) and the research review committee of the University of Birmingham (approval number ERN_15-1458AP1).

Copyright

© 2020, Beierholm et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

Ulrik Beierholm, Tim Rohe, Ambra Ferrari, Oliver Stegle, Uta Noppeney (2020) Using the past to estimate sensory uncertainty. eLife 9:e54172. https://doi.org/10.7554/eLife.54172
