State-dependent representations of mixtures by the olfactory bulb

  1. Aliya Mari Adefuin
  2. Sander Lindeman
  3. Janine K Reinert
  4. Izumi Fukunaga (corresponding author)
  1. Okinawa Institute of Science and Technology Graduate University, Japan

Abstract

Sensory systems are often tasked with analysing complex signals from the environment, separating relevant from irrelevant parts. Decomposing such signals is challenging when a mixture does not equal the sum of its parts, leading to an unpredictable corruption of signal patterns. In olfaction, nonlinear summation is prevalent at various stages of sensory processing. Here, we investigate how the olfactory system deals with binary mixtures of odours under different brain states, using two-photon imaging of olfactory bulb (OB) output neurons. Unlike in previous studies using anaesthetised animals, we found that mixture summation is more linear in the early phase of evoked responses in awake, head-fixed mice performing an odour detection task, owing to dampened responses. Despite this, and although responses were more variable, decoding analyses indicated that the data from behaving mice were well discriminable. Curiously, the time course of decoding accuracy did not correlate strictly with the linearity of summation. Further, a comparison with naïve mice indicated that learning to perform the mixture detection task accurately is not accompanied by more linear mixture summation. Finally, using a simulation, we demonstrate that, while saturating sublinearity tends to degrade discriminability, the extent of the impairment may depend on other factors, including pattern decorrelation. Altogether, our results demonstrate that mixture representation in the primary olfactory area is state-dependent, but that analytical perception may not strictly correlate with linearity of summation.
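The abstract's central idea, that a saturating (sublinear) summation of two odour responses compresses population patterns and can reduce their discriminability, can be sketched in a minimal simulation. This is not the authors' actual analysis; the transfer function, its parameters, and the response statistics below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 100

# Hypothetical single-odour response patterns (arbitrary units, non-negative).
resp_a = rng.gamma(shape=2.0, scale=1.0, size=n_cells)
resp_b = rng.gamma(shape=2.0, scale=1.0, size=n_cells)

# Linear prediction for the binary mixture: the sum of the components.
linear_sum = resp_a + resp_b

def saturate(x, r_max=4.0, k=2.0):
    """One illustrative choice of saturating (sublinear) transfer function."""
    return r_max * x / (x + k)

# "Observed" mixture response under saturating summation.
observed_mix = saturate(linear_sum)

# Linearity of summation: correlation between observed mixture and linear sum.
linearity = np.corrcoef(linear_sum, observed_mix)[0, 1]

# Discriminability proxy: Euclidean separation between the mixture pattern
# and one component pattern, relative to a fixed noise level, with and
# without saturation.
noise_sd = 0.5
d_linear = np.linalg.norm(linear_sum - resp_a) / noise_sd
d_saturated = np.linalg.norm(observed_mix - saturate(resp_a)) / noise_sd

print(f"linearity (r): {linearity:.2f}")
print(f"mixture-vs-component separation, linear summation:    {d_linear:.1f}")
print(f"mixture-vs-component separation, saturating summation: {d_saturated:.1f}")
```

In this toy setting the saturated separation comes out smaller than the linear one, illustrating why sublinearity tends to degrade discriminability; as the abstract notes, other factors such as pattern decorrelation can offset this in richer models.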

Data availability

The files consist of individual datasets comparing the linear sum with the observed mixture responses (300–1000 ms after odour onset).

The following data sets were generated:
    1. Fukunaga I (2022) Figure 6C. Dryad Digital Repository, doi:10.5061/dryad.p2ngf1vrh.

Article and author information

Author details

  1. Aliya Mari Adefuin

    Sensory and Behavioural Neuroscience Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
    Competing interests
    The authors declare that no competing interests exist.
  2. Sander Lindeman

    Sensory and Behavioural Neuroscience Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
  3. Janine K Reinert

    Sensory and Behavioural Neuroscience Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
  4. Izumi Fukunaga

    Sensory and Behavioural Neuroscience Unit, Okinawa Institute of Science and Technology Graduate University, Okinawa, Japan
    For correspondence
    izumi.fukunaga@oist.jp
    ORCID: 0000-0003-1860-5377

Funding

Okinawa Institute of Science and Technology Graduate University

  • Aliya Mari Adefuin
  • Sander Lindeman
  • Janine K Reinert
  • Izumi Fukunaga

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All procedures described in this study were approved by the OIST Graduate University's Animal Care and Use Committee (Protocols 2016-151 and 2020-310).

Copyright

© 2022, Adefuin et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 1,553 views
  • 221 downloads
  • 6 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.


https://doi.org/10.7554/eLife.76882

Further reading

    1. Computational and Systems Biology
    2. Neuroscience
    Brian DePasquale, Carlos D Brody, Jonathan W Pillow
    Research Article Updated

    Accumulating evidence to make decisions is a core cognitive function. Previous studies have tended to estimate accumulation using either neural or behavioral data alone. Here, we develop a unified framework for modeling stimulus-driven behavior and multi-neuron activity simultaneously. We applied our method to choices and neural recordings from three rat brain regions—the posterior parietal cortex (PPC), the frontal orienting fields (FOF), and the anterior-dorsal striatum (ADS)—while subjects performed a pulse-based accumulation task. Each region was best described by a distinct accumulation model, which all differed from the model that best described the animal’s choices. FOF activity was consistent with an accumulator where early evidence was favored while the ADS reflected near perfect accumulation. Neural responses within an accumulation framework unveiled a distinct association between each brain region and choice. Choices were better predicted from all regions using a comprehensive, accumulation-based framework and different brain regions were found to differentially reflect choice-related accumulation signals: FOF and ADS both reflected choice but ADS showed more instances of decision vacillation. Previous studies relating neural data to behaviorally inferred accumulation dynamics have implicitly assumed that individual brain regions reflect the whole-animal level accumulator. Our results suggest that different brain regions represent accumulated evidence in dramatically different ways and that accumulation at the whole-animal level may be constructed from a variety of neural-level accumulators.

    1. Genetics and Genomics
    2. Neuroscience
    Tanya Wolff, Mark Eddison ... Gerald M Rubin
    Research Article

    The central complex (CX) plays a key role in many higher-order functions of the insect brain including navigation and activity regulation. Genetic tools for manipulating individual cell types, and knowledge of what neurotransmitters and neuromodulators they express, will be required to gain mechanistic understanding of how these functions are implemented. We generated and characterized split-GAL4 driver lines that express in individual or small subsets of about half of CX cell types. We surveyed neuropeptide and neuropeptide receptor expression in the central brain using fluorescent in situ hybridization. About half of the neuropeptides we examined were expressed in only a few cells, while the rest were expressed in dozens to hundreds of cells. Neuropeptide receptors were expressed more broadly and at lower levels. Using our GAL4 drivers to mark individual cell types, we found that 51 of the 85 CX cell types we examined expressed at least one neuropeptide and 21 expressed multiple neuropeptides. Surprisingly, all co-expressed a small molecule neurotransmitter. Finally, we used our driver lines to identify CX cell types whose activation affects sleep, and identified other central brain cell types that link the circadian clock to the CX. The well-characterized genetic tools and information on neuropeptide and neurotransmitter expression we provide should enhance studies of the CX.