Audiovisual task switching rapidly modulates sound encoding in mouse auditory cortex
Abstract
In everyday behavior, sensory systems are in constant competition for attentional resources, but the cellular and circuit-level mechanisms of modality-selective attention remain largely uninvestigated. We conducted translaminar recordings in mouse auditory cortex (AC) during an audiovisual (AV) attention shifting task. Attending to sound elements in an AV stream reduced both pre-stimulus and stimulus-evoked spiking activity, primarily in deep layer neurons and neurons without spectrotemporal tuning. Despite reduced spiking, stimulus decoder accuracy was preserved, suggesting improved sound encoding efficiency. Similarly, task-irrelevant mapping stimuli during intertrial intervals evoked fewer spikes without impairing stimulus encoding, indicating that attentional modulation generalized beyond training stimuli. Importantly, spiking reductions predicted trial-to-trial behavioral accuracy during auditory attention, but not visual attention. Together, these findings suggest auditory attention facilitates sound discrimination by filtering sound-irrelevant background activity in AC, and that the deepest cortical layers serve as a hub for integrating extramodal contextual information.
Data availability
Physiology and behavior data supporting all figures in this manuscript have been deposited in Dryad:
- Audiovisual task switching rapidly modulates sound encoding in mouse auditory cortex. Dryad Digital Repository, doi:10.7272/Q6BV7DVM.
Article and author information
Author details
Funding
National Institutes of Health (R01NS116598)
- Andrea R Hasenstaub
National Institutes of Health (R01DC014101)
- Andrea R Hasenstaub
National Science Foundation (GRFP)
- Ryan James Morrill
Hearing Research Incorporated
- Andrea R Hasenstaub
Klingenstein Foundation
- Andrea R Hasenstaub
Coleman Memorial Fund
- Andrea R Hasenstaub
National Institutes of Health (F32DC016846)
- James Bigelow
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Reviewing Editor
- Brice Bathellier, CNRS, France
Ethics
Animal experimentation: All experiments were approved by the Institutional Animal Care and Use Committee at the University of California, San Francisco.
Version history
- Preprint posted: November 11, 2021
- Received: November 24, 2021
- Accepted: August 17, 2022
- Accepted Manuscript published: August 18, 2022 (version 1)
- Version of Record published: August 30, 2022 (version 2)
Copyright
© 2022, Morrill et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- Views: 1,171
- Downloads: 290
- Citations: 1
Views, downloads and citations are aggregated across all versions of this paper published by eLife.