Causal links between parietal alpha activity and spatial auditory attention
Abstract
Both visual and auditory spatial selective attention produce lateralized alpha (8-14 Hz) oscillatory power in parietal cortex: alpha increases in the hemisphere ipsilateral to the attentional focus. Brain stimulation studies suggest a causal relationship between parietal alpha and suppression of the representation of contralateral visual space. However, there is no evidence that parietal alpha controls auditory spatial attention. Here, we applied high-definition transcranial alternating current stimulation (HD-tACS) to human subjects performing an auditory task in which they directed attention based on either spatial or nonspatial features. Alpha (10 Hz), but not theta (6 Hz), HD-tACS of right parietal cortex interfered with attention to left, but not right, auditory space. Parietal stimulation had no effect on nonspatial auditory attention. Moreover, performance in post-stimulation trials returned rapidly to baseline. These results demonstrate a causal, frequency-, hemispheric-, and task-specific effect of parietal alpha brain stimulation on top-down control of auditory spatial attention.
Data availability
Data are available from Dryad at https://dx.doi.org/10.5061/dryad.c031nv7
- Data from: Causal links between parietal alpha activity and spatial auditory attention. Dryad Digital Repository, doi:10.5061/dryad.c031nv7.
Article and author information
Funding
National Institutes of Health (R01 DC015988)
- Barbara G Shinn-Cunningham
Office of Naval Research (N000141812069)
- Barbara G Shinn-Cunningham
National Institutes of Health (R01 MH-114877)
- Robert MG Reinhart
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All subjects gave informed consent, as approved by the Boston University Charles River Campus IRB, under protocol 3597E.
Reviewing Editor
- Huan Luo, Peking University, China
Version history
- Received: August 19, 2019
- Accepted: November 28, 2019
- Accepted Manuscript published: November 29, 2019 (version 1)
- Version of Record published: December 10, 2019 (version 2)
Copyright
© 2019, Deng et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.