Statistical learning attenuates visual activity only for attended stimuli
Abstract
Perception and behavior can be guided by predictions, which are often based on learned statistical regularities. Neural responses to expected stimuli are frequently found to be attenuated after statistical learning. However, whether this sensory attenuation following statistical learning occurs automatically or depends on attention remains unknown. In the present fMRI study, we exposed human volunteers to sequentially presented object stimuli, in which the first object predicted the identity of the second object. We observed a reliable attenuation of neural activity for expected compared to unexpected stimuli in the ventral visual stream. Crucially, this sensory attenuation was only apparent when stimuli were attended, and vanished when attention was directed away from the predictable objects. These results put important constraints on neurocomputational theories that cast perception as a process of probabilistic integration of prior knowledge and sensory information.
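The central statistical claim of the abstract is an interaction between expectation and attention: expectation suppression (the difference between unexpected and expected responses) is present for attended stimuli but absent when attention is directed elsewhere. As a minimal, hypothetical sketch of how such an interaction could be tested, the code below uses a made-up array `betas` of ROI-averaged response estimates per subject and simple paired tests; it is illustrative only and does not reproduce the authors' actual analysis pipeline.

    import numpy as np
    from scipy import stats

    # Hypothetical ROI-averaged response estimates, shape (subjects, attention, expectation):
    # attention axis: 0 = attended, 1 = unattended
    # expectation axis: 0 = expected, 1 = unexpected
    rng = np.random.default_rng(0)
    betas = rng.normal(size=(30, 2, 2))  # placeholder data

    # Expectation suppression = unexpected minus expected response,
    # computed per subject within each attention condition.
    supp_attended = betas[:, 0, 1] - betas[:, 0, 0]
    supp_unattended = betas[:, 1, 1] - betas[:, 1, 0]

    # Interaction: is suppression larger for attended than for unattended stimuli?
    t, p = stats.ttest_rel(supp_attended, supp_unattended)
    print(f"attention x expectation interaction: t = {t:.2f}, p = {p:.3f}")

    # Simple effects: is suppression reliable within each attention condition?
    print("attended:  ", stats.ttest_1samp(supp_attended, 0.0))
    print("unattended:", stats.ttest_1samp(supp_unattended, 0.0))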
Data availability
All data and code necessary to replicate the reported results are available via the following URL: http://hdl.handle.net/11633/aacg3rkw
- Attentional modulation of perceptual predictions (dataset identifier: di.dccn.DSC_3018028.03_962)
Article and author information
Funding
- Nederlandse Organisatie voor Wetenschappelijk Onderzoek (Vidi Grant 452-13-016): Floris P de Lange
- Horizon 2020 Framework Programme (ERC Starting Grant 678286): Floris P de Lange
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: The study followed the institutional guidelines of the local ethics committee (CMO region Arnhem-Nijmegen, The Netherlands; Protocol CMO2014/288), and all participants gave informed consent.
Reviewing Editor
- Thorsten Kahnt, Northwestern University Feinberg School of Medicine, United States
Version history
- Received: April 23, 2019
- Accepted: August 21, 2019
- Accepted Manuscript published: August 23, 2019 (version 1)
- Version of Record published: September 6, 2019 (version 2)
Copyright
© 2019, Richter & de Lange
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- Page views: 2,475
- Downloads: 318
- Citations: 33
Article citation count generated by polling the highest count across the following sources: Scopus, Crossref, PubMed Central.
Further reading
- Neuroscience
The lateral geniculate nucleus (LGN), a retinotopic relay center where visual inputs from the retina are processed and relayed to the visual cortex, has been proposed as a potential target for artificial vision. At present, it is unknown whether optogenetic LGN stimulation is sufficient to elicit behaviorally relevant percepts, and the properties of LGN neural responses relevant for artificial vision have not been thoroughly characterized. Here, we demonstrate that tree shrews pretrained on a visual detection task can detect optogenetic LGN activation using an AAV2-CamKIIα-ChR2 construct and readily generalize from visual to optogenetic detection. Simultaneous recordings of LGN spiking activity and primary visual cortex (V1) local field potentials (LFPs) during optogenetic LGN stimulation showed that LGN neurons reliably follow optogenetic stimulation at frequencies up to 60 Hz and uncovered striking phase locking between the V1 LFP and the evoked spiking activity in the LGN. These phase relationships were maintained over a broad range of LGN stimulation frequencies, up to 80 Hz, with spike-field coherence values favoring higher frequencies, indicating the ability to relay temporally precise information to V1 using light activation of the LGN. Finally, V1 LFP responses showed sensitivity to LGN optogenetic activation that was similar to the animal's behavioral performance. Taken together, our findings confirm the LGN as a potential target for visual prosthetics in a highly visual mammal closely related to primates.
- Neuroscience
Hippocampal place cell sequences have been hypothesized to serve purposes as diverse as the induction of synaptic plasticity, the formation and consolidation of long-term memories, and navigation and planning. During spatial behaviors of rodents, sequential firing of place cells at the theta timescale (known as theta sequences) encodes running trajectories, which can be considered one-dimensional behavioral sequences of traversed locations. In a two-dimensional space, however, each single location can be visited along arbitrary one-dimensional running trajectories. Thus, a place cell will generally take part in multiple different theta sequences, raising the question of how this two-dimensional topology can be reconciled with the idea of hippocampal sequences underlying memory of (one-dimensional) episodes. Here, we propose a computational model of cornu ammonis 3 (CA3) and dentate gyrus (DG), in which sensorimotor input drives direction-dependent (extrinsic) theta sequences within CA3 that reflect the two-dimensional spatial topology, whereas the intrahippocampal CA3-DG projections concurrently produce intrinsic sequences that are independent of the specific running trajectory. Consistent with experimental data, intrinsic theta sequences are less prominent, but can nevertheless be detected during theta activity, thereby serving as running-direction-independent landmark cues. We hypothesize that the intrinsic sequences largely reflect replay and preplay activity during non-theta states.