Interacting rhythms enhance sensitivity of target detection in a fronto-parietal computational model of visual attention

  1. Amélie Aussel (corresponding author)
  2. Ian C Fiebelkorn
  3. Sabine Kastner
  4. Nancy J Kopell
  5. Benjamin Rafael Pittman-Polletta PhD
  1. Boston University, United States
  2. University of Rochester, United States
  3. Princeton University, United States

Abstract

Even during sustained attention, enhanced processing of attended stimuli waxes and wanes rhythmically, with periods of enhanced and relatively diminished visual processing (and subsequent target detection) alternating at 4 or 8 Hz in a sustained visual attention task. These alternating attentional states occur alongside alternating dynamical states, in which lateral intraparietal cortex (LIP), the frontal eye field (FEF), and the mediodorsal pulvinar (mdPul) exhibit different activity and functional connectivity at α, β, and γ frequencies (rhythms associated with visual processing, working memory, and motor suppression, respectively). To assess whether and how these multiple interacting rhythms contribute to periodicity in attention, we propose a detailed computational model of FEF and LIP. When driven by θ-rhythmic inputs simulating experimentally observed mdPul activity, this model reproduced the rhythmic dynamics and behavioral consequences of observed attentional states, revealing that the frequencies and mechanisms of the observed rhythms allow for peak sensitivity in visual target detection while maintaining functional flexibility.

Data availability

The current manuscript is a computational study, so no data have been generated for this manuscript. Modelling code is available in the ModelDB open repository.

Article and author information

Author details

  1. Amélie Aussel

    Cognitive Rhythms Collaborative, Boston University, Boston, United States
    For correspondence
    aaussel@bu.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-0498-2905
  2. Ian C Fiebelkorn

    Department of Neuroscience, University of Rochester, Rochester, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Sabine Kastner

    Princeton Neuroscience Institute, Princeton University, Princeton, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-9742-965X
  4. Nancy J Kopell

    Cognitive Rhythms Collaborative, Boston University, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
  5. Benjamin Rafael Pittman-Polletta PhD

    Department of Mathematics and Statistics, Boston University, Boston, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-6798-7191

Funding

National Institutes of Health (P50 MH109429)

  • Ian C Fiebelkorn
  • Sabine Kastner
  • Nancy J Kopell
  • Benjamin Rafael Pittman-Polletta PhD

National Institute of Mental Health (R01-MH64043)

  • Ian C Fiebelkorn
  • Sabine Kastner

National Eye Institute (R01-EY017699)

  • Ian C Fiebelkorn
  • Sabine Kastner

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2023, Aussel et al.

This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 975
    views
  • 207
    downloads
  • 5
    citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.


Cite this article

  1. Amélie Aussel
  2. Ian C Fiebelkorn
  3. Sabine Kastner
  4. Nancy J Kopell
  5. Benjamin Rafael Pittman-Polletta PhD
(2023)
Interacting rhythms enhance sensitivity of target detection in a fronto-parietal computational model of visual attention
eLife 12:e67684.
https://doi.org/10.7554/eLife.67684

