Visual attention modulates the integration of goal-relevant evidence and not value

  1. Pradyumna Sepulveda (corresponding author)
  2. Marius Usher
  3. Ned Davies
  4. Amy A Benson
  5. Pietro Ortoleva
  6. Benedetto De Martino (corresponding author)
  1. University College London, United Kingdom
  2. Tel Aviv University, Israel
  3. Princeton University, United States

Abstract

When choosing between options, such as food items presented in plain view, people tend to choose the option they spend longer looking at. The prevailing interpretation is that visual attention increases value. However, in previous studies, 'value' was coupled to a behavioural goal, since subjects had to choose the item they preferred. This makes it impossible to discern if visual attention has an effect on value, or, instead, if attention modulates the information most relevant for the goal of the decision-maker. Here we present the results of two independent studies—a perceptual and a value-based task—that allow us to decouple value from goal-relevant information using specific task-framing. Combining psychophysics with computational modelling, we show that, contrary to the current interpretation, attention does not boost value, but instead it modulates goal-relevant information. This work provides a novel and more general mechanism by which attention interacts with choice.
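The abstract's contrast between the two hypotheses can be made concrete with a toy simulation. The sketch below is a minimal, illustrative gaze-weighted accumulation model in the style of the attentional drift-diffusion model (aDDM; Krajbich et al., 2010), modified so that gaze weights goal-relevant evidence rather than value itself. It is not the authors' fitted model: the parameter values, fixation schedule, and the `simulate_trial` function are assumptions chosen only to show the qualitative prediction — under a "choose the best" frame attention favours the higher-value item, while under a "choose the worst" frame it favours the lower-value item.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(v_left, v_right, goal="max", d=0.002, theta=0.3,
                   sigma=0.02, bound=1.0, max_steps=20000):
    """Gaze-weighted evidence accumulation (aDDM-style sketch).

    goal="max": choose the higher-value item (evidence = value).
    goal="min": choose the lower-value item (evidence = -value).
    Parameter values are illustrative, not fitted to data.
    """
    # Goal-relevant evidence: value in a "pick the best" frame,
    # negated value in a "pick the worst" frame.
    sign = 1.0 if goal == "max" else -1.0
    e_left, e_right = sign * v_left, sign * v_right

    rdv = 0.0                        # relative decision variable (left - right)
    look_left = rng.random() < 0.5   # random initial fixation
    for step in range(max_steps):
        if step > 0 and step % 400 == 0:   # alternate fixations
            look_left = not look_left
        # The unattended item's evidence is discounted by theta < 1,
        # so attention amplifies the currently fixated evidence.
        if look_left:
            drift = d * (e_left - theta * e_right)
        else:
            drift = d * (theta * e_left - e_right)
        rdv += drift + sigma * rng.normal()
        if rdv >= bound:
            return "left"
        if rdv <= -bound:
            return "right"
    return "left" if rdv > 0 else "right"

# Same item values, opposite goals: attention helps whichever item
# carries more goal-relevant evidence, not the more valuable one per se.
for goal in ("max", "min"):
    choices = [simulate_trial(v_left=4.0, v_right=2.0, goal=goal)
               for _ in range(500)]
    print(goal, round(choices.count("left") / len(choices), 2))
```

With the left item more valuable, the simulated choice share of the left item is well above chance under the "max" goal and well below chance under the "min" goal, mirroring the paper's claim that attention modulates goal-relevant information rather than boosting value.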

Data availability

Data and the codes used for this study have been deposited at the Brain Decision Modelling Lab GitHub (https://github.com/BDMLab).

Article and author information

Author details

  1. Pradyumna Sepulveda

    Institute of Cognitive Neuroscience, University College London, London, United Kingdom
    For correspondence
    p.sepulveda@ucl.ac.uk
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0003-0159-6777
  2. Marius Usher

    School of Psychological Sciences and Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0001-8041-9060
  3. Ned Davies

    Institute of Cognitive Neuroscience, University College London, London, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
  4. Amy A Benson

    Institute of Cognitive Neuroscience, University College London, London, United Kingdom
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0002-8239-5266
  5. Pietro Ortoleva

    Department of Economics and Woodrow Wilson School, Princeton University, Princeton, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Benedetto De Martino

    Institute of Cognitive Neuroscience, University College London, London, United Kingdom
    For correspondence
    benedettodemartino@gmail.com
    Competing interests
    The authors declare that no competing interests exist.

Funding

Chilean National Agency for Research and Development (Graduate student scholarship - DOCTORADO BECAS CHILE/2017 - 72180193)

  • Pradyumna Sepulveda

Wellcome Trust (Sir Henry Dale Fellowship 102612/A/13/Z)

  • Benedetto De Martino

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: All participants signed a consent form, and both studies were conducted with the approval of the University College London Division of Psychology and Language Sciences ethics committee (project ID number 1825/003).

Copyright

© 2020, Sepulveda et al.

This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 2,654 views
  • 349 downloads
  • 52 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Pradyumna Sepulveda
  2. Marius Usher
  3. Ned Davies
  4. Amy A Benson
  5. Pietro Ortoleva
  6. Benedetto De Martino
(2020)
Visual attention modulates the integration of goal-relevant evidence and not value
eLife 9:e60705.
https://doi.org/10.7554/eLife.60705
