Monkeys exhibit human-like gaze biases in economic decisions

  1. Shira M Lupkin
  2. Vincent B McGinty  Is a corresponding author
  1. Rutgers, The State University of New Jersey, United States

Abstract

In economic decision-making, individuals choose between items based on their perceived value. For both humans and nonhuman primates, these decisions are often carried out while shifting gaze between the available options. Recent studies in humans suggest that these shifts in gaze actively influence choice, manifesting as a bias in favor of the items that are viewed first, viewed last, or viewed for the overall longest duration in a given trial. This suggests a mechanism that links gaze behavior to the neural computations underlying value-based choices. To identify this mechanism, it is first necessary to develop and validate a suitable animal model of this behavior. To this end, we created a novel value-based choice task for macaque monkeys that captures the essential features of the human paradigms in which gaze biases have been observed. Using this task, we identified gaze biases in the monkeys that were both qualitatively and quantitatively similar to those in humans. In addition, the monkeys' gaze biases were well explained by a sequential sampling model framework previously used to describe gaze biases in humans, the first time this framework has been used to assess value-based decision mechanisms in nonhuman primates. Together, these findings suggest a common mechanism that can explain gaze-related choice biases across species, and open the way for mechanistic studies to identify the neural origins of this behavior.
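The abstract does not name the specific sequential sampling model, but in the human literature gaze biases of this kind are commonly captured by a gaze-weighted (attentional) drift-diffusion model, in which evidence for the currently fixated item accumulates at full strength while evidence for the unattended item is discounted. A minimal illustrative simulation of that idea (all parameter values and names below are hypothetical, not taken from this paper) might look like:

```python
import random

def simulate_addm_trial(v_left, v_right, gaze_seq, d=0.002, theta=0.3,
                        sigma=0.02, threshold=1.0):
    """Simulate one trial of a gaze-weighted drift-diffusion process.

    v_left, v_right : subjective values of the two items
    gaze_seq        : iterable of 'L'/'R', the fixated item at each time step
    d               : drift scaling; theta : discount (0..1) on the unattended item
    Returns ('L' or 'R', response time in steps).
    """
    rdv = 0.0  # relative decision value: evidence for left minus right
    t = 0
    for t, gaze in enumerate(gaze_seq, start=1):
        if gaze == 'L':
            drift = d * (v_left - theta * v_right)   # left seen at full strength
        else:
            drift = d * (theta * v_left - v_right)   # right seen at full strength
        rdv += drift + random.gauss(0.0, sigma)      # noisy evidence accumulation
        if rdv >= threshold:
            return 'L', t
        if rdv <= -threshold:
            return 'R', t
    # No boundary crossed: choose whichever item has more evidence
    return ('L' if rdv > 0 else 'R'), t

# Example: equal item values, but a fixation pattern favoring the left item.
random.seed(1)
gaze = (['L'] * 300 + ['R'] * 100) * 50  # 75% of fixation time on the left item
choices = [simulate_addm_trial(5, 5, gaze)[0] for _ in range(1000)]
print(sum(c == 'L' for c in choices) / len(choices))  # well above 0.5
```

With equal item values, a fixation pattern that favors one item shifts choices toward that item, reproducing the longest-viewed bias described above; the same machinery produces first- and last-fixation biases when realistic fixation sequences are used.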

Data availability

All data and code used for the analyses and figures included in the present manuscript have been uploaded as an Open Science Framework project (with a linked GitHub repository). These files can be accessed at: https://osf.io/hkgmn/

The following data sets were generated
    1. Lupkin SM
    2. McGinty VB
    (2022) NHP-Gaze-Bias
    Open Science Framework: DOI 10.17605/OSF.IO/HKGMN.

Article and author information

Author details

  1. Shira M Lupkin

    Center for Molecular and Behavioral Neuroscience, Rutgers, The State University of New Jersey, Newark, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-3792-5571
  2. Vincent B McGinty

    Center for Molecular and Behavioral Neuroscience, Rutgers, The State University of New Jersey, Newark, United States
    For correspondence
    vince.mcginty@rutgers.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-0883-4301

Funding

Rutgers, The State University of New Jersey (Dean's Dissertation Fellowship)

  • Shira M Lupkin

Rutgers, The State University of New Jersey (Academic Advancement Fund)

  • Shira M Lupkin

Rutgers, The State University of New Jersey (Graduate Assistantship through the Behavioral and Neural Sciences Graduate Program)

  • Shira M Lupkin

Whitehall Foundation

  • Vincent B McGinty

Busch Biomedical Research Foundation

  • Vincent B McGinty

National Institute on Drug Abuse (K01-DA-036659-01)

  • Vincent B McGinty

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All procedures were in accordance with the Guide for the Care and Use of Laboratory Animals (2011) and were approved by the Institutional Animal Care and Use Committees of both Stanford University (APLAC Protocol #9720) and Rutgers University-Newark (PROTO999900861). Surgeries to implant orthopedic head restraints were conducted under full surgical anesthesia using aseptic techniques and instruments, with analgesics and antibiotics given pre-, intra-, and post-operatively as appropriate.

Copyright

© 2023, Lupkin & McGinty

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 711
    views
  • 104
    downloads
  • 3
    citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Download links



Cite this article

  1. Shira M Lupkin
  2. Vincent B McGinty
(2023)
Monkeys exhibit human-like gaze biases in economic decisions
eLife 12:e78205.
https://doi.org/10.7554/eLife.78205


Further reading

    1. Neuroscience
    Proloy Das, Mingjian He, Patrick L Purdon
    Tools and Resources

    Modern neurophysiological recordings are performed using multichannel sensor arrays able to record activity from hundreds to thousands of channels. Often, underlying lower-dimensional patterns of activity are responsible for the observed dynamics, but these representations are difficult to reliably identify using existing methods, which attempt to summarize multivariate relationships post hoc from univariate analyses or rely on blind source separation. While such methods can reveal appealing patterns of activity, determining the number of components to include, assessing their statistical significance, and interpreting them requires extensive manual intervention and subjective judgment in practice. These difficulties with component selection and interpretation occur in large part because these methods lack a generative model for the underlying spatio-temporal dynamics. Here, we describe a novel component analysis method anchored by a generative model in which each source is described by a biophysically inspired state-space representation. The parameters governing this representation readily capture the oscillatory temporal dynamics of the components, so we refer to the method as oscillation component analysis. These parameters (the oscillatory properties, the component mixing weights at the sensors, and the number of oscillations) are all inferred in a data-driven fashion within a Bayesian framework employing an instance of the expectation-maximization algorithm. We analyze high-dimensional electroencephalography and magnetoencephalography recordings from human studies to illustrate the potential utility of this method for neuroscience data.

    1. Neuroscience
    Sihan Yang, Anastasia Kiyonaga
    Insight

    A neural signature of serial dependence has been found, which mirrors the attractive bias of visual information seen in behavioral experiments.