Inferential eye movement control while following dynamic gaze

  1. Nicole Xiao Han (corresponding author)
  2. Miguel Patricio Eckstein
  1. University of California, Santa Barbara, United States

Abstract

Attending to other people's gaze is evolutionarily important for making inferences about their intentions and actions. Gaze influences covert attention and triggers eye movements. However, we know little about how the brain controls the fine-grained dynamics of eye movements during gaze following. Observers followed people's gaze shifts in videos during search, and we related the observers' eye movement dynamics to the time course of the gazers' head movements extracted by a deep neural network. We show that the observers' brains use information in the visual periphery to execute predictive saccades that anticipate the information in the gazer's head direction by 190-350 ms. The brain simultaneously monitors moment-to-moment changes in the gazer's head velocity to dynamically alter eye movements and re-fixate the gazer (reverse saccades) when the head accelerates before the initiation of the first forward gaze-following saccade. Using saccade-contingent manipulations of the videos, we experimentally show that reverse saccades are planned concurrently with the first forward gaze-following saccade and have a functional role in reducing subsequent errors in fixating the gaze goal. Together, our findings characterize the inferential and functional nature of the fine-grained eye movement dynamics of social attention.
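
The 190-350 ms figure above describes how far the observers' saccades run ahead of the gazer's head movement. As a purely illustrative aid, not the authors' analysis code, the sketch below shows one way such a lead could be estimated by cross-correlating velocity traces; the function name, the 500 Hz sampling rate, the lag range, and the synthetic signals in the toy check are all assumptions introduced here.

```python
# Minimal sketch (hypothetical, not from the paper): estimate how far an
# observer's gaze trace leads a gazer's head-direction trace.
import numpy as np

def estimate_lead_ms(gazer_head_angle, observer_gaze_angle, fs_hz=500, max_lag_ms=600):
    """Return the lag (in ms) at which the two traces correlate best.
    Positive values mean the observer's gaze leads the gazer's head."""
    # Differentiate to velocities so slow positional drift does not dominate.
    head_vel = np.gradient(gazer_head_angle) * fs_hz
    gaze_vel = np.gradient(observer_gaze_angle) * fs_hz

    max_lag = int(max_lag_ms / 1000 * fs_hz)
    lags = np.arange(-max_lag, max_lag + 1)
    # Shifting the observer trace later in time (positive lag) aligns it with
    # the head trace when the observer moved first.
    corr = [np.corrcoef(np.roll(gaze_vel, lag), head_vel)[0, 1] for lag in lags]
    return lags[int(np.argmax(corr))] / fs_hz * 1000.0

if __name__ == "__main__":
    # Toy check: a smooth head turn at t = 2 s, with the observer's trace
    # occurring 250 ms earlier, should yield a lead near +250 ms.
    t = np.linspace(0, 4, 2000)
    head = np.tanh((t - 2.00) * 4)
    gaze = np.tanh((t - 1.75) * 4)
    print(estimate_lead_ms(head, gaze))
```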

Data availability

All data generated or analyzed during this study are deposited at https://osf.io/g9bzt/


Article and author information

Author details

  1. Nicole Xiao Han

    Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, United States
    For correspondence
    xhan01@ucsb.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-2860-2743
  2. Miguel Patricio Eckstein

    Department of Psychological and Brain Sciences, University of California, Santa Barbara, Santa Barbara, United States
    Competing interests
    The authors declare that no competing interests exist.

Funding

Army Research Office (W911NF-19-D-0001)

  • Miguel Patricio Eckstein

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Reviewing Editor

  1. Miriam Spering, The University of British Columbia, Canada

Ethics

Human subjects: The experiment protocol was approved by the University of California Institutional Review Board (protocol number 12-22-0667). All participants signed consent forms to participate in the experiment and to the inclusion of their images in resulting publications.

Version history

  1. Received: September 2, 2022
  2. Preprint posted: September 27, 2022
  3. Accepted: July 31, 2023
  4. Accepted Manuscript published: August 24, 2023 (version 1)
  5. Version of Record published: September 1, 2023 (version 2)

Copyright

© 2023, Han & Eckstein

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Nicole Xiao Han, Miguel Patricio Eckstein (2023) Inferential eye movement control while following dynamic gaze. eLife 12:e83187. https://doi.org/10.7554/eLife.83187

