Emotional faces guide the eyes in the absence of awareness

  1. Petra Vetter  Is a corresponding author
  2. Stephanie Badde
  3. Elizabeth A Phelps
  4. Marisa Carrasco  Is a corresponding author
  1. Royal Holloway, University of London, United Kingdom
  2. New York University, United States

Abstract

The ability to react quickly to a threat is a key skill for survival. Under awareness, threat-related emotional information, such as an angry or fearful face, not only confers perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression and tracked eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces, but away from angry faces. Our findings reveal that: (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness; (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.

Data availability

Source data and all analyses are available on Github (https://github.com/StephBadde/EyeMovementsSuppressedEmotionalFaces).

Article and author information

Author details

  1. Petra Vetter

    Department of Psychology, Royal Holloway, University of London, Egham, United Kingdom
    For correspondence
    petra.vetter@rhul.ac.uk
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0001-6516-4637
  2. Stephanie Badde

    Department of Psychology, New York University, New York, United States
    Competing interests
    No competing interests declared.
    ORCID iD: 0000-0002-4005-5503
  3. Elizabeth A Phelps

    Department of Psychology, New York University, New York, United States
    Competing interests
    No competing interests declared.
  4. Marisa Carrasco

    Department of Psychology, New York University, New York, United States
    For correspondence
    marisa.carrasco@nyu.edu
    Competing interests
    Marisa Carrasco, Reviewing editor, eLife.

Funding

Deutsche Forschungsgemeinschaft (VE 739/1-1)

  • Petra Vetter

National Institutes of Health (NIH-RO1-EY016200)

  • Marisa Carrasco

Deutsche Forschungsgemeinschaft (BA 5600/1-1)

  • Stephanie Badde

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: All participants took part in the experiment in exchange for course credits and signed an informed consent form. The experiment was conducted according to the guidelines of the Declaration of Helsinki and approved by the ethics committee of New York University (IRB# 13-9582).

Copyright

© 2019, Vetter et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 4,823
    views
  • 457
    downloads
  • 22
    citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Petra Vetter
  2. Stephanie Badde
  3. Elizabeth A Phelps
  4. Marisa Carrasco
(2019)
Emotional faces guide the eyes in the absence of awareness
eLife 8:e43467.
https://doi.org/10.7554/eLife.43467

