Emotional faces guide the eyes in the absence of awareness
Abstract
The ability to respond quickly to a threat is key for survival. When perceived with awareness, threat-related emotional information, such as an angry or fearful face, not only confers perceptual advantages but also guides rapid actions such as eye movements. Emotional information that is suppressed from awareness still confers perceptual and attentional benefits. However, it is unknown whether suppressed emotional information can directly guide actions, or whether emotional information has to enter awareness to do so. We suppressed emotional faces from awareness using continuous flash suppression while tracking eye gaze position. Under successful suppression, as indicated by objective and subjective measures, gaze moved towards fearful faces but away from angry faces. Our findings reveal that: (1) threat-related emotional stimuli can guide eye movements in the absence of visual awareness; and (2) threat-related emotional face information guides distinct oculomotor actions depending on the type of threat conveyed by the emotional expression.
Data availability
Source data and all analyses are available on GitHub (https://github.com/StephBadde/EyeMovementsSuppressedEmotionalFaces).
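For orientation, below is a minimal sketch of the kind of trial-level summary described in the abstract: restricting trials to those with successful suppression and computing the signed gaze deviation towards the suppressed face for each emotion. The data frame, column names, and suppression criteria here are hypothetical placeholders, not the actual variables used in the repository's analysis scripts.

```python
import pandas as pd

# Hypothetical trial-level gaze data; real variable names and coding are
# defined in the analysis scripts in the linked repository and may differ.
trials = pd.DataFrame({
    "emotion": ["fearful", "angry", "neutral", "fearful", "angry", "neutral"],
    "face_side": [-1, 1, -1, 1, -1, 1],               # -1 = face left of fixation, +1 = right
    "gaze_x_deg": [-0.4, -0.3, 0.0, 0.5, 0.2, -0.1],  # horizontal gaze position (deg)
    "detected": [False, False, False, False, False, False],       # objective suppression check
    "reported_visible": [False, False, False, False, False, False],  # subjective suppression check
})

# Keep only trials in which the face was successfully suppressed:
# not localized in the objective task and not reported as visible.
suppressed = trials[~trials["detected"] & ~trials["reported_visible"]]

# Signed gaze deviation towards the suppressed face:
# positive = gaze shifted towards the face location, negative = away from it.
suppressed = suppressed.assign(
    towards_face=suppressed["gaze_x_deg"] * suppressed["face_side"]
)

# Mean deviation per emotion; positive values for fearful faces and negative
# values for angry faces would mirror the pattern reported in the abstract.
print(suppressed.groupby("emotion")["towards_face"].mean())
```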
Article and author information
Author details
Funding
Deutsche Forschungsgemeinschaft (VE 739/1-1)
- Petra Vetter
National Institutes of Health (NIH-RO1-EY016200)
- Marisa Carrasco
Deutsche Forschungsgemeinschaft (BA 5600/1-1)
- Stephanie Badde
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Human subjects: All participants took part in the experiment in exchange for course credits and signed an informed consent form. The experiment was conducted according to the guidelines of the Declaration of Helsinki and approved by the ethics committee of New York University (IRB# 13-9582).
Copyright
© 2019, Vetter et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.