Cortical encoding of acoustic and linguistic rhythms in spoken narratives

  1. Cheng Luo
  2. Nai Ding (corresponding author)
  1. Zhejiang University, China

Abstract

Speech contains rich acoustic and linguistic information. Using highly controlled speech materials, previous studies have demonstrated that cortical activity synchronizes to the rhythms of perceived linguistic units, e.g., words and phrases, on top of basic acoustic features, e.g., the speech envelope. It remains unclear, however, how cortical activity jointly encodes acoustic and linguistic information when listeners hear natural speech. Here, we investigate the neural encoding of words using electroencephalography (EEG) and observe neural activity synchronized to multi-syllabic words when participants naturally listen to narratives. An amplitude modulation (AM) cue for word rhythm enhances the word-level response, but this effect is observed only during passive listening. Furthermore, words and the AM cue are encoded by spatially separable neural responses that are differentially modulated by attention. These results suggest that bottom-up acoustic cues and top-down linguistic knowledge separately contribute to the cortical encoding of linguistic units in spoken narratives.

Data availability

The EEG data and analysis code (in MATLAB) have been uploaded as Source data files.

Article and author information

Author details

  1. Cheng Luo

    College of Biomedical Engineering and Instrument Sciences, Zhejiang University, Hangzhou 310027, China
    Competing interests
    The authors declare that no competing interests exist.
  2. Nai Ding

    Key Laboratory for Biomedical Engineering of Ministry of Education, College of Biomedical Engineering and Instrument Sciences, Zhejiang University, Hangzhou 310027, China
    For correspondence
    ding_nai@zju.edu.cn
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-3428-2723

Funding

National Natural Science Foundation of China (31771248)

  • Nai Ding

Major Scientific Research Project of Zhejiang Lab (2019KB0AC02)

  • Nai Ding

National Key R&D Program of China (2019YFC0118200)

  • Nai Ding

Zhejiang Provincial Natural Science Foundation of China (LGF19H090020)

  • Cheng Luo

Fundamental Research Funds for the Central Universities (2020FZZX001-05)

  • Nai Ding

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: The experimental procedures were approved by the Research Ethics Committee of the College of Medicine, Zhejiang University (2019-047). All participants provided written informed consent prior to the experiment and were paid.

Copyright

© 2020, Luo & Ding

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 1,710 views
  • 327 downloads
  • 19 citations

Views, downloads and citations are aggregated across all versions of this paper published by eLife.

Cite this article

  1. Cheng Luo
  2. Nai Ding
(2020)
Cortical encoding of acoustic and linguistic rhythms in spoken narratives
eLife 9:e60433.
https://doi.org/10.7554/eLife.60433

