EEG-based detection of the locus of auditory attention with convolutional neural networks

Abstract

In a multi-speaker scenario, the human auditory system is able to attend to one particular speaker of interest and ignore the others. It has been demonstrated that it is possible to use electroencephalography (EEG) signals to infer to which speaker someone is attending by relating the neural activity to the speech signals. However, classifying auditory attention within a short time interval remains the main challenge. We present a convolutional neural network-based approach to extract the locus of auditory attention (left/right) without knowledge of the speech envelopes. Our results show that it is possible to decode the locus of attention within 1 to 2 s, with a median accuracy of around 81%. These results are promising for neuro-steered noise suppression in hearing aids, in particular in scenarios where per-speaker envelopes are unavailable.
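The abstract describes classifying the locus of attention (left/right) directly from a short EEG window with a convolutional neural network. The actual architecture and trained models are in the linked repository; the snippet below is only a minimal sketch of the general idea, with hypothetical dimensions (64 channels, ~1 s at 128 Hz, 5 spatio-temporal filters) and randomly initialized weights, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the paper's actual architecture may differ.
N_CHANNELS = 64   # EEG channels
N_SAMPLES = 130   # roughly 1 s at 128 Hz, plus kernel context
N_FILTERS = 5     # spatio-temporal convolution kernels
KERNEL_LEN = 17   # temporal kernel length (assumption)

def cnn_forward(eeg, kernels, w_out, b_out):
    """Minimal CNN forward pass: spatio-temporal convolution over the
    EEG window, ReLU, average pooling over time, then a linear +
    sigmoid readout giving P(attended speaker is on the left)."""
    n_filters, _, klen = kernels.shape
    n_steps = eeg.shape[1] - klen + 1
    feats = np.empty(n_filters)
    for f in range(n_filters):
        # Each kernel spans all channels and klen time samples.
        conv = np.array([np.sum(kernels[f] * eeg[:, t:t + klen])
                         for t in range(n_steps)])
        feats[f] = np.mean(np.maximum(conv, 0.0))  # ReLU + avg pool
    logit = feats @ w_out + b_out
    return 1.0 / (1.0 + np.exp(-logit))           # sigmoid

# Random stand-ins for a real EEG window and trained weights.
eeg = rng.standard_normal((N_CHANNELS, N_SAMPLES))
kernels = rng.standard_normal((N_FILTERS, N_CHANNELS, KERNEL_LEN)) * 0.01
w_out = rng.standard_normal(N_FILTERS)
p_left = cnn_forward(eeg, kernels, w_out, 0.0)
```

Note that, unlike envelope-based decoders, nothing here requires the clean per-speaker speech envelopes: the decision is made from the EEG window alone, which is what makes the approach attractive when those envelopes are unavailable.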

Data availability

Code used for training and evaluating the network is available at https://github.com/exporl/locus-of-auditory-attention-cnn, along with the CNN models used to generate the results shown in the paper. The dataset used in this study was previously made available at https://zenodo.org/record/3377911.


Article and author information

Author details

  1. Servaas Vandecappelle

    Department of Neurosciences, Katholieke Universiteit Leuven, Leuven, Belgium
    For correspondence
    servaas.vandecappelle@gmail.com
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0002-0266-7293
  2. Lucas Deckers

    Department of Neurosciences, Katholieke Universiteit Leuven, Leuven, Belgium
    Competing interests
    The authors declare that no competing interests exist.
  3. Neetha Das

    Department of Neurosciences, Katholieke Universiteit Leuven, Leuven, Belgium
    Competing interests
    The authors declare that no competing interests exist.
  4. Amir Hossein Ansari

    Department of Electrical Engineering, Katholieke Universiteit Leuven, Leuven, Belgium
    Competing interests
    The authors declare that no competing interests exist.
  5. Alexander Bertrand

    Department of Electrical Engineering, Katholieke Universiteit Leuven, Leuven, Belgium
    Competing interests
    The authors declare that no competing interests exist.
    ORCID: 0000-0002-4827-8568
  6. Tom Francart

    Department of Neurosciences, Katholieke Universiteit Leuven, Leuven, Belgium
    For correspondence
    tom.francart@kuleuven.be
    Competing interests
    The authors declare that no competing interests exist.

Funding

KU Leuven Special Research Fund (C14/16/057)

  • Tom Francart

KU Leuven Special Research Fund (C24/18/099)

  • Alexander Bertrand

Research Foundation Flanders (1.5.123.16N)

  • Alexander Bertrand

Research Foundation Flanders (G0A4918N)

  • Alexander Bertrand

European Research Council (637424)

  • Tom Francart

European Research Council (802895)

  • Alexander Bertrand

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: The experiment was approved by the Ethics Committee Research UZ/KU Leuven (S57102), and every participant signed an informed consent form approved by the same committee.

Copyright

© 2021, Vandecappelle et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

Servaas Vandecappelle, Lucas Deckers, Neetha Das, Amir Hossein Ansari, Alexander Bertrand, Tom Francart (2021) EEG-based detection of the locus of auditory attention with convolutional neural networks. eLife 10:e56481. https://doi.org/10.7554/eLife.56481

