Combining magnetoencephalography with magnetic resonance imaging enhances learning of surrogate-biomarkers

Abstract

Electrophysiological methods, that is, M/EEG, provide unique views into brain health. Yet, when building predictive models from brain data, it is often unclear how electrophysiology should be combined with other neuroimaging methods. Information can be redundant, useful common representations of multimodal data may not be obvious, and multimodal data collection can be medically contraindicated, which reduces applicability. Here, we propose a multimodal model to robustly combine MEG, MRI and fMRI for prediction. We focus on age prediction as a surrogate biomarker in 674 subjects from the Cam-CAN dataset. Strikingly, MEG, fMRI and MRI showed additive effects, supporting distinct brain-behavior associations. Moreover, the contribution of MEG was best explained by cortical power spectra between 8 and 30 Hz. Finally, we demonstrate that the model preserves the benefits of stacking when some data are missing. The proposed framework hence enables multimodal learning for a wide range of biomarkers from diverse types of brain signals.
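
For readers less familiar with stacked prediction, the sketch below illustrates the general idea behind this kind of framework: fit one linear model per modality, generate out-of-sample predictions, and combine them with a non-linear meta-learner. It is a minimal illustration using randomly generated placeholder features (X_mri, X_fmri, X_meg) and a generic ridge-plus-random-forest scheme, not the authors' exact pipeline; the full implementation, including the MEG source-power features and the handling of missing modalities, is available in the repository listed under Data availability.

```python
# Minimal stacking sketch for age prediction from multimodal features.
# The feature matrices below are random placeholders, not Cam-CAN data.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict, cross_val_score

rng = np.random.RandomState(42)
n_subjects = 674  # number of Cam-CAN subjects analyzed in the paper

# Placeholder blocks standing in for MRI, fMRI and MEG derivatives.
X_mri = rng.randn(n_subjects, 100)   # e.g., anatomical features
X_fmri = rng.randn(n_subjects, 200)  # e.g., functional connectivity
X_meg = rng.randn(n_subjects, 150)   # e.g., band-limited source power
y = rng.uniform(18, 88, n_subjects)  # age targets (placeholder)

# Layer 1: one regularized linear model per modality; predictions are
# generated out-of-sample so the stacking layer never sees overfit fits.
ridge = RidgeCV(alphas=np.logspace(-3, 5, 100))
layer1 = np.column_stack([
    cross_val_predict(ridge, X, y, cv=10)
    for X in (X_mri, X_fmri, X_meg)
])

# Layer 2: a non-linear meta-learner combines the modality-wise predictions.
# Note: a rigorous evaluation would nest both CV loops; this is only a sketch.
stacker = RandomForestRegressor(n_estimators=500, random_state=42)
mae = -cross_val_score(stacker, layer1, y, cv=10,
                       scoring="neg_mean_absolute_error")
print(f"stacked MAE: {mae.mean():.2f} +/- {mae.std():.2f} years")
```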

Data availability

We used the publicly available Cam-CAN dataset. All software and code necessary to obtain the derivative data are shared on GitHub: https://github.com/dengemann/meg-mri-surrogate-biomarkers-aging-2020

Article and author information

Author details

  1. Denis Alexander Engemann

    Parietal, Inria Saclay, Palaiseau, France
    For correspondence
    denis-alexander.engemann@inria.fr
    Competing interests
    No competing interests declared.
ORCID: 0000-0002-7223-1014
  2. Oleh Kozynets

    Parietal, Inria Saclay, Palaiseau, France
    Competing interests
    No competing interests declared.
  3. David Sabbagh

    Parietal, Inria Saclay, Palaiseau, France
    Competing interests
    No competing interests declared.
  4. Guillaume Lemaître

    Parietal, Inria Saclay, Palaiseau, France
    Competing interests
    No competing interests declared.
  5. Gaël Varoquaux

    Parietal, Inria Saclay, Palaiseau, France
    Competing interests
    Gaël Varoquaux, Reviewing editor, eLife.
ORCID: 0000-0003-1076-5122
  6. Franziskus Liem

    Dynamics of Healthy Aging, University of Zürich, Zürich, Switzerland
    Competing interests
    No competing interests declared.
  7. Alexandre Gramfort

    Parietal, Inria Saclay, Palaiseau, France
    Competing interests
    No competing interests declared.

Funding

H2020 European Research Council (SLAB ERC-YStG-676943)

  • Alexandre Gramfort

French National Institute of Computer Science (Médecine Numérique)

  • Denis Alexander Engemann

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Human subjects: This study was conducted in compliance with the Declaration of Helsinki. No experiments on living beings were performed for this study. The data we used were acquired by the Cam-CAN consortium; their collection was approved by the local ethics committee, Cambridgeshire 2 Research Ethics Committee (reference: 10/H0308/50).

Copyright

© 2020, Engemann et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

Denis Alexander Engemann, Oleh Kozynets, David Sabbagh, Guillaume Lemaître, Gaël Varoquaux, Franziskus Liem, Alexandre Gramfort (2020) Combining magnetoencephalography with magnetic resonance imaging enhances learning of surrogate-biomarkers. eLife 9:e54055. https://doi.org/10.7554/eLife.54055
