Neuroscout, a unified platform for generalizable and reproducible fMRI research

  1. Alejandro de la Vega (corresponding author)
  2. Roberta Rocca
  3. Ross W Blair
  4. Christopher J Markiewicz
  5. Jeff Mentch
  6. James D Kent
  7. Peer Herholz
  8. Satrajit S Ghosh
  9. Russell A Poldrack
  10. Tal Yarkoni
  1. The University of Texas at Austin, United States
  2. Aarhus University, Denmark
  3. Stanford University, United States
  4. Massachusetts Institute of Technology, United States
  5. McGill University, Canada

Abstract

Functional magnetic resonance imaging (fMRI) has revolutionized cognitive neuroscience, but methodological barriers limit the generalizability of findings from the lab to the real world. Here, we present Neuroscout, an end-to-end platform for analysis of naturalistic fMRI data designed to facilitate the adoption of robust and generalizable research practices. Neuroscout leverages state-of-the-art machine learning models to automatically annotate stimuli from dozens of fMRI studies using naturalistic stimuli, such as movies and narratives, allowing researchers to easily test neuroscientific hypotheses across multiple ecologically valid datasets. In addition, Neuroscout builds on a robust ecosystem of open tools and standards to provide an easy-to-use analysis builder and a fully automated execution engine that reduce the burden of reproducible research. Through a series of meta-analytic case studies, we validate the automatic feature extraction approach and demonstrate its potential to support more robust fMRI research. Owing to its ease of use and high degree of automation, Neuroscout makes it possible to overcome modeling challenges commonly arising in naturalistic analysis and to easily scale analyses within and across datasets, democratizing generalizable fMRI research.
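As a concrete illustration of the workflow the abstract describes (selecting a naturalistic dataset, choosing automatically extracted predictors, and handing execution to the automated engine), the sketch below uses the pyns Python client. The Neuroscout client class comes from pyns, but the dataset name, predictor name, and the create_analysis arguments shown here are illustrative assumptions and may differ from the released interface; consult the pyns documentation for exact usage.

```python
# Minimal sketch of building a Neuroscout analysis programmatically with pyns.
# NOTE: dataset name, predictor name, and the create_analysis arguments below are
# illustrative assumptions; the exact pyns interface may differ.
from pyns import Neuroscout

api = Neuroscout()  # creating analyses typically requires account credentials

# List datasets indexed by Neuroscout, each annotated with automatically extracted predictors
datasets = api.datasets.get()

# Assemble a single-predictor model (e.g., presence of speech in the stimulus)
analysis = api.analyses.create_analysis(
    name="speech-example",        # hypothetical analysis name
    dataset_name="Budapest",      # assumed dataset identifier
    predictor_names=["speech"],   # automatically extracted feature
    tasks="movie",                # assumed task name
    hrf_variables=["speech"],
    dummy_contrasts="hrf",
)

# Compile the analysis into a portable bundle; the resulting ID can then be passed to
# the neuroscout-cli execution engine, which fetches preprocessed data and fits the model.
analysis.compile()
print(analysis.hash_id)
```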

Data availability

All code from our processing pipeline and core infrastructure is available online (https://www.github.com/neuroscout/neuroscout). An online supplement including all analysis code, results, and resulting images is publicly available as a GitHub repository (https://github.com/neuroscout/neuroscout-paper).

The following previously published datasets were used:
    1. Hanke M, et al. (2014) studyforrest. OpenNeuro, doi:10.18112/openneuro.ds000113.v1.3.0
    2. Nastase SA, et al. (2021) Narratives. OpenNeuro, doi:10.18112/openneuro.ds002345.v1.1.4

Article and author information

Author details

  1. Alejandro de la Vega

    Department of Psychology, The University of Texas at Austin, Austin, United States
    For correspondence
    delavega@utexas.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0001-9062-3778
  2. Roberta Rocca

    Interacting Minds Centre, Aarhus University, Aarhus, Denmark
    Competing interests
    The authors declare that no competing interests exist.
  3. Ross W Blair

    Department of Psychology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Christopher J Markiewicz

    Department of Psychology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-6533-164X
  5. Jeff Mentch

    McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. James D Kent

    Department of Psychology, The University of Texas at Austin, Austin, United States
    Competing interests
    The authors declare that no competing interests exist.
  7. Peer Herholz

    Montreal Neurological Institute, McGill University, Montreal, Canada
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-9840-6257
  8. Satrajit S Ghosh

    McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0002-5312-6729
  9. Russell A Poldrack

    Department of Psychology, Stanford University, Stanford, United States
    Competing interests
    The authors declare that no competing interests exist.
  10. Tal Yarkoni

    Department of Psychology, The University of Texas at Austin, Austin, United States
    Competing interests
    The authors declare that no competing interests exist.

Funding

National Institute of Mental Health (R01MH109682)

  • Alejandro de la Vega
  • Roberta Rocca
  • Ross W Blair
  • Christopher J Markiewicz
  • Jeff Mentch
  • James D Kent
  • Peer Herholz
  • Satrajit S Ghosh
  • Russell A Poldrack
  • Tal Yarkoni

National Institute of Mental Health (R01MH096906)

  • Alejandro de la Vega
  • James D Kent
  • Tal Yarkoni

National Institute of Mental Health (R24MH117179)

  • Ross W Blair
  • Christopher J Markiewicz
  • Peer Herholz
  • Satrajit S Ghosh
  • Russell A Poldrack

Canada First Research Excellence Fund

  • Peer Herholz

Brain Canada Foundation

  • Peer Herholz

Unifying Neuroscience and Artificial Intelligence - Québec

  • Peer Herholz

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Copyright

© 2022, de la Vega et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Cite this article

de la Vega A, Rocca R, Blair RW, Markiewicz CJ, Mentch J, Kent JD, Herholz P, Ghosh SS, Poldrack RA, Yarkoni T (2022) Neuroscout, a unified platform for generalizable and reproducible fMRI research. eLife 11:e79277. https://doi.org/10.7554/eLife.79277
