Accelerating with FlyBrainLab the discovery of the functional logic of the Drosophila brain in the connectomic era
In recent years, a wealth of Drosophila neuroscience data has become available, including cell type and connectome/synaptome datasets for both the larva and the adult fly. To facilitate integration across data modalities and to accelerate the understanding of the functional logic of the fly brain, we have developed FlyBrainLab, a unique open-source computing platform that integrates 3D exploration and visualization of diverse datasets with interactive exploration of the functional logic of modeled executable brain circuits. FlyBrainLab's User Interface, Utilities Libraries and Circuit Libraries bring together neuroanatomical, neurogenetic and electrophysiological datasets with computational models from different researchers for validation and comparison within the same platform. Seeking to transcend the limitations of the connectome/synaptome, FlyBrainLab also provides libraries for the molecular transduction arising in sensory coding in vision and olfaction. Together with sensory neuron activity data, these libraries serve as entry points for the exploration, analysis, comparison and evaluation of circuit functions of the fruit fly brain.
Code Availability and Installation
Stable and tested FlyBrainLab installation instructions for the user-side components and utility libraries are available at https://github.com/FlyBrainLab/FlyBrainLab for Linux, macOS and Windows. Installation and use of FlyBrainLab does not require a GPU, but a server-side backend must be running, for example on a cloud service, that the user side of FlyBrainLab can connect to. By default, the user-side-only installation accesses the backend services hosted on our public servers. Note that users do not have write permission to the NeuroArch Database, nor can they access a Neurokernel Server for execution. The server-side backend codebase is publicly available at https://github.com/fruitflybrain and https://github.com/neurokernel.
A full installation of FlyBrainLab, including all backend and frontend components, is available as a Docker image at https://hub.docker.com/r/fruitflybrain/fbl. The image requires a Linux host with at least one CUDA-enabled GPU and the nvidia-docker package (https://github.com/NVIDIA/nvidia-docker) installed. For a custom installation of the complete FlyBrainLab platform, a shell script is available at https://github.com/FlyBrainLab/FlyBrainLab.
To help users get started, a number of tutorials written as Jupyter notebooks are available at https://github.com/FlyBrainLab/Tutorials, including a reference for English queries at https://github.com/FlyBrainLab/Tutorials/blob/master/tutorials/getting_started/1b_nlp_queries.ipynb. An overview of FlyBrainLab resources is available at https://github.com/FlyBrainLab/FlyBrainLab/wiki/FlyBrainLab-Resources.
Data Availability
The NeuroArch Database created from the publicly available FlyCircuit, Hemibrain and Larva L1EM datasets can be downloaded from https://github.com/FlyBrainLab/dataset.
The same repository provides Jupyter notebooks for loading publicly available datasets, such as the FlyCircuit dataset with inferred connectivity, the Hemibrain dataset and the Larva L1 EM dataset, into the NeuroArch Database.
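The two installation paths described above can be sketched as shell commands. This is a minimal sketch only: the `flybrainlab` PyPI package name and the published port are assumptions not stated in this text, so the install instructions at https://github.com/FlyBrainLab/FlyBrainLab should be treated as authoritative.

```shell
# User-side-only installation (no GPU needed); connects to the public
# backend servers by default. Package name assumed from the FlyBrainLab
# repository -- verify against the official install instructions.
pip install flybrainlab

# Full installation via the Docker image (requires a Linux host with at
# least one CUDA-enabled GPU and nvidia-docker installed). The exposed
# Jupyter port (8888) is an assumption -- check the image documentation.
docker pull fruitflybrain/fbl
docker run --gpus all -p 8888:8888 fruitflybrain/fbl
```

The split mirrors the platform's client/server design: most users only need the lightweight user-side install, while the Docker image bundles the GPU-dependent Neurokernel execution backend for self-hosting.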
Article and author information
Air Force Office of Scientific Research (FA9550-16-1-0410)
- Mehmet Kerem Turkcan
Defense Advanced Research Projects Agency (HR0011-19-9-0035)
- Aurel A Lazar
- Tingkai Liu
- Mehmet Kerem Turkcan
- Yiyin Zhou
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
- Upinder Singh Bhalla, Tata Institute of Fundamental Research, India
- Received: August 22, 2020
- Accepted: February 21, 2021
- Accepted Manuscript published: February 22, 2021 (version 1)
- Version of Record published: March 31, 2021 (version 2)
© 2021, Lazar et al.
This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.