Multiphoton imaging of neural structure and activity in Drosophila through the intact cuticle
Abstract
We developed a multiphoton imaging method to capture neural structure and activity in behaving flies through the intact cuticle. Our measurements show that the fly head cuticle has surprisingly high transmission at wavelengths > 900 nm, and that the difficulty of through-cuticle imaging is caused by the air sacs and/or fat tissue underneath the head cuticle. By compressing or removing the air sacs, we performed multiphoton imaging of the fly brain through the intact cuticle. Our anatomical and functional imaging results show that 2- and 3-photon imaging are comparable in superficial regions, such as the mushroom body, but that 3-photon imaging is superior in deeper regions, such as the central complex and beyond. We further demonstrated 2-photon through-cuticle functional imaging of odor-evoked calcium responses from the mushroom body γ-lobes in behaving flies over both short and long time scales. The through-cuticle imaging method developed here extends the time limits of in vivo imaging in flies and opens new ways to capture neural structure and activity from the fly brain.
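For readers unfamiliar with the functional readout mentioned above, the sketch below shows how odor-evoked calcium responses are commonly quantified as a ΔF/F trace relative to a pre-stimulus baseline. This is an illustrative example only, not the authors' analysis pipeline; the function name, baseline window, and synthetic trace are assumptions for demonstration.

```python
import numpy as np

def delta_f_over_f(trace, baseline_frames, eps=1e-9):
    """Compute a dF/F trace from raw fluorescence (illustrative sketch).

    trace           : 1-D array of raw fluorescence values over time
    baseline_frames : number of initial (pre-stimulus) frames used as baseline F0
    """
    f0 = trace[:baseline_frames].mean()
    return (trace - f0) / (f0 + eps)

# Synthetic example: flat baseline followed by an odor-evoked transient.
t = np.arange(200)
trace = 100 + 30 * np.exp(-((t - 80) ** 2) / (2 * 15 ** 2))
dff = delta_f_over_f(trace, baseline_frames=50)
print(round(dff.max(), 2))  # peak response relative to baseline (~0.3 here)
```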
Data availability
All data generated or analyzed during this study are included in the manuscript and supporting file; Source Data files have been provided.
Article and author information
Funding
National Science Foundation (DBI-1707312)
- Nilay Yapici
National Institute of General Medical Sciences (R35 GM133698)
- Nilay Yapici
Pew Charitable Trusts (Scholars Award)
- Nilay Yapici
Alfred P. Sloan Foundation (Scholars Award)
- Nilay Yapici
American Federation for Aging Research (Grants for Junior Faculty)
- Nilay Yapici
National Science Foundation (DBI-1707312)
- Chris Xu
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Reviewing Editor
- Upinder Singh Bhalla, Tata Institute of Fundamental Research, India
Version history
- Preprint posted: October 9, 2019
- Received: April 4, 2021
- Accepted: January 23, 2022
- Accepted Manuscript published: January 24, 2022 (version 1)
- Accepted Manuscript updated: January 25, 2022 (version 2)
- Version of Record published: February 15, 2022 (version 3)
Copyright
© 2022, Aragon et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- Page views: 4,129
- Downloads: 528
- Citations: 10
Article citation count generated by polling the highest count across the following sources: PubMed Central, Crossref, Scopus.