For many organisms, searching for relevant targets such as food or mates entails active, strategic sampling of the environment. Finding odorous targets may be the most ancient search problem that motile organisms evolved to solve. While chemosensory navigation has been well characterized in micro-organisms and invertebrates, spatial olfaction in vertebrates is poorly understood. We have established an olfactory search assay in which freely-moving mice navigate noisy concentration gradients of airborne odor. Mice solve this task using concentration gradient cues and do not require stereo olfaction for performance. During task performance, respiration and nose movement are synchronized with tens-of-milliseconds precision. This synchrony is present during trials and largely absent during inter-trial intervals, suggesting that sniff-synchronized nose movement is a strategic behavioral state rather than simply a constant accompaniment to fast breathing. To reveal the spatiotemporal structure of these active sensing movements, we used machine learning methods to parse motion trajectories into elementary movement motifs. Motifs fall into two clusters, which correspond to investigation and approach states. Investigation motifs lock precisely to sniffing, such that individual motifs preferentially occur at specific phases of the sniff cycle. The allocentric structure of investigation and approach indicates an advantage to sampling both sides of the sharpest part of the odor gradient, consistent with a serial sniff strategy for gradient sensing. This work clarifies sensorimotor strategies for mouse olfactory search and guides ongoing work into the underlying neural mechanisms.
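To make the abstract's motif-parsing step concrete, here is a minimal toy sketch of the general idea: slice a nose-position trajectory into short windows, describe each window with simple kinematic features, and cluster the windows into two movement states. This is an illustrative assumption-laden stand-in (fixed windows, hand-picked features, a bare-bones 2-cluster k-means), not the authors' actual pipeline; their real analysis code is linked below.

```python
import numpy as np

def segment_features(xy, win=10):
    """Slice a 2-D trajectory into non-overlapping windows and compute
    two crude kinematic features per window: mean step length ("speed")
    and summed heading change ("turning")."""
    feats = []
    for i in range(0, len(xy) - win, win):
        steps = np.diff(xy[i:i + win], axis=0)          # step vectors
        speed = np.linalg.norm(steps, axis=1).mean()    # mean step length
        angles = np.arctan2(steps[:, 1], steps[:, 0])   # step headings
        turning = np.abs(np.diff(angles)).sum()         # total heading change
        feats.append([speed, turning])
    return np.asarray(feats)

def two_means(X, iters=25):
    """Minimal 2-cluster k-means with a deterministic init (the least-
    and most-turning windows). Returns one cluster label per window."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)            # z-score features
    centers = X[[X[:, 1].argmin(), X[:, 1].argmax()]].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            if (labels == j).any():                     # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic demo: a slow, twisty segment (investigation-like) followed
# by a fast, straight run (approach-like).
rng = np.random.default_rng(1)
twisty = np.cumsum(rng.normal(0.0, 0.5, (100, 2)), axis=0)
straight = twisty[-1] + np.cumsum(np.tile([2.0, 0.1], (100, 1)), axis=0)
traj = np.vstack([twisty, straight])

labels = two_means(segment_features(traj))
```

On this synthetic trajectory, the slow/twisty and fast/straight windows separate into the two clusters, mirroring the investigation-versus-approach distinction described above.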
Source code is available on GitHub at https://github.com/Smear-Lab/Olfactory_Search, and source data files are uploaded to Dryad.
Sniff-synchronized, gradient-guided olfactory search by freely-moving mice. Dryad Digital Repository, doi:10.5061/dryad.r7sqv9sc0.
- Matthew C Smear
- Teresa M Findley
- Morgan A Brown
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Animal experimentation: This study was performed in strict accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All of the animals were handled according to approved institutional animal care and use committee (IACUC) protocols (AUP-17-23) of the University of Oregon. All surgery was performed under isoflurane anesthesia, and every effort was made to minimize suffering.
- Upinder Singh Bhalla, Tata Institute of Fundamental Research, India
© 2021, Findley et al.
This article is distributed under the terms of the Creative Commons Attribution License, permitting unrestricted use and redistribution provided that the original author and source are credited.