Neuroscience

Local processing in neurites of VGluT3-expressing amacrine cells differentially organizes visual information

  1. Jen-Chun Hsiang
  2. Keith Johnson
  3. Linda Madisen
  4. Hongkui Zeng
  5. Daniel Kerschensteiner  Is a corresponding author
  1. Washington University School of Medicine, United States
  2. Allen Institute for Brain Science, United States
Short Report
Cite this article as: eLife 2017;6:e31307 doi: 10.7554/eLife.31307

Abstract

Neurons receive synaptic inputs on extensive neurite arbors. How information is organized across arbors and how local processing in neurites contributes to circuit function are mostly unknown. Here, we used two-photon Ca2+ imaging to study visual processing in VGluT3-expressing amacrine cells (VG3-ACs) in the mouse retina. Contrast preferences (ON vs. OFF) varied across VG3-AC arbors depending on the laminar position of neurites, with ON responses preferring larger stimuli than OFF responses. Although arbors of neighboring cells overlap extensively, imaging population activity revealed continuous topographic maps of visual space in the VG3-AC plexus. All VG3-AC neurites responded strongly to object motion but remained silent during global image motion. Thus, VG3-AC arbors limit vertical and lateral integration of contrast and location information, respectively. We propose that this local processing enables the dense VG3-AC plexus to contribute precise object motion signals to diverse targets without distorting target-specific contrast preferences and spatial receptive fields.
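The depth-dependent ON vs. OFF preference described above is commonly summarized with a polarity index, (R_ON - R_OFF) / (R_ON + R_OFF), which runs from -1 (pure OFF) to +1 (pure ON). The Python sketch below is purely illustrative and is not the authors' analysis pipeline; the ROI names and response amplitudes are invented for the example.

```python
def polarity_index(r_on: float, r_off: float) -> float:
    """Contrast-preference index in [-1, 1].

    +1 indicates a pure ON response, -1 a pure OFF response.
    r_on and r_off are mean Ca2+ response amplitudes of one
    neurite ROI to light increments and decrements, respectively.
    """
    r_on, r_off = max(r_on, 0.0), max(r_off, 0.0)  # clip negative baselines
    total = r_on + r_off
    return 0.0 if total == 0.0 else (r_on - r_off) / total


# Hypothetical ROIs at increasing depth in the inner plexiform layer
# (amplitudes in arbitrary dF/F units; values invented for illustration).
rois = {
    "off_sublamina_roi": (0.2, 1.1),
    "middle_roi": (0.7, 0.6),
    "on_sublamina_roi": (1.3, 0.1),
}

for name, (r_on, r_off) in rois.items():
    print(f"{name}: PI = {polarity_index(r_on, r_off):+.2f}")
```

For these made-up values, the index spans roughly -0.69 (OFF-dominated) to +0.86 (ON-dominated), the kind of laminar gradient in contrast preference the abstract describes.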

Article and author information

Author details

  1. Jen-Chun Hsiang

Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, United States
    Competing interests
    The authors declare that no competing interests exist.
  2. Keith Johnson

    Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Linda Madisen

    Allen Institute for Brain Science, Seattle, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Hongkui Zeng

    Allen Institute for Brain Science, Seattle, United States
    Competing interests
    The authors declare that no competing interests exist.
ORCID iD: 0000-0002-0326-5878
  5. Daniel Kerschensteiner

    Department of Ophthalmology and Visual Sciences, Washington University School of Medicine, St. Louis, United States
    For correspondence
    kerschensteinerd@wustl.edu
    Competing interests
    The authors declare that no competing interests exist.
ORCID iD: 0000-0002-6794-9056

Funding

National Eye Institute (EY023341)

  • Daniel Kerschensteiner

Research to Prevent Blindness

  • Daniel Kerschensteiner

National Eye Institute (EY026978)

  • Daniel Kerschensteiner

National Eye Institute (EY027411)

  • Daniel Kerschensteiner

McDonnell International Scholars Academy

  • Jen-Chun Hsiang

National Institute of General Medical Sciences (GM008151-32)

  • Keith Johnson

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: All procedures in this study were approved by the Institutional Animal Care and Use Committee of Washington University School of Medicine (Protocol #20170033) and were performed in compliance with the National Institutes of Health Guide for the Care and Use of Laboratory Animals.

Reviewing Editor

  1. Fred Rieke, Howard Hughes Medical Institute, University of Washington, United States

Publication history

  1. Received: August 16, 2017
  2. Accepted: October 11, 2017
  3. Accepted Manuscript published: October 12, 2017 (version 1)
  4. Version of Record published: October 23, 2017 (version 2)
  5. Version of Record updated: October 26, 2017 (version 3)

Copyright

© 2017, Hsiang et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • Page views: 1,865
  • Downloads: 326
  • Citations: 8

Article citation count generated by polling the highest count across the following sources: PubMed Central, Crossref, Scopus.

