Spatial cue reliability drives frequency tuning in the Barn Owl's midbrain

  1. Fanny Cazettes (corresponding author)
  2. Brian J Fischer
  3. Jose L Pena
  1. Albert Einstein College of Medicine, United States
  2. Seattle University, United States

Abstract

The robust representation of the environment from unreliable sensory cues is vital for the efficient function of the brain. However, how neural processing captures the most reliable cues is unknown. The interaural time difference (ITD) is the primary cue to localize sound in horizontal space. ITD is encoded in the firing rate of neurons that detect interaural phase difference (IPD). Due to the filtering effect of the head, IPD for a given location varies depending on the environmental context. We found that, in barn owls, at each location there is a frequency range where the head filtering yields the most reliable IPDs across contexts. Remarkably, the frequency tuning of space-specific neurons in the owl's midbrain varies with their preferred sound location, matching the range that carries the most reliable IPD. Thus, frequency tuning in the owl's space-specific neurons reflects a higher-order feature of the code that captures cue reliability.
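To make the quantities in the abstract concrete, the sketch below assumes a toy model rather than the analysis used in the paper: the IPD of a tone of frequency f and interaural time difference ITD is 2πf·ITD (wrapped to ±π), and a simple circular-concentration score stands in for IPD reliability across environmental contexts. The jittered-ITD "contexts" and the reliability metric are illustrative assumptions only.

```python
import numpy as np

def ipd(itd_s, freq_hz):
    """Wrapped interaural phase difference (radians) for a given ITD (s) and frequency (Hz)."""
    phase = 2 * np.pi * freq_hz * np.asarray(itd_s)
    return np.angle(np.exp(1j * phase))  # wrap to (-pi, pi]

def ipd_reliability(itd_samples_s, freq_hz):
    """Circular concentration (1 minus circular variance) of IPD across contexts.

    `itd_samples_s` is a hypothetical set of effective ITDs observed for one source
    location under different environmental contexts; values near 1 mean this
    frequency carries a consistent (reliable) IPD for that location.
    """
    phases = ipd(itd_samples_s, freq_hz)
    return np.abs(np.mean(np.exp(1j * phases)))

# Example: simulated context-to-context jitter around a nominal ITD of 100 microseconds.
rng = np.random.default_rng(0)
itds = 100e-6 + rng.normal(scale=20e-6, size=200)
for f in (2000, 4000, 6000, 8000):
    print(f"{f} Hz: reliability = {ipd_reliability(itds, f):.3f}")
```

Under this toy model, a given spread of ITDs translates into a larger phase spread at higher frequencies, so the score falls with frequency; the abstract's point is that the variability introduced by head filtering itself depends on both frequency and source location, so the most reliable frequency band differs across locations.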

Article and author information

Author details

  1. Fanny Cazettes

    Department of Neuroscience, Albert Einstein College of Medicine, New York, United States
    For correspondence
    fanny.cazettes@phd.einstein.yu.edu
    Competing interests
    The authors declare that no competing interests exist.
  2. Brian J Fischer

    Department of Mathematics, Seattle University, Seattle, United States
    Competing interests
    The authors declare that no competing interests exist.
  3. Jose L Pena

    Department of Neuroscience, Albert Einstein College of Medicine, New York, United States
    Competing interests
    The authors declare that no competing interests exist.

Ethics

Animal experimentation: This study was performed in accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. Animals were handled according to approved institutional animal care and use committee protocol (#20140409) of the Albert Einstein College of Medicine. Surgery was performed under anesthesia and every effort was made to minimize discomfort. Albert Einstein College of Medicine is fully accredited by the Association of Assessment and Accreditation of Laboratory Animal Care. Owls were held under a Scientific Collecting Permit from the US Fish & Wildlife Service (#MB06168B-0).

Reviewing Editor

  1. Ronald L Calabrese, Emory University, United States

Publication history

  1. Received: September 21, 2014
  2. Accepted: December 21, 2014
  3. Accepted Manuscript published: December 22, 2014 (version 1)
  4. Version of Record published: January 14, 2015 (version 2)

Copyright

© 2014, Cazettes et al.

This article is distributed under the terms of the Creative Commons Attribution License permitting unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • Page views: 1,453
  • Downloads: 181
  • Citations: 21

Article citation count generated by polling the highest count across the following sources: Scopus, PubMed Central, Crossref.

Cite this article

Fanny Cazettes, Brian J Fischer, Jose L Pena (2014) Spatial cue reliability drives frequency tuning in the Barn Owl's midbrain. eLife 3:e04854. https://doi.org/10.7554/eLife.04854
