Elements of a stochastic 3D prediction engine in larval zebrafish prey capture

  1. Andrew D Bolton (corresponding author)
  2. Martin Haesemeyer
  3. Josua Jordi
  4. Ulrich Schaechtle
  5. Feras A Saad
  6. Vikash K Mansinghka
  7. Joshua B Tenenbaum
  8. Florian Engert
  1. Harvard University, United States
  2. Massachusetts Institute of Technology, United States

Abstract

The computational principles underlying predictive capabilities in animals are poorly understood. Here, we wondered whether predictive models mediating prey capture could be reduced to a simple set of sensorimotor rules performed by a primitive organism. For this task, we chose the larval zebrafish, a tractable vertebrate that pursues and captures swimming microbes. Using a novel naturalistic 3D setup, we show that the zebrafish combines position and velocity perception to construct a future positional estimate of its prey, indicating an ability to project trajectories forward in time. Importantly, the stochasticity in the fish's sensorimotor transformations provides a considerable advantage over equivalent noise-free strategies. This surprising result coalesces with recent findings that illustrate the benefits of biological stochasticity to adaptive behavior. In sum, our study reveals that zebrafish are equipped with a recursive prey capture algorithm, built up from simple stochastic rules, that embodies an implicit predictive model of the world.

Data availability

All software related to behavioral analysis, modeling, and virtual prey capture simulation is freely available at www.github.com/larrylegend33/PreycapMaster and is licensed under the GNU General Public License 3.0. Source data for analyses and simulations are enclosed as "Source Data" with the relevant figures. Source Data for Figure 2 contains all pursuit bouts analyzed in the dataset; it was used to construct Figures 2, 3, 5, and 6A, and is accompanied by instructions for running queries. Source Data for Figure 6 contains the generators for simulating from the DPMMs in Figure 6. Using the code at www.github.com/larrylegend33/PreycapMaster and the generators in Source Data - Figure 6 requires the BayesDB software package, which is freely available at http://probcomp.csail.mit.edu/software/bayesdb/.
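The paper's own DPMM generators require BayesDB and are shipped in Source Data - Figure 6. As a rough, BayesDB-independent illustration of what simulating from a Dirichlet process mixture model (DPMM) involves, the sketch below draws observations from a 1D Gaussian DPMM via the sequential Chinese restaurant process representation. All function names and parameter values here are hypothetical illustrations and are not taken from the paper's generators.

```python
import random

def sample_dpmm(n, alpha=1.0, base_mu=0.0, base_sigma=5.0, obs_sigma=1.0, seed=0):
    """Draw n observations from a Dirichlet-process mixture of 1D Gaussians
    using the Chinese restaurant process (sequential sampling) representation."""
    rng = random.Random(seed)
    counts = []   # number of observations assigned to each cluster
    means = []    # cluster means, drawn from the Gaussian base measure
    draws = []
    for i in range(n):
        # Existing cluster k is chosen with probability counts[k] / (i + alpha);
        # a new cluster is opened with probability alpha / (i + alpha).
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        k = None
        for j, c in enumerate(counts):
            acc += c
            if r < acc:
                k = j
                break
        if k is None:
            # Open a new cluster with a mean drawn from the base measure.
            means.append(rng.gauss(base_mu, base_sigma))
            counts.append(0)
            k = len(counts) - 1
        counts[k] += 1
        draws.append(rng.gauss(means[k], obs_sigma))
    return draws, counts

draws, counts = sample_dpmm(200, alpha=2.0)
print(len(draws), len(counts))
```

Because cluster assignments are themselves sampled, different seeds yield different numbers of clusters; this built-in stochasticity is the property the DPMM-based simulations exploit.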

Article and author information

Author details

  1. Andrew D Bolton

    Center for Brain Science, Harvard University, Cambridge, United States
    For correspondence: andrewdbolton@fas.harvard.edu
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-3059-7226
  2. Martin Haesemeyer

    Center for Brain Science, Harvard University, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
    ORCID iD: 0000-0003-2704-3601
  3. Josua Jordi

    Center for Brain Science, Harvard University, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  4. Ulrich Schaechtle

    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  5. Feras A Saad

    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  6. Vikash K Mansinghka

    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  7. Joshua B Tenenbaum

    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.
  8. Florian Engert

    Center for Brain Science, Harvard University, Cambridge, United States
    Competing interests
    The authors declare that no competing interests exist.

Funding

National Institutes of Health (U19NS104653)

  • Florian Engert

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Ethics

Animal experimentation: Experiments were conducted according to the guidelines of the National Institutes of Health and were approved by the Standing Committee on the Use of Animals in Research of Harvard University. Animals were handled according to IACUC protocol #2729.

Reviewing Editor

  1. Gordon J Berman, Emory University, United States

Publication history

  1. Received: September 18, 2019
  2. Accepted: November 25, 2019
  3. Accepted Manuscript published: November 26, 2019 (version 1)
  4. Version of Record published: December 24, 2019 (version 2)

Copyright

© 2019, Bolton et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.


Cite this article

  1. Andrew D Bolton
  2. Martin Haesemeyer
  3. Josua Jordi
  4. Ulrich Schaechtle
  5. Feras A Saad
  6. Vikash K Mansinghka
  7. Joshua B Tenenbaum
  8. Florian Engert
(2019)
Elements of a stochastic 3D prediction engine in larval zebrafish prey capture
eLife 8:e51975.
https://doi.org/10.7554/eLife.51975

Further reading

    1. Neuroscience
    Mingchao Yan et al.
    Tools and Resources

    Resolving the trajectories of axonal pathways in the primate prefrontal cortex is crucial for gaining insight into higher-order processes of cognition and emotion, and requires a comprehensive map of the axonal projections linking demarcated subdivisions of prefrontal cortex to the rest of the brain. Here, we report a mesoscale excitatory projectome from the ventrolateral prefrontal cortex (vlPFC) to the entire macaque brain, obtained by viral-based genetic axonal tracing in tandem with high-throughput serial two-photon tomography. This map demonstrated prominent monosynaptic projections to other prefrontal, temporal, limbic, and subcortical areas; relatively weak projections to parietal and insular regions; and no direct projections to the occipital lobe. In a common 3D space, we quantitatively validated an atlas of diffusion tractography-derived vlPFC connections against correlative green fluorescent protein-labeled axonal tracing and observed generally good agreement, except for a major difference in the posterior projections of the inferior fronto-occipital fasciculus. These findings raise an intriguing question as to how neural information passes along long-range association fiber bundles in the macaque brain, and call for caution when using diffusion tractography to map the wiring diagram of brain circuits.

    1. Medicine
    2. Neuroscience
    Simon Oxenford et al.
    Tools and Resources

    Background: Deep Brain Stimulation (DBS) electrode implant trajectories are stereotactically defined using preoperative neuroimaging. To validate the correct trajectory, microelectrode recordings (MER) or local field potential recordings (LFP) can be used to extend neuroanatomical information (defined by magnetic resonance imaging) with neurophysiological activity patterns recorded from micro- and macroelectrodes probing the surgical target site. Currently, these two sources of information (imaging vs. electrophysiology) are analyzed separately, and means of fusing the two data streams have not been introduced.

    Methods: Here we present a tool that integrates resources from stereotactic planning, neuroimaging, MER, and high-resolution atlas data to create a real-time visualization of the implant trajectory. We validate the tool offline on a retrospective cohort of DBS patients (N = 52) and present single use cases of the real-time platform.

    Results: We establish an open-source software tool for multimodal data visualization and analysis during DBS surgery. We show a general correspondence between features derived from neuroimaging and electrophysiological recordings and present examples that demonstrate the functionality of the tool.

    Conclusions: This novel software platform for multimodal data visualization and analysis bears translational potential to improve accuracy of DBS surgery. The toolbox is made openly available and is extendable to integrate with additional software packages.

    Funding: Deutsche Forschungsgemeinschaft (410169619, 424778381), Deutsches Zentrum für Luft- und Raumfahrt (DynaSti), National Institutes of Health (2R01 MH113929), Foundation for OCD Research (FFOR).