Social interactions drive efficient foraging and income equality in groups of fish
Abstract
The social interactions underlying group foraging, and the benefits they confer, have mostly been studied using mechanistic models that replicate qualitative features of group behavior and that focus on a single resource or a few clustered ones. Here, we tracked groups of freely foraging adult zebrafish with spatially dispersed food items and found that fish perform stereotypical maneuvers when consuming food, which attract neighboring fish. We then present a mathematical model, based on inferred functional interactions between fish, that accurately describes individual and group foraging of real fish. We show that these interactions allow fish to combine individual and social information to achieve near-optimal foraging efficiency and to promote income equality within groups. We further show that the interactions that would maximize efficiency in these social foraging models depend on group size, but not on food distribution - suggesting that fish may adaptively pick the subgroup of neighbors they 'listen to' when determining their own behavior.
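As a rough illustration of the kind of agent-based social foraging model described above, the following Python sketch has simulated fish combine individual food detection with attraction toward neighbors that have just consumed an item. It is a minimal sketch under stated assumptions, not the fitted interaction model inferred in the paper; all parameter values and names (DETECT_R, SOCIAL_W, EAT_R, etc.) are illustrative.

```python
import numpy as np

# Illustrative parameters -- assumptions for this sketch, not values fitted in the paper
N_FISH, N_FOOD = 6, 20      # group size and number of dispersed food items
ARENA = 60.0                # arena side length (arbitrary units)
DETECT_R = 4.0              # individual food-detection radius
EAT_R = 0.5                 # distance at which an item counts as consumed
SPEED = 1.0                 # maximal step length per time step
SOCIAL_W = 0.7              # weight of attraction toward eating neighbors

rng = np.random.default_rng(0)
fish = rng.uniform(0, ARENA, (N_FISH, 2))
food = rng.uniform(0, ARENA, (N_FOOD, 2))
eaten_by = np.zeros(N_FISH, dtype=int)    # per-fish consumption count ("income")
eating = np.zeros(N_FISH, dtype=bool)     # who performed a feeding maneuver last step

for t in range(2000):
    if len(food) == 0:
        break
    new_eating = np.zeros(N_FISH, dtype=bool)
    for i in range(N_FISH):
        step = rng.normal(size=2)                     # default: undirected exploration
        nbrs = np.where(eating)[0]
        nbrs = nbrs[nbrs != i]                        # neighbors seen eating last step
        if len(food):
            d_food = np.linalg.norm(food - fish[i], axis=1)
            j = d_food.argmin()
            if d_food[j] < DETECT_R:
                step = food[j] - fish[i]              # individual information: go to detected item
            elif len(nbrs):
                # social information: attraction toward the nearest eating neighbor,
                # mixed with an exploratory component
                k = nbrs[np.linalg.norm(fish[nbrs] - fish[i], axis=1).argmin()]
                step = SOCIAL_W * (fish[k] - fish[i]) + (1 - SOCIAL_W) * rng.normal(size=2)
        scale = min(1.0, SPEED / (np.linalg.norm(step) + 1e-9))
        fish[i] = np.clip(fish[i] + scale * step, 0.0, ARENA)
        if len(food):                                 # consume an item if close enough
            d_food = np.linalg.norm(food - fish[i], axis=1)
            m = d_food.argmin()
            if d_food[m] < EAT_R:
                food = np.delete(food, m, axis=0)
                eaten_by[i] += 1
                new_eating[i] = True
    eating = new_eating

print("items eaten per fish:", eaten_by)
```

Running the same loop with SOCIAL_W = 0 (neighbors ignored) versus SOCIAL_W > 0 gives a crude feel for how attraction to feeding neighbors can change both the time needed to deplete the food and how evenly consumption (eaten_by) is spread across the group.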
Data availability
All data used in this work have been made available via the main author's public GitHub account: https://github.com/schneidmanlab/zebrafishForaging
Article and author information
Author details
Funding
Israel Science Foundation (1629/12)
- Roy Harpaz
- Elad Schneidman
European Research Council (311238)
- Roy Harpaz
- Elad Schneidman
Human Frontier Science Program (RGP0065/2012)
- Roy Harpaz
- Elad Schneidman
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Ethics
Animal experimentation: Animal care and all experimental procedures were approved by the Institutional Animal Care and Use Committee of the Weizmann Institute of Science (Protocol 17310415-2).
Copyright
© 2020, Harpaz & Schneidman
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 2,690 views
- 310 downloads
- 33 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.