TY - JOUR
TI - Complementary congruent and opposite neurons achieve concurrent multisensory integration and segregation
AU - Zhang, Wen-Hao
AU - Wang, He
AU - Chen, Aihua
AU - Gu, Yong
AU - Lee, Tai Sing
AU - Wong, KY Michael
AU - Wu, Si
A2 - Latham, Peter
A2 - Gold, Joshua I
VL - 8
PY - 2019
DA - 2019/05/23
SP - e43753
C1 - eLife 2019;8:e43753
DO - 10.7554/eLife.43753
UR - https://doi.org/10.7554/eLife.43753
AB - Our brain perceives the world by exploiting multisensory cues to extract information about various aspects of external stimuli. The sensory cues from the same stimulus should be integrated to improve perception, and otherwise segregated to distinguish different stimuli. In reality, however, the brain faces the challenge of recognizing stimuli without knowing in advance the sources of sensory cues. To address this challenge, we propose that the brain conducts integration and segregation concurrently with complementary neurons. Studying the inference of heading-direction via visual and vestibular cues, we develop a network model with two reciprocally connected modules modeling interacting visual-vestibular areas. In each module, there are two groups of neurons whose tunings under each sensory cue are either congruent or opposite. We show that congruent neurons implement integration, while opposite neurons compute cue disparity information for segregation, and the interplay between two groups of neurons achieves efficient multisensory information processing.
KW - opposite neuron
KW - multisensory integration
KW - concurrent integration and segregation
KW - decentralized architecture
KW - continuous attractor neural network
KW - Bayesian inference
JF - eLife
SN - 2050-084X
PB - eLife Sciences Publications, Ltd
ER - 