Optimal multisensory decision-making in a reaction-time task

  1. Jan Drugowitsch (corresponding author)
  2. Gregory C DeAngelis
  3. Eliana M Klier
  4. Dora E Angelaki
  5. Alexandre Pouget
  1. University of Rochester, United States
  2. Baylor College of Medicine, United States

Abstract

Humans and animals can integrate sensory evidence from various sources to make decisions in a statistically near-optimal manner, provided that the stimulus presentation time is fixed across trials. Little is known about whether optimality is preserved when subjects can choose when to make a decision (a reaction-time task), or when sensory inputs have time-varying reliability. Using a reaction-time version of a visual/vestibular heading discrimination task, we show that behavior is clearly sub-optimal when quantified with traditional optimality metrics that ignore reaction times. We created a computational model that accumulates evidence optimally across both cues and time, and trades off accuracy against decision speed. This model quantitatively explains subjects' choices and reaction times, supporting the hypothesis that subjects do, in fact, accumulate evidence optimally over time and across sensory modalities, even when the reaction time is under the subject's control.
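The model described in the abstract combines two ingredients: reliability-weighted integration of the visual and vestibular cues, and accumulation of the combined evidence to a decision bound that governs the speed–accuracy trade-off. As an illustrative sketch only (this is not the authors' implementation; the function name, the parameterization, and all values below are hypothetical), such a process can be simulated as a two-cue diffusion model:

```python
import numpy as np

def simulate_trial(k_vis, k_vest, heading, bound=1.0, dt=0.005,
                   sigma=1.0, rng=None):
    """Simulate one reaction-time trial of a two-cue diffusion model.

    Momentary evidence from each cue is weighted in proportion to its
    reliability (here taken as the squared sensitivity k), so the
    combined drift reflects statistically optimal cue combination.
    Accumulation stops when the decision variable crosses +bound
    (rightward choice) or -bound (leftward choice); the elapsed time
    is the decision time.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Inverse-variance (reliability) weights for the two cues
    w_vis = k_vis**2 / (k_vis**2 + k_vest**2)
    w_vest = 1.0 - w_vis
    # Combined drift rate for this heading direction
    drift = (w_vis * k_vis + w_vest * k_vest) * heading
    x, t = 0.0, 0.0
    while abs(x) < bound:
        # Euler step of the diffusion: drift plus Gaussian noise
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("right" if x > 0 else "left"), t
```

With a rightward heading and a reliable visual cue, most simulated choices are "right" and decision times shorten as the combined drift grows; lowering `bound` speeds decisions at the cost of accuracy, which is the trade-off the fitted model captures.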

Article and author information

Author details

  1. Jan Drugowitsch

    University of Rochester, New York, United States
    For correspondence
    jdrugo@gmail.com
    Competing interests
    No competing interests declared.
  2. Gregory C DeAngelis

    University of Rochester, New York, United States
    Competing interests
    No competing interests declared.
  3. Eliana M Klier

    Baylor College of Medicine, Houston, United States
    Competing interests
    No competing interests declared.
  4. Dora E Angelaki

    Baylor College of Medicine, Houston, United States
    Competing interests
    Dora E Angelaki, Reviewing editor, eLife.
  5. Alexandre Pouget

    University of Rochester, New York, United States
    Competing interests
    No competing interests declared.

Ethics

Human subjects: Informed consent was obtained from all participants, and all procedures were reviewed and approved by the Washington University Office of Human Research Protections (OHRP), Institutional Review Board (IRB; IRB ID# 201109183). Consent to publish was not obtained in writing, as it was not required by the IRB, but all subjects were recruited for this purpose and gave verbal approval. Of the initial seven subjects, three participated in a follow-up experiment roughly two years after the initial data collection. Procedures for the follow-up experiment were approved by the Institutional Review Board for Human Subject Research for Baylor College of Medicine and Affiliated Hospitals (BCM IRB, ID# H-29411), and informed consent and consent to publish were given again by all three subjects.

Copyright

© 2014, Drugowitsch et al.

This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Metrics

  • 6,577
    views
  • 896
    downloads

Views, downloads and citations are aggregated across all versions of this paper published by eLife.


Cite this article

  1. Jan Drugowitsch
  2. Gregory C DeAngelis
  3. Eliana M Klier
  4. Dora E Angelaki
  5. Alexandre Pouget
(2014)
Optimal multisensory decision-making in a reaction-time task
eLife 3:e03005.
https://doi.org/10.7554/eLife.03005


Further reading

    1. Neuroscience
    2. Physics of Living Systems
    Moritz Schloetter, Georg U Maret, Christoph J Kleineidam
    Research Article

    Neurons generate and propagate electrical pulses called action potentials which annihilate on arrival at the axon terminal. We measure the extracellular electric field generated by propagating and annihilating action potentials and find that on annihilation, action potentials expel a local discharge. The discharge at the axon terminal generates an inhomogeneous electric field that immediately influences target neurons and thus provokes ephaptic coupling. Our measurements are quantitatively verified by a powerful analytical model which reveals excitation and inhibition in target neurons, depending on position and morphology of the source-target arrangement. Our model is in full agreement with experimental findings on ephaptic coupling at the well-studied Basket cell-Purkinje cell synapse. It is able to predict ephaptic coupling for any other synaptic geometry as illustrated by a few examples.

    1. Neuroscience
    Sven Ohl, Martin Rolfs
    Research Article

    Detecting causal relations structures our perception of events in the world. Here, we determined for visual interactions whether generalized (i.e. feature-invariant) or specialized (i.e. feature-selective) visual routines underlie the perception of causality. To this end, we applied a visual adaptation protocol to assess the adaptability of specific features in classical launching events of simple geometric shapes. We asked observers to report whether they observed a launch or a pass in ambiguous test events (i.e. the overlap between two discs varied from trial to trial). After prolonged exposure to causal launch events (the adaptor) defined by a particular set of features (i.e. a particular motion direction, motion speed, or feature conjunction), observers were less likely to see causal launches in subsequent ambiguous test events than before adaptation. Crucially, adaptation was contingent on the causal impression in launches as demonstrated by a lack of adaptation in non-causal control events. We assessed whether this negative aftereffect transfers to test events with a new set of feature values that were not presented during adaptation. Processing in specialized (as opposed to generalized) visual routines predicts that the transfer of visual adaptation depends on the feature similarity of the adaptor and the test event. We show that the negative aftereffects do not transfer to unadapted launch directions but do transfer to launch events of different speeds. Finally, we used colored discs to assign distinct feature-based identities to the launching and the launched stimulus. We found that the adaptation transferred across colors if the test event had the same motion direction as the adaptor. In summary, visual adaptation allowed us to carve out a visual feature space underlying the perception of causality and revealed specialized visual routines that are tuned to a launch’s motion direction.