ABSTRACT
The brain efficiently processes multisensory information by selectively combining related signals across the continuous stream of multisensory inputs. To do so, it needs to detect correlation, lag and synchrony across the senses; optimally integrate related information; and dynamically adapt to spatiotemporal conflicts across the senses. Here we show that all these aspects of multisensory perception can be jointly explained by postulating an elementary processing unit akin to the Hassenstein–Reichardt detector, a model originally developed for visual motion perception. This unit, termed the multisensory correlation detector (MCD), integrates related multisensory signals through a set of temporal filters followed by linear combination. Our model tightly replicates human perception as measured in a series of empirical studies, both novel and previously published. MCDs provide a unified general theory of multisensory processing, which simultaneously explains a wide spectrum of phenomena with a simple, yet physiologically plausible model.

The human brain integrates inputs across multiple sensory streams into a unified percept. Here Parise and Ernst present a model that assesses the correlation, lag and synchrony of multisensory stimuli, and predicts psychophysical performance on multisensory temporal and spatial tasks.
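The abstract's description of the MCD, temporal filtering of each signal followed by Reichardt-style cross-multiplication, can be sketched numerically. The sketch below is illustrative only: the filter time constants, the first-order exponential filter shape, and the combination stage are assumptions for demonstration, not the fitted parameters or exact filter cascade reported in the paper.

```python
import numpy as np

def lowpass(signal, tau, dt=0.001):
    # First-order exponential low-pass filter, kernel h(t) = exp(-t/tau),
    # normalized to unit sum; tau in seconds, dt is the sample period.
    t = np.arange(0.0, 5.0 * tau, dt)
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()
    return np.convolve(signal, kernel)[: len(signal)]

def mcd(sig_a, sig_b, tau_fast=0.1, tau_slow=0.2, dt=0.001):
    """Minimal multisensory correlation detector sketch.

    Each subunit multiplies the fast-filtered trace of one modality
    with the slow-filtered trace of the other (Reichardt-style
    cross-multiplication). The product of the two subunits indexes
    correlation; their difference indexes lag (positive when the
    first input leads). Time constants here are illustrative
    assumptions, not values from the paper.
    """
    a_fast = lowpass(sig_a, tau_fast, dt)
    a_slow = lowpass(sig_a, tau_slow, dt)
    b_fast = lowpass(sig_b, tau_fast, dt)
    b_slow = lowpass(sig_b, tau_slow, dt)
    u1 = a_fast * b_slow          # subunit 1
    u2 = b_fast * a_slow          # subunit 2
    mcd_corr = u1 * u2            # correlation output
    mcd_lag = u2 - u1             # lag output; sign encodes temporal order
    return mcd_corr, mcd_lag
```

Feeding the detector two impulse trains shows the intended behaviour: synchronous impulses maximize the correlation output and zero the lag output, while shifting one stream attenuates the correlation and flips the sign of the lag output with temporal order.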
PUBLICATION RECORD
- Publication year
2016
- Venue
Nature Communications
- Publication date
2016-06-06
- Fields of study
Biology, Medicine, Computer Science, Psychology
- Source metadata
Semantic Scholar, PubMed