Forced Fusion in Multisensory Heading Estimation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | de Winkel, Ksander N. | - |
dc.contributor.author | Katliar, Mikhail | - |
dc.contributor.author | Buelthoff, Heinrich H. | - |
dc.date.accessioned | 2021-09-04T16:14:42Z | - |
dc.date.available | 2021-09-04T16:14:42Z | - |
dc.date.created | 2021-06-18 | - |
dc.date.issued | 2015-05-04 | - |
dc.identifier.issn | 1932-6203 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/93591 | - |
dc.description.abstract | It has been shown that the Central Nervous System (CNS) integrates visual and inertial information in heading estimation for congruent multisensory stimuli and stimuli with small discrepancies. Multisensory information should, however, only be integrated when the cues are redundant. Here, we investigated how the CNS constructs an estimate of heading for combinations of visual and inertial heading stimuli with a wide range of discrepancies. Participants were presented with 2-s visual-only and inertial-only motion stimuli, and combinations thereof. Discrepancies between visual and inertial heading ranging from 0 to 90 degrees were introduced for the combined stimuli. In the unisensory conditions, it was found that visual heading was generally biased towards the fore-aft axis, while inertial heading was biased away from the fore-aft axis. For multisensory stimuli, it was found that five out of nine participants integrated visual and inertial heading information regardless of the size of the discrepancy; for one participant, the data were best described by a model that explicitly performs causal inference. For the remaining three participants the evidence could not readily distinguish between these models. The finding that multisensory information is integrated is in line with earlier findings, but the finding that even large discrepancies are generally disregarded is surprising. Possibly, people are insensitive to discrepancies in visual-inertial heading angle because such discrepancies are only encountered in artificial environments, making a neural mechanism to account for them otiose. An alternative explanation is that detection of a discrepancy may depend on stimulus duration, where sensitivity to detect discrepancies differs between people. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | PUBLIC LIBRARY SCIENCE | - |
dc.subject | OPTIC FLOW STIMULI | - |
dc.subject | VESTIBULAR CUE INTEGRATION | - |
dc.subject | MST NEURONS | - |
dc.subject | RESPONSE SELECTIVITY | - |
dc.subject | INERTIAL CUES | - |
dc.subject | PERCEPTION | - |
dc.subject | SIGNALS | - |
dc.subject | INFORMATION | - |
dc.subject | SENSITIVITY | - |
dc.title | Forced Fusion in Multisensory Heading Estimation | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Buelthoff, Heinrich H. | - |
dc.identifier.doi | 10.1371/journal.pone.0127104 | - |
dc.identifier.scopusid | 2-s2.0-84929120372 | - |
dc.identifier.wosid | 000353943000163 | - |
dc.identifier.bibliographicCitation | PLOS ONE, v.10, no.5 | - |
dc.relation.isPartOf | PLOS ONE | - |
dc.citation.title | PLOS ONE | - |
dc.citation.volume | 10 | - |
dc.citation.number | 5 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Science & Technology - Other Topics | - |
dc.relation.journalWebOfScienceCategory | Multidisciplinary Sciences | - |
dc.subject.keywordPlus | OPTIC FLOW STIMULI | - |
dc.subject.keywordPlus | VESTIBULAR CUE INTEGRATION | - |
dc.subject.keywordPlus | MST NEURONS | - |
dc.subject.keywordPlus | RESPONSE SELECTIVITY | - |
dc.subject.keywordPlus | INERTIAL CUES | - |
dc.subject.keywordPlus | PERCEPTION | - |
dc.subject.keywordPlus | SIGNALS | - |
dc.subject.keywordPlus | INFORMATION | - |
dc.subject.keywordPlus | SENSITIVITY | - |
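The "forced fusion" account in the abstract corresponds to standard maximum-likelihood cue integration, in which two sensory estimates are averaged with weights proportional to their reliabilities (inverse variances), regardless of how discrepant they are. The sketch below illustrates that computation only; the function names, headings, and variances are illustrative placeholders, not the authors' fitted model or values from the study.

```python
# Minimal sketch of reliability-weighted ("forced fusion") cue integration.
# All numeric values are illustrative assumptions, not data from the article.

def fused_heading(h_visual, var_visual, h_inertial, var_inertial):
    """Combine two heading estimates (degrees) by inverse-variance weighting."""
    w_v = 1.0 / var_visual
    w_i = 1.0 / var_inertial
    return (w_v * h_visual + w_i * h_inertial) / (w_v + w_i)

def fused_variance(var_visual, var_inertial):
    """Variance of the fused estimate; never larger than either single-cue variance."""
    return 1.0 / (1.0 / var_visual + 1.0 / var_inertial)

# Under forced fusion, even a 90-degree visual/inertial discrepancy is averaged:
print(fused_heading(0.0, 4.0, 90.0, 4.0))  # equal reliabilities -> 45.0
print(fused_variance(4.0, 4.0))            # 2.0, below either cue alone
```

A causal-inference model, by contrast, would down-weight or segregate the cues once the discrepancy makes a common cause unlikely; the point of the sketch is that forced fusion has no such escape hatch.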