Detailed Information

The eyes grasp, the hands see: Metric category knowledge transfers between vision and touch

Full metadata record
DC Field: Value
dc.contributor.author: Wallraven, Christian
dc.contributor.author: Buelthoff, Heinrich H.
dc.contributor.author: Waterkamp, Steffen
dc.contributor.author: van Dam, Loes
dc.contributor.author: Gaiert, Nina
dc.date.accessioned: 2021-09-05T06:18:07Z
dc.date.available: 2021-09-05T06:18:07Z
dc.date.created: 2021-06-15
dc.date.issued: 2014-08
dc.identifier.issn: 1069-9384
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/97751
dc.description.abstract: Categorization of seen objects is often determined by the shapes of objects. However, shape is not exclusive to the visual modality: The haptic system also is expert at identifying shapes. Hence, an important question for understanding shape processing is whether humans store separate modality-dependent shape representations, or whether information is integrated into one multisensory representation. To answer this question, we created a metric space of computer-generated novel objects varying in shape. These objects were then printed using a 3-D printer, to generate tangible stimuli. In a categorization experiment, participants first explored the objects visually and haptically. We found that both modalities led to highly similar categorization behavior. Next, participants were trained either visually or haptically on shape categories within the metric space. As expected, visual training increased visual performance, and haptic training increased haptic performance. Importantly, however, we found that visual training also improved haptic performance, and vice versa. Two additional experiments showed that the location of the categorical boundary in the metric space also transferred across modalities, as did heightened discriminability of objects adjacent to the boundary. This observed transfer of metric category knowledge across modalities indicates that visual and haptic forms of shape information are integrated into a shared multisensory representation.
dc.language: English
dc.language.iso: en
dc.publisher: SPRINGER
dc.subject: PERCEPTION
dc.subject: OBJECTS
dc.subject: FAMILIAR
dc.title: The eyes grasp, the hands see: Metric category knowledge transfers between vision and touch
dc.type: Article
dc.contributor.affiliatedAuthor: Wallraven, Christian
dc.identifier.doi: 10.3758/s13423-013-0563-4
dc.identifier.scopusid: 2-s2.0-84904461442
dc.identifier.wosid: 000339727600010
dc.identifier.bibliographicCitation: PSYCHONOMIC BULLETIN & REVIEW, v.21, no.4, pp.976 - 985
dc.relation.isPartOf: PSYCHONOMIC BULLETIN & REVIEW
dc.citation.title: PSYCHONOMIC BULLETIN & REVIEW
dc.citation.volume: 21
dc.citation.number: 4
dc.citation.startPage: 976
dc.citation.endPage: 985
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: ssci
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Psychology
dc.relation.journalWebOfScienceCategory: Psychology, Mathematical
dc.relation.journalWebOfScienceCategory: Psychology, Experimental
dc.subject.keywordPlus: PERCEPTION
dc.subject.keywordPlus: OBJECTS
dc.subject.keywordPlus: FAMILIAR
dc.subject.keywordAuthor: Shape
dc.subject.keywordAuthor: Object categorization
dc.subject.keywordAuthor: Vision
dc.subject.keywordAuthor: Haptics
dc.subject.keywordAuthor: Categorization
dc.subject.keywordAuthor: Multisensory representations
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles

Related Researcher

Wallraven, Christian
Department of Artificial Intelligence