Memory-based attentional capture by colour and shape contents in visual working memory
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Sunghyun | - |
dc.contributor.author | Cho, Yang Seok | - |
dc.date.accessioned | 2021-09-04T05:20:13Z | - |
dc.date.available | 2021-09-04T05:20:13Z | - |
dc.date.created | 2021-06-18 | - |
dc.date.issued | 2016 | - |
dc.identifier.issn | 1350-6285 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/90299 | - |
dc.description.abstract | Current theories assume that there is substantial overlap between visual working memory (VWM) and visual attention functioning, such that active representations in VWM automatically act as an attentional set, resulting in attentional biases towards objects that match the mnemonic content. Most evidence for this comes from visual search tasks in which a distractor similar to the memory interferes with the detection of a simultaneous target. Here we provide additional evidence using one of the most popular paradigms in the literature for demonstrating an active attentional set: the contingent spatial orienting paradigm of Folk and colleagues. This paradigm allows memory-based attentional biases to be more directly attributed to spatial orienting. Experiment 1 demonstrated a memory-contingent spatial attention effect for colour but not for shape contents of VWM. Experiment 2 tested the hypothesis that the placeholders used for spatial cueing interfered with shape processing, and showed that memory-based attentional capture for shape returned when the placeholders were removed. The results of the present study are consistent with earlier findings from distractor interference paradigms, and provide additional evidence that biases in spatial orienting contribute to memory-based influences on attention. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | ROUTLEDGE JOURNALS, TAYLOR & FRANCIS LTD | - |
dc.subject | CONTROL SETTINGS | - |
dc.subject | TOP-DOWN | - |
dc.subject | SELECTIVE ATTENTION | - |
dc.subject | CONTINGENT CAPTURE | - |
dc.subject | ELECTROPHYSIOLOGICAL EVIDENCE | - |
dc.subject | EYE-MOVEMENTS | - |
dc.subject | ABRUPT ONSET | - |
dc.subject | TERM-MEMORY | - |
dc.subject | SEARCH | - |
dc.subject | GUIDANCE | - |
dc.title | Memory-based attentional capture by colour and shape contents in visual working memory | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Cho, Yang Seok | - |
dc.identifier.doi | 10.1080/13506285.2016.1184734 | - |
dc.identifier.scopusid | 2-s2.0-84979009442 | - |
dc.identifier.wosid | 000380770700005 | - |
dc.identifier.bibliographicCitation | VISUAL COGNITION, v.24, no.1, pp.51-62 | - |
dc.relation.isPartOf | VISUAL COGNITION | - |
dc.citation.title | VISUAL COGNITION | - |
dc.citation.volume | 24 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 51 | - |
dc.citation.endPage | 62 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | ssci | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Psychology | - |
dc.relation.journalWebOfScienceCategory | Psychology, Experimental | - |
dc.subject.keywordPlus | CONTROL SETTINGS | - |
dc.subject.keywordPlus | TOP-DOWN | - |
dc.subject.keywordPlus | SELECTIVE ATTENTION | - |
dc.subject.keywordPlus | CONTINGENT CAPTURE | - |
dc.subject.keywordPlus | ELECTROPHYSIOLOGICAL EVIDENCE | - |
dc.subject.keywordPlus | EYE-MOVEMENTS | - |
dc.subject.keywordPlus | ABRUPT ONSET | - |
dc.subject.keywordPlus | TERM-MEMORY | - |
dc.subject.keywordPlus | SEARCH | - |
dc.subject.keywordPlus | GUIDANCE | - |
dc.subject.keywordAuthor | Visual search | - |
dc.subject.keywordAuthor | attentional capture | - |
dc.subject.keywordAuthor | working memory | - |
dc.subject.keywordAuthor | attentional set | - |
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.