Detailed Information

Cited 3 times in Web of Science · Cited 4 times in Scopus

Neural Decoding of Imagined Speech and Visual Imagery as Intuitive Paradigms for BCI Communication

Authors
Lee, Seo-Hyun; Lee, Minji; Lee, Seong-Whan
Issue Date
Dec-2020
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Keywords
Visualization; Decoding; Electroencephalography; Support vector machines; Performance evaluation; Electrodes; Training; Brain-computer interface; BCI inefficiency; electroencephalography; imagined speech; intuitive BCI; visual imagery
Citation
IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, v.28, no.12, pp.2647 - 2659
Indexed
SCIE
SCOPUS
Journal Title
IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING
Volume
28
Number
12
Start Page
2647
End Page
2659
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/51377
DOI
10.1109/TNSRE.2020.3040289
ISSN
1534-4320
Abstract
Brain-computer interfaces (BCIs) are oriented toward intuitive systems that users can operate easily. Imagined speech and visual imagery are emerging paradigms that can directly convey a user's intention. We investigated the underlying characteristics that affect the decoding performance of these two paradigms. Twenty-two subjects performed imagined speech and visual imagery of twelve words/phrases frequently used for patients' communication. Spectral features were analyzed with thirteen-class classification (including a rest class) using EEG filtered in six frequency ranges. In addition, cortical regions relevant to the two paradigms were analyzed by classification using single channels and pre-defined cortical groups. Furthermore, we analyzed the word properties that affect decoding performance based on the number of syllables, concrete versus abstract concepts, and the correlation between the two paradigms. Finally, we investigated multiclass scalability in both paradigms. The high-frequency band displayed significantly superior performance compared with any other spectral feature in the thirteen-class classification (imagined speech: 39.73 +/- 5.64%; visual imagery: 40.14 +/- 4.17%). Furthermore, Broca's and Wernicke's areas and the auditory cortex showed improved performance among the cortical regions in both paradigms. As the number of classes increased, the decoding performance decreased moderately. Moreover, every subject exceeded the confidence-level performance, implying the strength of the two paradigms against BCI inefficiency. These two intuitive paradigms were found to be highly effective for multiclass communication systems, with considerable similarities between them. The results could provide crucial information for improving decoding performance in practical BCI applications.
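The pipeline described in the abstract (band-pass filtering EEG into frequency ranges, extracting spectral features, and running a thirteen-class classification with a support vector machine) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation; the sampling rate, band edges (a hypothetical 70–120 Hz high-frequency range), epoch length, and log band-power feature are all assumptions for the sake of the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                      # assumed EEG sampling rate (Hz)
n_channels, n_samples = 64, fs * 2   # 2-second epochs, 64 channels (assumed)
n_classes, trials_per_class = 13, 10 # 12 words/phrases + rest

# Synthetic stand-in for segmented EEG epochs: (trials, channels, samples)
y = np.repeat(np.arange(n_classes), trials_per_class)
X = rng.standard_normal((y.size, n_channels, n_samples))

def bandpass(data, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter along the time axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

# Filter into one frequency range (hypothetical high-frequency band)
Xf = bandpass(X, 70, 120, fs)

# Spectral feature: log band power per channel -> (trials, channels)
features = np.log(np.mean(Xf ** 2, axis=-1))

# Thirteen-class SVM classification with cross-validation
clf = SVC(kernel="linear")
scores = cross_val_score(clf, features, y, cv=5)
print(round(scores.mean(), 3))
```

On random data the accuracy hovers near the 1/13 chance level; repeating the loop over several frequency bands and comparing their cross-validated accuracies mirrors the spectral analysis the abstract describes.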
Files in This Item
There are no files associated with this item.
Appears in
Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


Lee, Seong Whan
Department of Artificial Intelligence
