A Subject-Transfer Framework Based on Single-Trial EMG Analysis Using Convolutional Neural Networks
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kim, Keun-Tae | - |
dc.contributor.author | Guan, Cuntai | - |
dc.contributor.author | Lee, Seong-Whan | - |
dc.date.accessioned | 2021-08-31T15:12:35Z | - |
dc.date.available | 2021-08-31T15:12:35Z | - |
dc.date.created | 2021-06-18 | - |
dc.date.issued | 2020-01 | - |
dc.identifier.issn | 1534-4320 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/58550 | - |
dc.description.abstract | In recent years, electromyography (EMG)-based practical myoelectric interfaces have been developed to improve the quality of daily life for people with physical disabilities. For these interfaces, decoding the user's movement intention is essential to properly control external devices. However, improving their performance is difficult due to the high variation in EMG signal patterns caused by intra-user variability. Therefore, this paper proposes a novel subject-transfer framework for decoding hand movements that is robust to intra-user variability. In the proposed framework, supportive convolutional neural network (CNN) classifiers, pre-trained on the EMG data of several subjects, are selected and fine-tuned for the target subject via single-trial analysis. The target subject's hand movements are then classified by voting over the outputs of the supportive CNN classifiers. The feasibility of the proposed framework is validated on NinaPro databases 2 and 3, which comprise 49 hand movements performed by 40 healthy and 11 amputee subjects, respectively. The experimental results indicate that, compared to a self-decoding framework that uses only the target subject's data, the proposed framework decodes hand movements with improved performance for both healthy and amputee subjects. These results suggest that the proposed subject-transfer framework is a useful tool for practical EMG-based myoelectric interfaces controlling external devices. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.subject | OF-THE-ART | - |
dc.subject | PATTERN-RECOGNITION | - |
dc.subject | REAL-TIME | - |
dc.subject | MYOELECTRIC CONTROL | - |
dc.subject | SIGNALS | - |
dc.title | A Subject-Transfer Framework Based on Single-Trial EMG Analysis Using Convolutional Neural Networks | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Lee, Seong-Whan | - |
dc.identifier.doi | 10.1109/TNSRE.2019.2946625 | - |
dc.identifier.scopusid | 2-s2.0-85078358351 | - |
dc.identifier.wosid | 000508375400010 | - |
dc.identifier.bibliographicCitation | IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, v.28, no.1, pp.94 - 103 | - |
dc.relation.isPartOf | IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING | - |
dc.citation.title | IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING | - |
dc.citation.volume | 28 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 94 | - |
dc.citation.endPage | 103 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Rehabilitation | - |
dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
dc.relation.journalWebOfScienceCategory | Rehabilitation | - |
dc.subject.keywordPlus | OF-THE-ART | - |
dc.subject.keywordPlus | PATTERN-RECOGNITION | - |
dc.subject.keywordPlus | REAL-TIME | - |
dc.subject.keywordPlus | MYOELECTRIC CONTROL | - |
dc.subject.keywordPlus | SIGNALS | - |
dc.subject.keywordAuthor | Subject-transfer framework | - |
dc.subject.keywordAuthor | myoelectric interfaces | - |
dc.subject.keywordAuthor | electromyography | - |
dc.subject.keywordAuthor | convolutional neural networks | - |
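The abstract above outlines a two-step scheme: classifiers pre-trained on other subjects are screened against a few single trials from the target subject, and the surviving "supportive" classifiers then vote on new trials. A minimal sketch of that selection-and-voting logic, with simple nearest-centroid models standing in for the paper's pre-trained CNNs and the fine-tuning step omitted (all names and thresholds here are illustrative assumptions, not the authors' implementation):

```python
# Sketch of supportive-classifier selection and voting, per the abstract.
# Nearest-centroid predictors are stand-ins for pre-trained CNN classifiers;
# the 0.75 accuracy threshold is a hypothetical selection criterion.
import numpy as np

def make_classifier(centroids):
    """Return a nearest-centroid predictor (stand-in for a pre-trained CNN)."""
    def predict(x):
        return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))
    return predict

# Three source-subject "classifiers" over 2 classes in a 2-D feature space.
good = make_classifier(np.array([[0.0, 0.0], [5.0, 5.0]]))       # matches target
also_good = make_classifier(np.array([[0.5, -0.5], [4.5, 5.5]]))
bad = make_classifier(np.array([[5.0, 5.0], [0.0, 0.0]]))        # labels flipped
pool = [good, also_good, bad]

# A few single-trial calibration samples from the target subject.
calib_x = np.array([[0.1, 0.2], [4.9, 5.1], [-0.2, 0.1], [5.2, 4.8]])
calib_y = np.array([0, 1, 0, 1])

def select_supportive(pool, X, y, threshold=0.75):
    """Keep only classifiers whose single-trial accuracy clears the threshold."""
    keep = []
    for clf in pool:
        acc = np.mean([clf(x) == t for x, t in zip(X, y)])
        if acc >= threshold:
            keep.append(clf)
    return keep

def vote(classifiers, x):
    """Combine the supportive classifiers' outputs by plurality vote."""
    votes = [clf(x) for clf in classifiers]
    return int(np.bincount(votes).argmax())

supportive = select_supportive(pool, calib_x, calib_y)
prediction = vote(supportive, np.array([0.3, -0.1]))
print(len(supportive), prediction)  # the label-flipped classifier is screened out
```

The screening step is what makes the ensemble "subject-transfer": only source-subject models whose decision boundaries happen to agree with the target subject's calibration trials contribute to the final vote.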