EEG-based affective state recognition from human brain signals by using Hjorth-activity
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Mehmood, Raja Majid | - |
dc.contributor.author | Bilal, Muhammad | - |
dc.contributor.author | Vimal, S. | - |
dc.contributor.author | Lee, Seong-Whan | - |
dc.date.accessioned | 2022-11-17T22:41:02Z | - |
dc.date.available | 2022-11-17T22:41:02Z | - |
dc.date.created | 2022-11-17 | - |
dc.date.issued | 2022-10 | - |
dc.identifier.issn | 0263-2241 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/145689 | - |
dc.description.abstract | EEG-based emotion recognition enables investigation of human brain activity, which is recognized as an important factor in brain-computer interfaces. In recent years, several methods have been studied to find optimal features from brain signals. The main limitation of existing studies is that they either consider very few emotion classes or employ a large feature set. To overcome these issues, we propose a novel Hjorth-feature-based emotion recognition model. Unlike other methods, our proposed method explores a wider set of emotion classes in the arousal-valence domain. To reduce the dimension of the feature set, we employ Hjorth parameters (HPs) and analyze the parameters in the frequency domain. At the same time, our study focused on maintaining the accuracy of emotion recognition for four emotional classes. The average accuracy was approximately 69%, 76%, 85%, 59%, and 87% for DEAP, SEED-IV, DREAMER, SELEMO, and ASCERTAIN, respectively. Results show that the features from HP activity with random forest outperform all the classic methods of EEG-based emotion recognition. (An illustrative sketch of this kind of pipeline is given after the metadata table.) | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | ELSEVIER SCI LTD | - |
dc.subject | EVENT-RELATED SYNCHRONIZATION | - |
dc.subject | EMOTION RECOGNITION | - |
dc.subject | FEATURE-SELECTION | - |
dc.subject | FEATURE-EXTRACTION | - |
dc.subject | EXPERIENCE | - |
dc.subject | DESYNCHRONIZATION | - |
dc.subject | ASYMMETRIES | - |
dc.subject | AROUSAL | - |
dc.subject | SYSTEM | - |
dc.subject | THETA | - |
dc.title | EEG-based affective state recognition from human brain signals by using Hjorth-activity | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Lee, Seong-Whan | - |
dc.identifier.doi | 10.1016/j.measurement.2022.111738 | - |
dc.identifier.scopusid | 2-s2.0-85138441095 | - |
dc.identifier.wosid | 000859313200002 | - |
dc.identifier.bibliographicCitation | MEASUREMENT, v.202 | - |
dc.relation.isPartOf | MEASUREMENT | - |
dc.citation.title | MEASUREMENT | - |
dc.citation.volume | 202 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Engineering, Multidisciplinary | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.subject.keywordPlus | EVENT-RELATED SYNCHRONIZATION | - |
dc.subject.keywordPlus | EMOTION RECOGNITION | - |
dc.subject.keywordPlus | FEATURE-SELECTION | - |
dc.subject.keywordPlus | FEATURE-EXTRACTION | - |
dc.subject.keywordPlus | EXPERIENCE | - |
dc.subject.keywordPlus | DESYNCHRONIZATION | - |
dc.subject.keywordPlus | ASYMMETRIES | - |
dc.subject.keywordPlus | AROUSAL | - |
dc.subject.keywordPlus | SYSTEM | - |
dc.subject.keywordPlus | THETA | - |
dc.subject.keywordAuthor | EEG | - |
dc.subject.keywordAuthor | Affective state | - |
dc.subject.keywordAuthor | Emotion recognition | - |
dc.subject.keywordAuthor | DEAP | - |
dc.subject.keywordAuthor | SEED-IV | - |
dc.subject.keywordAuthor | DREAMER | - |
dc.subject.keywordAuthor | SELEMO | - |
dc.subject.keywordAuthor | ASCERTAIN | - |
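
As referenced in the abstract above, the paper's core idea is to use Hjorth parameters, especially Hjorth activity computed per frequency band, as a compact EEG feature set fed to a random forest classifier. The following minimal Python sketch illustrates that general approach only; the band ranges, sampling rate, filter design, classifier settings, and data layout are assumptions for illustration and are not taken from the paper's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier


def hjorth_parameters(x):
    """Hjorth parameters of a 1-D signal x:
    activity = var(x), mobility = sqrt(var(x')/var(x)),
    complexity = mobility(x') / mobility(x)."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity


def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter (assumed design choice)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)


# Assumed setup: EEG epochs shaped (n_trials, n_channels, n_samples) at fs Hz,
# with labels for the four arousal-valence quadrants (0..3).
fs = 128
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}


def activity_features(epochs):
    """Band-wise Hjorth activity for every channel of every trial."""
    feats = []
    for trial in epochs:
        row = []
        for channel in trial:
            for low, high in BANDS.values():
                row.append(hjorth_parameters(bandpass(channel, low, high, fs))[0])
        feats.append(row)
    return np.asarray(feats)


# Hypothetical usage with a dataset such as DEAP (loading not shown):
# clf = RandomForestClassifier(n_estimators=100, random_state=0)
# clf.fit(activity_features(X_train), y_train)
# y_pred = clf.predict(activity_features(X_test))
```

Using only the activity term per band keeps the feature dimension at n_channels x n_bands, which mirrors the paper's stated goal of a small feature set, though the exact feature construction in the published model may differ.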