Detailed Information


Gesture spotting and recognition for human-robot interaction

Full metadata record
DC Field: Value
dc.contributor.author: Yang, Hee-Deok
dc.contributor.author: Park, A-Yeon
dc.contributor.author: Lee, Seong-Whan
dc.date.accessioned: 2021-09-09T17:23:09Z
dc.date.available: 2021-09-09T17:23:09Z
dc.date.created: 2021-06-10
dc.date.issued: 2007-04
dc.identifier.issn: 1552-3098
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/125793
dc.description.abstract: Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research focused on issues such as hand gestures, sign language, and command gesture recognition. Automatic recognition of whole-body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body gestures is a complex task. This paper presents a new method for recognition of whole-body key gestures in HRI. A human subject is first described by a set of features, encoding the angular relationship between a dozen body parts in 3-D. A feature vector is then mapped to a codeword of hidden Markov models. In order to spot key gestures accurately, a sophisticated method of designing a transition gesture model is proposed. To reduce the states of the transition gesture model, model reduction, which merges similar states based on data-dependent statistics and relative entropy, is used. The experimental results demonstrate that the proposed method can be efficient and effective in HRI, for automatic recognition of whole-body key gestures from motion sequences.
dc.language: English
dc.language.iso: en
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: Gesture spotting and recognition for human-robot interaction
dc.type: Article
dc.contributor.affiliatedAuthor: Lee, Seong-Whan
dc.identifier.doi: 10.1109/TRO.2006.889491
dc.identifier.scopusid: 2-s2.0-34247223015
dc.identifier.wosid: 000245904500007
dc.identifier.bibliographicCitation: IEEE TRANSACTIONS ON ROBOTICS, v.23, no.2, pp.256 - 270
dc.relation.isPartOf: IEEE TRANSACTIONS ON ROBOTICS
dc.citation.title: IEEE TRANSACTIONS ON ROBOTICS
dc.citation.volume: 23
dc.citation.number: 2
dc.citation.startPage: 256
dc.citation.endPage: 270
dc.type.rims: ART
dc.type.docType: Article; Proceedings Paper
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Robotics
dc.relation.journalWebOfScienceCategory: Robotics
dc.subject.keywordAuthor: gesture spotting
dc.subject.keywordAuthor: hidden Markov model (HMM)
dc.subject.keywordAuthor: human-robot interaction (HRI)
dc.subject.keywordAuthor: mobile robot
dc.subject.keywordAuthor: transition gesture model
dc.subject.keywordAuthor: whole-body gesture recognition
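The abstract's model-reduction step, merging similar HMM states based on relative entropy, can be sketched as follows. This is not the paper's implementation: the symmetric Kullback-Leibler criterion, the fixed threshold, the greedy grouping, and the averaging of merged emission distributions are all illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete relative entropy D(p || q), with smoothing to avoid log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    """Symmetrized KL divergence, so the similarity measure is order-free."""
    return 0.5 * (kl_divergence(p, q) + kl_divergence(q, p))

def merge_similar_states(emissions, threshold=0.1):
    """Greedily group states whose discrete emission distributions lie
    within `threshold` symmetric KL of a group representative; each merged
    state's emission is the average over its group.
    (Illustrative criterion only, not the paper's exact procedure.)"""
    groups = []  # list of (representative distribution, member indices)
    for i, e in enumerate(emissions):
        for rep, members in groups:
            if symmetric_kl(rep, e) < threshold:
                members.append(i)
                break
        else:
            groups.append((np.asarray(e, dtype=float), [i]))
    merged = [np.mean([emissions[i] for i in m], axis=0) for _, m in groups]
    return merged, [m for _, m in groups]
```

For example, two near-identical emission distributions collapse into one state while a distinct one survives, so a three-state transition model reduces to two states.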
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Seong-Whan
Department of Artificial Intelligence
