Detailed Information


Improving the robustness of gaze tracking under unconstrained illumination conditions

Full metadata record
dc.contributor.author: Uhm, Kwang-Hyun
dc.contributor.author: Kang, Mun-Cheon
dc.contributor.author: Kim, Joon-Yeon
dc.contributor.author: Ko, Sung-Jea
dc.date.accessioned: 2021-08-30T18:06:20Z
dc.date.available: 2021-08-30T18:06:20Z
dc.date.created: 2021-06-19
dc.date.issued: 2020-08
dc.identifier.issn: 1380-7501
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/53888
dc.description.abstract: In human-computer interaction (HCI) applications, the performance degradation of gaze trackers in real-world environments is a critical issue. Typically, gaze trackers utilize the pupil center and corneal reflection (CR) obtained from an infrared (IR) light source to estimate the point of regard (POR). However, false CRs are often generated due to extraneous light sources such as sunlight or lamps. In this study, we propose a method of improving the robustness of gaze tracking under unconstrained illumination conditions. First, the proposed method generates a coded CR pattern by utilizing time-multiplexed IR light sources. Next, the CR candidates are detected in eye images, and their coordinates are compensated based on the head and eye movements of the user. Finally, true CRs are selected from the motion-compensated CR candidates by utilizing a novel cost function. Experimental results indicate that the gaze-tracking performance of the proposed method under various light conditions is considerably better than that of conventional methods.
dc.language: English
dc.language.iso: en
dc.publisher: SPRINGER
dc.subject: SYSTEM
dc.subject: PUPIL
dc.title: Improving the robustness of gaze tracking under unconstrained illumination conditions
dc.type: Article
dc.contributor.affiliatedAuthor: Ko, Sung-Jea
dc.identifier.doi: 10.1007/s11042-020-08679-y
dc.identifier.scopusid: 2-s2.0-85084085853
dc.identifier.wosid: 000527911600002
dc.identifier.bibliographicCitation: MULTIMEDIA TOOLS AND APPLICATIONS, v.79, no.29-30, pp.20603 - 20616
dc.relation.isPartOf: MULTIMEDIA TOOLS AND APPLICATIONS
dc.citation.title: MULTIMEDIA TOOLS AND APPLICATIONS
dc.citation.volume: 79
dc.citation.number: 29-30
dc.citation.startPage: 20603
dc.citation.endPage: 20616
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory: Computer Science, Software Engineering
dc.relation.journalWebOfScienceCategory: Computer Science, Theory & Methods
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.subject.keywordPlus: SYSTEM
dc.subject.keywordPlus: PUPIL
dc.subject.keywordAuthor: Gaze tracking
dc.subject.keywordAuthor: Human computer interaction
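The abstract above outlines a three-stage pipeline: generating a coded CR pattern with time-multiplexed IR light sources, detecting and motion-compensating CR candidates, and selecting true CRs with a cost function. The paper's actual cost function is not given in this record, so the sketch below is only an illustration of the final selection stage under assumed criteria: the function name select_true_crs, the compactness and temporal terms, and all parameters are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): choosing "true" corneal
# reflections (CRs) from motion-compensated candidates.
# Assumed cost terms:
#   (a) compactness - true CRs from closely spaced IR LEDs should cluster tightly,
#       whereas false CRs from ambient light tend to be scattered;
#   (b) temporal consistency - the selected CR group should stay near its
#       motion-compensated position from the previous frame.
from itertools import combinations
import numpy as np

def select_true_crs(candidates, expected_on, prev_cr=None, temporal_weight=0.5):
    """
    candidates   : list of (x, y) CR candidates after motion compensation
    expected_on  : number of IR light sources switched ON in the current coded frame
    prev_cr      : (x, y) centroid of the CR group chosen in the previous frame, if any
    Returns the lowest-cost subset of candidates, or None if too few candidates exist.
    """
    if len(candidates) < expected_on:
        return None  # cannot match the coded on/off pattern

    pts = [np.asarray(c, dtype=float) for c in candidates]
    best_subset, best_cost = None, float("inf")

    for idx in combinations(range(len(candidates)), expected_on):
        subset = [pts[i] for i in idx]
        centroid = np.mean(subset, axis=0)

        # (a) spatial compactness of the candidate group
        compactness = np.mean([np.linalg.norm(p - centroid) for p in subset])

        # (b) distance from the previous frame's selected CR group
        temporal = np.linalg.norm(centroid - prev_cr) if prev_cr is not None else 0.0

        cost = compactness + temporal_weight * temporal
        if cost < best_cost:
            best_cost, best_subset = cost, [candidates[i] for i in idx]

    return best_subset
```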
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Engineering > School of Electrical Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
