Improving the robustness of gaze tracking under unconstrained illumination conditions
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Uhm, Kwang-Hyun | - |
dc.contributor.author | Kang, Mun-Cheon | - |
dc.contributor.author | Kim, Joon-Yeon | - |
dc.contributor.author | Ko, Sung-Jea | - |
dc.date.accessioned | 2021-08-30T18:06:20Z | - |
dc.date.available | 2021-08-30T18:06:20Z | - |
dc.date.created | 2021-06-19 | - |
dc.date.issued | 2020-08 | - |
dc.identifier.issn | 1380-7501 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/53888 | - |
dc.description.abstract | In human-computer interaction (HCI) applications, the performance degradation of gaze trackers in real-world environments is a critical issue. Typically, gaze trackers utilize the pupil center and corneal reflection (CR) obtained from an infrared (IR) light source to estimate the point of regard (POR). However, false CRs are often generated by extraneous light sources such as sunlight or lamps. In this study, we propose a method for improving the robustness of gaze tracking under unconstrained illumination conditions. First, the proposed method generates a coded CR pattern by utilizing time-multiplexed IR light sources. Next, CR candidates are detected in eye images, and their coordinates are compensated based on the head and eye movements of the user. Finally, true CRs are selected from the motion-compensated CR candidates by utilizing a novel cost function. Experimental results indicate that the gaze-tracking performance of the proposed method under various light conditions is considerably better than that of conventional methods. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | SPRINGER | - |
dc.subject | SYSTEM | - |
dc.subject | PUPIL | - |
dc.title | Improving the robustness of gaze tracking under unconstrained illumination conditions | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Ko, Sung-Jea | - |
dc.identifier.doi | 10.1007/s11042-020-08679-y | - |
dc.identifier.scopusid | 2-s2.0-85084085853 | - |
dc.identifier.wosid | 000527911600002 | - |
dc.identifier.bibliographicCitation | MULTIMEDIA TOOLS AND APPLICATIONS, v.79, no.29-30, pp.20603 - 20616 | - |
dc.relation.isPartOf | MULTIMEDIA TOOLS AND APPLICATIONS | - |
dc.citation.title | MULTIMEDIA TOOLS AND APPLICATIONS | - |
dc.citation.volume | 79 | - |
dc.citation.number | 29-30 | - |
dc.citation.startPage | 20603 | - |
dc.citation.endPage | 20616 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Software Engineering | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.subject.keywordPlus | SYSTEM | - |
dc.subject.keywordPlus | PUPIL | - |
dc.subject.keywordAuthor | Gaze tracking | - |
dc.subject.keywordAuthor | Human computer interaction | - |
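The abstract's final step (selecting true CRs from motion-compensated candidates via a cost function) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual formulation: it assumes each IR source is pulsed in a known on/off code across frames, and scores each tracked candidate by how often its per-frame presence disagrees with that code; the function names, code length, and threshold are all illustrative.

```python
# Hypothetical sketch of true-CR selection from time-multiplexed IR codes.
# The cost function and threshold below are assumptions for illustration,
# not the cost function proposed in the paper.

def code_match_cost(observed, expected):
    """Cost = fraction of frames where a candidate's on/off presence
    disagrees with the IR source's time-multiplexed code."""
    assert len(observed) == len(expected)
    mismatches = sum(o != e for o, e in zip(observed, expected))
    return mismatches / len(expected)

def select_true_crs(candidates, expected_code, max_cost=0.25):
    """candidates: dict name -> per-frame presence flags (0/1), assumed
    already motion-compensated so each candidate is tracked across frames.
    Returns the names whose temporal pattern matches the IR code."""
    return [name for name, observed in candidates.items()
            if code_match_cost(observed, expected_code) <= max_cost]

# Example: one IR source pulsed with code 1,0,1,1,0,1 over six frames.
code = [1, 0, 1, 1, 0, 1]
candidates = {
    "cr_a": [1, 0, 1, 1, 0, 1],   # follows the code -> true CR
    "lamp": [1, 1, 1, 1, 1, 1],   # constant ambient reflection
    "sun":  [1, 1, 1, 0, 1, 1],   # unrelated flicker
}
print(select_true_crs(candidates, code))  # -> ['cr_a']
```

The key idea is that a genuine CR appears and disappears in lockstep with its modulated IR source, while reflections from sunlight or lamps do not, so a simple temporal-agreement cost separates them even when they are spatially indistinguishable in a single frame.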
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
(02841) 145 Anam-ro, Seongbuk-gu, Seoul, Republic of Korea. Tel: 02-3290-1114
COPYRIGHT © 2021 Korea University. All Rights Reserved.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.