Detailed Information

A comparison of geometric- and regression-based mobile gaze-tracking

Full metadata record
DC Field: Value
dc.contributor.author: Browatzki, Bjoern
dc.contributor.author: Buelthoff, Heinrich H.
dc.contributor.author: Chuang, Lewis L.
dc.date.accessioned: 2021-09-05T09:42:29Z
dc.date.available: 2021-09-05T09:42:29Z
dc.date.created: 2021-06-15
dc.date.issued: 2014-04-08
dc.identifier.issn: 1662-5161
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/98776
dc.description.abstract: Video-based gaze-tracking systems are typically restricted in terms of their effective tracking space. This constraint limits the use of eyetrackers in studying mobile human behavior. Here, we compare two possible approaches for estimating the gaze of participants who are free to walk in a large space whilst looking at different regions of a large display. Geometrically, we linearly combined eye-in-head rotations and head-in-world coordinates to derive a gaze vector and its intersection with a planar display, by relying on the use of a head-mounted eyetracker and body-motion tracker. Alternatively, we employed Gaussian process regression to estimate the gaze intersection directly from the input data itself. Our evaluation of both methods indicates that a regression approach can deliver comparable results to a geometric approach. The regression approach is favored, given that it has the potential for further optimization, provides confidence bounds for its gaze estimates and offers greater flexibility in its implementation. Open-source software for the methods reported here is also provided for user implementation.
dc.language: English
dc.language.iso: en
dc.publisher: FRONTIERS RESEARCH FOUNDATION
dc.subject: MOVEMENT
dc.subject: POTENTIALS
dc.subject: SEARCH
dc.subject: COIL
dc.title: A comparison of geometric- and regression-based mobile gaze-tracking
dc.type: Article
dc.contributor.affiliatedAuthor: Buelthoff, Heinrich H.
dc.identifier.doi: 10.3389/fnhum.2014.00200
dc.identifier.scopusid: 2-s2.0-84898722207
dc.identifier.wosid: 000333919700001
dc.identifier.bibliographicCitation: FRONTIERS IN HUMAN NEUROSCIENCE, v.8
dc.relation.isPartOf: FRONTIERS IN HUMAN NEUROSCIENCE
dc.citation.title: FRONTIERS IN HUMAN NEUROSCIENCE
dc.citation.volume: 8
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Neurosciences & Neurology
dc.relation.journalResearchArea: Psychology
dc.relation.journalWebOfScienceCategory: Neurosciences
dc.relation.journalWebOfScienceCategory: Psychology
dc.subject.keywordPlus: MOVEMENT
dc.subject.keywordPlus: POTENTIALS
dc.subject.keywordPlus: SEARCH
dc.subject.keywordPlus: COIL
dc.subject.keywordAuthor: calibration method
dc.subject.keywordAuthor: gaze measurement
dc.subject.keywordAuthor: eye tracking
dc.subject.keywordAuthor: eye movement
dc.subject.keywordAuthor: active vision
dc.subject.keywordAuthor: gaussian processes
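
Note on the geometric approach described in the abstract: a world gaze ray is obtained by rotating the eye-in-head direction (from the head-mounted eyetracker) into world coordinates using the head pose (from the body-motion tracker), and that ray is then intersected with the plane of the display. The following Python sketch is only a minimal illustration of such a ray-plane intersection under assumed conventions (unit vectors, a 3x3 head rotation matrix); it is not the authors' released open-source software, and all function and variable names are hypothetical.

import numpy as np

def gaze_on_display(eye_dir_head, head_rot, head_pos, plane_point, plane_normal):
    """Return the world-space point where the gaze ray meets the display plane.

    eye_dir_head : unit gaze direction in head coordinates (eyetracker output)
    head_rot     : 3x3 head-to-world rotation matrix (motion-tracker output)
    head_pos     : eye/head position in world coordinates
    plane_point  : any point lying on the display plane
    plane_normal : unit normal of the display plane
    """
    # Rotate the eye-in-head direction into world coordinates.
    gaze_dir_world = head_rot @ eye_dir_head
    # Solve head_pos + t * gaze_dir_world on the plane (ray-plane intersection).
    denom = gaze_dir_world @ plane_normal
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the display
    t = ((plane_point - head_pos) @ plane_normal) / denom
    if t < 0:
        return None  # display lies behind the observer
    return head_pos + t * gaze_dir_world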
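The regression alternative instead maps the raw tracker readings directly to on-screen coordinates with Gaussian process regression, whose predictive variance can serve as the confidence bound mentioned in the abstract. Below is an illustrative sketch assuming scikit-learn and placeholder calibration data; the paper's actual input features, kernel choice, and implementation are not reproduced here.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# X: tracker readings per calibration sample (e.g., pupil position and head pose),
# Y: known on-screen target coordinates (x, y) shown during calibration.
# Both are random placeholders here, standing in for real calibration data.
X = np.random.rand(200, 8)
Y = np.random.rand(200, 2)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X, Y)

# Predict gaze points for new tracker readings; the returned standard deviation
# provides a per-sample confidence estimate alongside the predicted coordinates.
X_new = np.random.rand(5, 8)
gaze_pred, gaze_std = gp.predict(X_new, return_std=True)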
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Brain and Cognitive Engineering > 1. Journal Articles
