@InProceedings{Lan_2024_ACCV,
  author    = {Lan, Enfan and Hu, Zhengxi and Liu, Jingtai},
  title     = {UAGE: A Supervised Contrastive Method for Unconstrained Adaptive Gaze Estimation},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {December},
  year      = {2024},
  pages     = {2231-2247}
}
UAGE: A Supervised Contrastive Method for Unconstrained Adaptive Gaze Estimation
Abstract
Gaze estimation, which involves perceiving human gaze directions, is the foundation of gaze analysis. It provides crucial clues for understanding human attention and intention. However, most existing methods are designed for constrained environments, which leads to a significant performance drop in unconstrained practical applications. In this work, we propose a supervised contrastive method for Unconstrained Adaptive Gaze Estimation (UAGE), which consists of an unconstrained gaze estimation method and a Gaze-guided Contrastive Domain Adaptation (GCDA) framework. Our method leverages the entire human body state and the uncertainty of gaze behaviors to robustly estimate gaze in unconstrained environments, rather than relying solely on head states. Additionally, we employ the GCDA framework to adapt the model to new domains, thereby improving its generalization ability. Experimental results show that our UAGE method achieves state-of-the-art within-domain performance on the unconstrained GAFA dataset and reduces the angular error by 14% compared to the baseline in cross-domain gaze estimation, with GAFA as the source domain and Gaze360 as the target domain. The code is available at https://github.com/youthhfor/UAGE.git.
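The abstract reports performance in terms of angular error, the standard evaluation metric in gaze estimation: the angle between the predicted and ground-truth 3D gaze direction vectors. A minimal sketch of how this metric is typically computed (illustrative only, not the authors' released code):

```python
import numpy as np

def angular_error(pred, gt):
    """Angular error in degrees between predicted and ground-truth
    3D gaze direction vectors (each of shape (..., 3))."""
    # Normalize both directions to unit length.
    pred = pred / np.linalg.norm(pred, axis=-1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=-1, keepdims=True)
    # Cosine of the angle between them, clipped for numerical safety.
    cos = np.clip(np.sum(pred * gt, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))
```

For example, orthogonal gaze vectors yield an error of 90 degrees, and identical vectors yield 0. Cross-domain results (e.g. GAFA to Gaze360) average this error over the target-domain test set.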