On-Device Few-Shot Personalization for Real-Time Gaze Estimation

Junfeng He, Khoi Pham, Nachiappan Valliappan, Pingmei Xu, Chase Roberts, Dmitry Lagun, Vidhya Navalpakkam; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


Building fast and accurate gaze estimation models without additional specialized hardware is a hard problem. In this paper, we present on-device few-shot personalization methods for 2D gaze estimation. The proposed supervised method achieves better accuracy using as few as 2-5 calibration points per user than prior methods that require more than 13 calibration points. In addition, we propose an unsupervised personalization method which uses only unlabeled facial images to improve gaze estimation accuracy. Our best personalized model achieves 24-26% better accuracy (measured by mean error) on phones compared to the state-of-the-art using <=5 calibration points per user. It is also computationally efficient, requiring 20x fewer FLOPs than prior methods. This unlocks a variety of important real-world applications such as using gaze for accessibility, gaming, and human-computer interaction while running entirely on-device in real-time.
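To make the calibration idea concrete, the sketch below shows one common way few-shot gaze personalization can be set up: a frozen base model maps a face image to a feature embedding, and a tiny per-user regression head is fit on the handful of labeled calibration points to map embeddings to on-screen (x, y) gaze. This is a minimal illustration under those assumptions, not the authors' actual pipeline; names such as `fit_personal_head` and the ridge-regression choice are hypothetical placeholders.

```python
# Hypothetical sketch: per-user personalization head fit on a few
# calibration points, on top of a frozen base gaze model's embeddings.
import numpy as np

def fit_personal_head(embeddings, gaze_xy, l2=1.0):
    """Fit a per-user linear head embedding -> (x, y) with ridge regularization.

    embeddings: (k, d) base-model features for k calibration points (k can be 2-5)
    gaze_xy:    (k, 2) ground-truth screen coordinates for those points
    """
    k, d = embeddings.shape
    X = np.hstack([embeddings, np.ones((k, 1))])   # append a bias column
    A = X.T @ X + l2 * np.eye(d + 1)               # ridge-regularized normal equations
    W = np.linalg.solve(A, X.T @ gaze_xy)          # (d + 1, 2) personalized weights
    return W

def predict_gaze(W, embedding):
    """Predict (x, y) gaze for one embedding with the personalized head."""
    return np.append(embedding, 1.0) @ W

# Example with random stand-ins for the base model's embeddings:
rng = np.random.default_rng(0)
calib_feats = rng.normal(size=(5, 64))   # 5 calibration points, 64-d features
calib_gaze = rng.uniform(size=(5, 2))    # normalized screen coordinates
W = fit_personal_head(calib_feats, calib_gaze)
print(predict_gaze(W, calib_feats[0]))
```

Because the per-user head is just a small linear solve over precomputed embeddings, fitting and inference are cheap enough to run on-device; the heavy lifting stays in the shared base model.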

Related Material


[bibtex]
@InProceedings{He_2019_ICCV,
author = {He, Junfeng and Pham, Khoi and Valliappan, Nachiappan and Xu, Pingmei and Roberts, Chase and Lagun, Dmitry and Navalpakkam, Vidhya},
title = {On-Device Few-Shot Personalization for Real-Time Gaze Estimation},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}