Calibration-Free Gaze Estimation Using Human Gaze Patterns

Fares Alnajar, Theo Gevers, Roberto Valenti, Sennay Ghebreab; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2013, pp. 137-144

Abstract


We present a novel method to auto-calibrate gaze estimators based on gaze patterns obtained from other viewers. Our method is based on the observation that the gaze patterns of humans are indicative of where a new viewer will look [12]. When a new viewer looks at a stimulus, we first estimate a topology of gaze points (initial gaze points). Next, these points are transformed so that they match the gaze patterns of other humans to find the correct gaze points. In a flexible, uncalibrated setup with a web camera and no chin rest, the proposed method was tested on ten subjects and ten images. The method estimates the gaze points after a viewer looks at a stimulus for a few seconds, with an average accuracy of 4.3°. Although the reported performance is lower than what could be achieved with dedicated hardware or a calibrated setup, the proposed method still provides sufficient accuracy to trace the viewer's attention. This is promising considering that auto-calibration is done in a flexible setup, without the use of a chin rest, and based only on a few seconds of gaze initialization data. To the best of our knowledge, this is the first work to use human gaze patterns to auto-calibrate gaze estimators.
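The sketch below illustrates the general idea described in the abstract, not the authors' actual algorithm: a new viewer's initial, uncalibrated gaze points are aligned to a gaze-density map built from other viewers' fixations by searching over a 2D similarity transform. The density-map construction (Gaussian smoothing of prior fixations) and the Nelder-Mead optimizer are illustrative assumptions.

```python
# Hedged sketch: auto-calibration by matching initial gaze points to other
# viewers' gaze patterns. All function names and parameters are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates
from scipy.optimize import minimize

def gaze_density_map(fixations, shape, sigma=25.0):
    """Accumulate other viewers' fixations (x, y) into a smoothed density map."""
    h, w = shape
    density = np.zeros(shape, dtype=np.float64)
    for x, y in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < h and 0 <= xi < w:
            density[yi, xi] += 1.0
    density = gaussian_filter(density, sigma)
    return density / (density.max() + 1e-12)

def apply_transform(points, params):
    """Apply scale, rotation, and translation to an (N, 2) array of gaze points."""
    s, theta, tx, ty = params
    c, si = np.cos(theta), np.sin(theta)
    R = np.array([[c, -si], [si, c]])
    return s * points @ R.T + np.array([tx, ty])

def auto_calibrate(initial_points, density):
    """Find the transform placing the initial gaze points on high-density regions."""
    def cost(params):
        pts = apply_transform(initial_points, params)
        # Sample the density map at the transformed points (rows are y, cols are x).
        vals = map_coordinates(density, [pts[:, 1], pts[:, 0]], order=1, mode='constant')
        return -vals.mean()
    x0 = np.array([1.0, 0.0, 0.0, 0.0])  # start from the identity transform
    res = minimize(cost, x0, method='Nelder-Mead')
    return res.x

if __name__ == "__main__":
    # Synthetic example with made-up coordinates, for illustration only.
    rng = np.random.default_rng(0)
    prior_fix = rng.uniform([100, 100], [540, 380], size=(200, 2))  # other viewers
    init_pts = rng.uniform([150, 150], [500, 350], size=(30, 2))    # new viewer, uncalibrated
    density = gaze_density_map(prior_fix, shape=(480, 640))
    params = auto_calibrate(init_pts, density)
    print("estimated (scale, rotation, tx, ty):", params)
```

In this toy setup, the recovered parameters describe how the uncalibrated gaze topology should be warped so that it falls on the regions other viewers attended to; the paper itself fixes the gaze points by matching against such human gaze patterns rather than against a saliency model.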

Related Material


[bibtex]
@InProceedings{Alnajar_2013_ICCV,
author = {Alnajar, Fares and Gevers, Theo and Valenti, Roberto and Ghebreab, Sennay},
title = {Calibration-Free Gaze Estimation Using Human Gaze Patterns},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}