A Pointing Gesture Based Egocentric Interaction System: Dataset, Approach and Application

Yichao Huang, Xiaorui Liu, Xin Zhang, Lianwen Jin; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2016, pp. 16-23

Abstract


Natural hand-based human-device interaction is essential for wearable camera development. This paper presents a solution for pointing gesture based interaction in egocentric vision, together with an application. First, we establish a dataset named EgoFinger, focusing on the pointing gesture in egocentric vision. We discuss the dataset collection in detail and provide an in-depth analysis of the dataset, which shows that it covers substantial data samples across varied environments and dynamic hand shapes. Furthermore, we propose a two-stage framework consisting of Faster R-CNN based hand detection followed by dual-target fingertip detection. Compared with state-of-the-art tracking and detection algorithms, it performs best. Finally, using the fingertip detection result, we design and implement an input system for egocentric vision, Ego-Air-Writing. By treating the fingertip as a pen, a user wearing smart glasses can write characters in the air and interact with the system.
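The two-stage framework described above can be illustrated with a minimal sketch: a first-stage detector proposes a hand bounding box, and a second-stage detector then localizes the fingertip only within the cropped hand region before mapping it back to full-frame coordinates. The detector functions below are hypothetical stand-in stubs for illustration, not the authors' Faster R-CNN models.

```python
from typing import List, Tuple

Box = Tuple[int, int, int, int]   # (x, y, w, h) in full-frame pixel coordinates
Point = Tuple[int, int]           # (x, y) pixel coordinates
Frame = List[List[int]]           # grayscale image as nested lists (placeholder)

def detect_hand(frame: Frame) -> Box:
    """Stage 1 stub: propose a hand bounding box.

    A real system would run a trained detector here; this placeholder
    simply returns the central region of the frame.
    """
    h, w = len(frame), len(frame[0])
    return (w // 4, h // 4, w // 2, h // 2)

def detect_fingertip(crop: Frame) -> Point:
    """Stage 2 stub: locate the fingertip inside the hand crop.

    Returns crop-local coordinates; the placeholder picks a point near
    the top of the crop, where a pointing fingertip often appears.
    """
    ch, cw = len(crop), len(crop[0])
    return (cw // 2, ch // 8)

def locate_fingertip(frame: Frame) -> Point:
    """Run the two stages and map the fingertip back to frame coordinates."""
    x, y, w, h = detect_hand(frame)
    crop = [row[x:x + w] for row in frame[y:y + h]]  # restrict stage 2 to the hand
    fx, fy = detect_fingertip(crop)
    return (x + fx, y + fy)
```

Restricting the second stage to the hand crop is the point of the cascade: the fingertip detector searches a much smaller region, which reduces background distractors compared with detecting the fingertip directly in the full egocentric frame.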

Related Material


[bibtex]
@InProceedings{Huang_2016_CVPR_Workshops,
author = {Huang, Yichao and Liu, Xiaorui and Zhang, Xin and Jin, Lianwen},
title = {A Pointing Gesture Based Egocentric Interaction System: Dataset, Approach and Application},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2016},
pages = {16-23}
}