Two-Stream CNNs for Gesture-Based Verification and Identification: Learning User Style

Jonathan Wu, Prakash Ishwar, Janusz Konrad; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2016, pp. 42-50

Abstract


Recently, gestures have been proposed as an alternative biometric modality to traditional biometrics such as face, fingerprint, iris, and gait. As a biometric, a gesture is a short body motion that contains both static anatomical information and dynamic (behavioral) information. We consider two types of gestures: full-body gestures, such as a wave of the arms, and hand gestures, such as a subtle curl of the fingers and palm. Most prior work in this area evaluates gestures in the context of a "password," where each user has a single, chosen gesture motion. In contrast, we aim to learn a user's gesture "style" from a set of training gestures. We use two-stream convolutional neural networks, a form of deep learning, to learn this gesture style. First, we evaluate how well our approach generalizes at test time to gestures or users that were not seen during training. Then, we study the importance of dynamics by suppressing dynamic information during training and testing. We find that our approach outperforms state-of-the-art methods in identification and verification on two biometrics-oriented gesture datasets, one of full-body gestures and one of in-air hand gestures.
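
To make the two-stream idea concrete, below is a minimal sketch, not the authors' architecture, of a two-stream CNN for gesture identification in PyTorch. The input shapes, layer sizes, stream inputs (a single depth/silhouette frame for the static stream, a stack of optical-flow fields for the dynamic stream), and the number of users are all illustrative assumptions.

import torch
import torch.nn as nn

def conv_stream(in_channels: int) -> nn.Sequential:
    # Small convolutional tower; both streams share this structure.
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=5, stride=2), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Flatten(),
    )

class TwoStreamGestureNet(nn.Module):
    def __init__(self, num_users: int, flow_stack: int = 10):
        super().__init__()
        # Static (anatomical) cues from one frame; dynamic (behavioral) cues
        # from a stack of horizontal/vertical optical-flow fields.
        self.spatial = conv_stream(in_channels=1)
        self.temporal = conv_stream(in_channels=2 * flow_stack)
        # 64 * 13 * 13 per stream corresponds to the assumed 120x120 inputs.
        self.fc = nn.Sequential(
            nn.Linear(2 * 64 * 13 * 13, 256), nn.ReLU(),
            nn.Linear(256, num_users),  # identification head (user logits)
        )

    def forward(self, frame, flow):
        # Late fusion: concatenate the two stream features, then classify.
        fused = torch.cat([self.spatial(frame), self.temporal(flow)], dim=1)
        return self.fc(fused)

# Usage: one 120x120 frame and a stack of 10 flow fields (2 channels each).
net = TwoStreamGestureNet(num_users=40)
logits = net(torch.randn(1, 1, 120, 120), torch.randn(1, 20, 120, 120))

For identification, the logits would be fed to a softmax over enrolled users; for verification, the penultimate features could instead be compared against an enrolled template, which is one common way to adapt such a network to both tasks.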

Related Material


[bibtex]
@InProceedings{Wu_2016_CVPR_Workshops,
author = {Wu, Jonathan and Ishwar, Prakash and Konrad, Janusz},
title = {Two-Stream CNNs for Gesture-Based Verification and Identification: Learning User Style},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2016}
}