Gait Recognition by Deformable Registration

Yasushi Makihara, Daisuke Adachi, Chi Xu, Yasushi Yagi; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2018, pp. 561-571


This paper describes a gait recognition method that is robust against intra-subject posture changes. A person's walking posture sometimes changes, e.g., when looking down at a smartphone or carrying a heavy object, which enlarges intra-subject variation and consequently makes gait recognition difficult. We therefore introduce a deformable registration model to mitigate intra-subject posture changes. More specifically, we represent a deformation field by a set of deformation vectors on lattice-type control points allocated over an image, i.e., by the free-form deformation (FFD) framework. Given a probe and gallery pair, we compute the deformation field so as to minimize the difference between the probe morphed by the deformation field and the gallery, while ensuring the spatial smoothness of the deformation field. We then learn the intra-subject eigen deformation modes from a training set of same-subject pairs (e.g., bending the upper body forward and swinging the arms more), which differ from inter-subject deformation modes (e.g., body shape spread and stride change). Moreover, because the deformable registration serves as a preprocessing step before matching, it can be combined with any type of matching algorithm for gait recognition. Experiments with 1,334 subjects show that the proposed method improves gait recognition accuracy both with and without a state-of-the-art deep learning-based matcher.
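The registration step described above can be illustrated with a toy NumPy sketch. This is not the authors' implementation: it assumes bilinear interpolation of lattice control-point displacements into a dense field, nearest-neighbor resampling of the probe, a sum-of-squared-differences data term with a quadratic smoothness penalty, and eigen modes obtained by SVD of flattened deformation fields. All function names (`ffd_warp`, `registration_cost`, `eigen_modes`) are hypothetical.

```python
import numpy as np

def ffd_warp(image, dx, dy):
    """Warp an image by an FFD field given control-point displacements.

    dx, dy: (gh, gw) displacements on a coarse lattice, bilinearly
    upsampled to a dense per-pixel deformation field (an assumption;
    cubic B-spline bases are also common for FFD).
    """
    H, W = image.shape
    gh, gw = dx.shape
    ys = np.linspace(0, gh - 1, H)   # image rows -> lattice coordinates
    xs = np.linspace(0, gw - 1, W)

    def upsample(grid):
        # Bilinear interpolation of the control-point grid to image size.
        y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
        y1 = np.minimum(y0 + 1, gh - 1); x1 = np.minimum(x0 + 1, gw - 1)
        wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
        return ((1 - wy) * (1 - wx) * grid[np.ix_(y0, x0)]
                + (1 - wy) * wx * grid[np.ix_(y0, x1)]
                + wy * (1 - wx) * grid[np.ix_(y1, x0)]
                + wy * wx * grid[np.ix_(y1, x1)])

    DX, DY = upsample(dx), upsample(dy)
    yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    sy = np.clip(np.round(yy + DY).astype(int), 0, H - 1)
    sx = np.clip(np.round(xx + DX).astype(int), 0, W - 1)
    return image[sy, sx]             # nearest-neighbor resampling

def registration_cost(probe, gallery, dx, dy, lam=0.1):
    """Difference between the morphed probe and the gallery,
    plus a smoothness penalty on neighboring control points."""
    warped = ffd_warp(probe, dx, dy)
    data = np.sum((warped - gallery) ** 2)
    smooth = (np.sum(np.diff(dx, axis=0) ** 2) + np.sum(np.diff(dx, axis=1) ** 2)
              + np.sum(np.diff(dy, axis=0) ** 2) + np.sum(np.diff(dy, axis=1) ** 2))
    return data + lam * smooth

def eigen_modes(fields, k):
    """Top-k eigen deformation modes from training deformation fields
    of shape (n, gh, gw, 2), via SVD of the mean-centered flattened data."""
    X = fields.reshape(len(fields), -1)
    _, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return Vt[:k]                    # rows are orthonormal mode vectors
```

As a quick check, if the gallery is the probe shifted two pixels to the right, a constant lattice displacement of dx = -2 yields a lower `registration_cost` than the zero deformation, since the morphed probe then aligns with the gallery.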

Related Material

@InProceedings{Makihara_2018_CVPR_Workshops,
  author    = {Makihara, Yasushi and Adachi, Daisuke and Xu, Chi and Yagi, Yasushi},
  title     = {Gait Recognition by Deformable Registration},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2018}
}