Cloth-Changing Person Re-Identification With Self-Attention

Vaibhav Bansal, Gian Luca Foresti, Niki Martinel; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) Workshops, 2022, pp. 602-610

Abstract


The standard person re-identification (ReID) problem rests on the assumption that a target person's clothing remains constant over long periods. This assumption breaks down in real-world deployments. In addition, most existing ReID methods rely on CNN-based networks and have found limited success because CNNs can exploit only local dependencies and lose information through downsampling operations. In this paper, we focus on the more challenging, realistic scenario of long-term cloth-changing ReID (CC-ReID). To address the CC-ReID problem, we aim to learn robust, unique feature representations that are invariant to clothing changes. To overcome the limitations of CNNs, we propose a Vision-Transformer-based framework. We further propose to exploit unique soft-biometric discriminative cues, such as gait features, and pair them with the ViT feature representation, allowing the model to capture the long-range structural and contextual relationships that are crucial for re-identification in the long-term scenario. To evaluate the proposed approach, we perform experiments on two recent CC-ReID datasets, PRCC and LTCC. The experimental results show that the proposed approach achieves state-of-the-art results on the CC-ReID task.
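The abstract describes pairing a ViT appearance embedding with a clothing-invariant gait descriptor before matching. The sketch below is not the authors' implementation; it assumes (hypothetically) a 768-d ViT feature and a 256-d gait feature, fuses them by simple concatenation with L2 normalization, and ranks a gallery by cosine similarity, which for unit-norm vectors reduces to a dot product.

```python
import numpy as np

def fuse_features(vit_feat: np.ndarray, gait_feat: np.ndarray) -> np.ndarray:
    """Concatenate appearance and gait embeddings, then L2-normalize.

    Concatenation is an illustrative fusion choice, not necessarily
    the scheme used in the paper.
    """
    fused = np.concatenate([vit_feat, gait_feat])
    return fused / np.linalg.norm(fused)

def rank_gallery(query: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Return gallery indices sorted by descending cosine similarity.

    Assumes query and each gallery row are already L2-normalized,
    so the dot product equals cosine similarity.
    """
    sims = gallery @ query
    return np.argsort(-sims)

# Toy example with random features standing in for real embeddings.
rng = np.random.default_rng(0)
query = fuse_features(rng.standard_normal(768), rng.standard_normal(256))
gallery = np.stack([
    fuse_features(rng.standard_normal(768), rng.standard_normal(256))
    for _ in range(5)
])
ranking = rank_gallery(query, gallery)
```

Because the gait component is insensitive to clothing, identities that change clothes between query and gallery can still score highly on the gait portion of the fused vector.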

Related Material


[bibtex]
@InProceedings{Bansal_2022_WACV,
    author    = {Bansal, Vaibhav and Foresti, Gian Luca and Martinel, Niki},
    title     = {Cloth-Changing Person Re-Identification With Self-Attention},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) Workshops},
    month     = {January},
    year      = {2022},
    pages     = {602-610}
}