Long-Term Cloth-Changing Person Re-identification
Abstract
Person re-identification (Re-ID) aims to match a target person across camera views at different locations and times. Existing Re-ID studies focus on the short-term cloth-consistent setting, under which a person re-appears in different camera views wearing the same outfit. Consequently, the discriminative feature representations learned by existing deep Re-ID models are dominated by the visual appearance of clothing. In this work, we focus on a much more difficult yet practical setting where person matching is conducted over long durations, e.g., days or months, and therefore inevitably faces the new challenge of clothing changes. This problem, termed Long-Term Cloth-Changing (LTCC) Re-ID, is much understudied due to the lack of large-scale datasets. The first contribution of this work is a new LTCC dataset containing people captured over a long period of time with frequent clothing changes. As a second contribution, we propose a novel Re-ID method specifically designed to address the cloth-changing challenge. Specifically, we consider that under clothing changes, soft biometrics such as body shape are more reliable. We therefore introduce a shape embedding module as well as a cloth-elimination shape-distillation module, aiming to eliminate the now unreliable clothing appearance features and focus on body shape information. Extensive experiments show that the proposed model achieves superior performance on the new LTCC dataset. The dataset is available on the project website: https://naiq.github.io/LTCC_Perosn_ReID.html.
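To make the two named modules concrete, below is a minimal, hypothetical PyTorch sketch of how a shape embedding module and a cloth-elimination shape-distillation module could be wired together. The keypoint input format, layer sizes, and the channel-gating design are illustrative assumptions made for this sketch; the paper defines the actual architecture.

import torch
import torch.nn as nn

class ShapeEmbedding(nn.Module):
    """Embeds body-shape cues (here assumed to be 2D pose keypoints)
    into a compact feature vector."""
    def __init__(self, num_keypoints=17, dim=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_keypoints * 2, dim),
            nn.ReLU(inplace=True),
            nn.Linear(dim, dim),
        )

    def forward(self, keypoints):              # keypoints: (B, num_keypoints, 2)
        return self.mlp(keypoints.flatten(1))  # shape embedding: (B, dim)

class ClothEliminationShapeDistillation(nn.Module):
    """Illustrative design only: uses the shape embedding to compute a
    channel-wise gate that suppresses cloth-dependent appearance channels."""
    def __init__(self, feat_dim=2048, shape_dim=128):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(shape_dim, feat_dim),
            nn.Sigmoid(),                      # gate values in [0, 1]
        )

    def forward(self, app_feat, shape_emb):    # app_feat: (B, feat_dim)
        return app_feat * self.gate(shape_emb) # cloth-suppressed identity feature

# Usage sketch: distill a cloth-insensitive feature from a backbone output.
kps = torch.randn(4, 17, 2)                    # hypothetical pose keypoints
backbone_feat = torch.randn(4, 2048)           # hypothetical appearance feature
shape = ShapeEmbedding()(kps)
ident_feat = ClothEliminationShapeDistillation()(backbone_feat, shape)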
Related Material
[pdf] [supp] [arXiv] [code]
[bibtex]
@InProceedings{Qian_2020_ACCV,
    author    = {Qian, Xuelin and Wang, Wenxuan and Zhang, Li and Zhu, Fangrui and Fu, Yanwei and Xiang, Tao and Jiang, Yu-Gang and Xue, Xiangyang},
    title     = {Long-Term Cloth-Changing Person Re-identification},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {November},
    year      = {2020}
}