Generalized Person Re-identification by Locating and Eliminating Domain-Sensitive Features

Wendong Wang, Fengxiang Yang, Zhiming Luo, Shaozi Li; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 3258-3273

Abstract


In this paper, we study domain generalization for person re-identification (re-ID), which uses training data from multiple domains to learn a re-ID model that can be deployed directly to unseen target domains without further fine-tuning. One promising idea is to remove the feature subsets that do not benefit the model's generalization. This can be achieved by muting the feature subsets that receive high back-propagated gradients, as these subsets are easy for the model to overfit. However, this approach ignores the interaction among multiple domains. We therefore propose a novel method that compares the gradients produced by two different training schemes. The first scheme discriminates input data only within their corresponding domain and records the back-propagated temporary gradients on the intermediate features; the second scheme discriminates input data across all domains and records its temporary gradients in the same way. By comparing the temporary gradients of the two schemes, we can separate the domain-generalizable feature subsets from the domain-specific ones, and we mute the latter in subsequent training to force the model to learn domain-generalizable information and improve its generalization. Extensive experiments on four large-scale re-ID benchmarks verify the effectiveness of our method. Code is available at https://github.com/Ssd111/LEDF.git.
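The gradient-comparison idea in the abstract can be illustrated with a minimal PyTorch sketch. This is a toy reconstruction under stated assumptions, not the paper's implementation: the network, head layout, and the top-k ratio rule for picking domain-sensitive channels are all hypothetical simplifications. Scheme A classifies identities within each sample's own domain; scheme B classifies identities over all domains jointly; channels whose gradient magnitude is large under A relative to B are muted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical toy setup: 2 domains, 5 identities per domain.
feat_dim, n_ids_per_domain, n_domains = 16, 5, 2
backbone = nn.Linear(8, feat_dim)  # stand-in for an intermediate feature layer
per_domain_heads = nn.ModuleList(
    [nn.Linear(feat_dim, n_ids_per_domain) for _ in range(n_domains)]
)
global_head = nn.Linear(feat_dim, n_ids_per_domain * n_domains)

x = torch.randn(4, 8)
domain = torch.tensor([0, 0, 1, 1])
ids = torch.tensor([1, 2, 0, 3])               # identity labels within each domain
global_ids = ids + domain * n_ids_per_domain   # identity labels over all domains

feat = backbone(x)

# Scheme A: discriminate identities within each sample's own domain,
# then read the "temporary" gradient on the intermediate features.
feat_a = feat.detach().requires_grad_(True)
loss_a = sum(
    F.cross_entropy(per_domain_heads[d](feat_a[domain == d]), ids[domain == d])
    for d in range(n_domains)
)
grad_a = torch.autograd.grad(loss_a, feat_a)[0].abs().mean(0)

# Scheme B: discriminate identities across all domains jointly.
feat_b = feat.detach().requires_grad_(True)
loss_b = F.cross_entropy(global_head(feat_b), global_ids)
grad_b = torch.autograd.grad(loss_b, feat_b)[0].abs().mean(0)

# Channels with large gradient under A relative to B are treated as
# domain-sensitive and muted (a simple top-k heuristic for illustration).
ratio = grad_a / (grad_b + 1e-8)
mute = torch.zeros(feat_dim)
mute[ratio.topk(4).indices] = 1.0
masked_feat = feat * (1 - mute)   # muted features used in subsequent training
```

In the actual method, the mask would be applied during subsequent training so the model is driven toward the surviving, domain-generalizable channels; the selection rule and where the mask is inserted follow the paper, not this sketch.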

Related Material


[bibtex]
@InProceedings{Wang_2022_ACCV,
  author    = {Wang, Wendong and Yang, Fengxiang and Luo, Zhiming and Li, Shaozi},
  title     = {Generalized Person Re-identification by Locating and Eliminating Domain-Sensitive Features},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {December},
  year      = {2022},
  pages     = {3258-3273}
}