Attribute-Guided Pedestrian Retrieval: Bridging Person Re-ID with Internal Attribute Variability

Yan Huang, Zhang Zhang, Qiang Wu, Yi Zhong, Liang Wang; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 17689-17699

Abstract


In various domains, such as surveillance and smart retail, pedestrian retrieval centering on person re-identification (Re-ID) plays a pivotal role. Existing Re-ID methodologies often overlook subtle internal attribute variations, which are crucial for accurately identifying individuals with changing appearances. In response, our paper introduces the Attribute-Guided Pedestrian Retrieval (AGPR) task, which integrates specified attributes with query images to refine retrieval results. Although there has been progress in attribute-driven image retrieval, a notable gap remains in effectively combining robust Re-ID models with intra-class attribute variations. To bridge this gap, we present the Attribute-Guided Transformer-based Pedestrian Retrieval (ATPR) framework. ATPR merges global ID recognition with local attribute learning, ensuring a cohesive linkage between the two. Furthermore, to handle the complexity of attribute interconnectivity, ATPR organizes attributes into distinct groups and applies both inter-group correlation and intra-group decorrelation regularizations. Extensive experiments on a newly established benchmark built from the RAP dataset demonstrate the effectiveness of ATPR within the AGPR paradigm.
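To make the grouping idea concrete, the sketch below illustrates one plausible way to realize intra-group decorrelation and inter-group correlation regularizers over per-attribute embeddings. It is an assumption for illustration only, not the paper's implementation: the function name group_regularizers, the grouping indices, the cosine-similarity loss forms, and any weighting are all hypothetical.

import torch
import torch.nn.functional as F

def group_regularizers(attr_emb: torch.Tensor, groups: list[list[int]]):
    """Illustrative sketch (assumed form, not the paper's exact losses).

    attr_emb: (num_attributes, dim) per-attribute feature vectors.
    groups:   index lists, e.g. [[0, 1, 2], [3, 4], ...] (hypothetical grouping).
    """
    emb = F.normalize(attr_emb, dim=-1)      # work in cosine-similarity geometry
    sim = emb @ emb.t()                      # (A, A) pairwise similarity matrix

    # Intra-group decorrelation: penalize similarity between distinct
    # attributes inside the same group (e.g. mutually exclusive colors).
    intra = attr_emb.new_zeros(())
    for g in groups:
        idx = torch.tensor(g)
        block = sim[idx][:, idx]
        off_diag = block - torch.diag(torch.diag(block))
        intra = intra + off_diag.abs().mean()

    # Inter-group correlation: encourage group-level prototypes
    # (mean embeddings) of different groups to remain related.
    protos = torch.stack([emb[torch.tensor(g)].mean(0) for g in groups])
    protos = F.normalize(protos, dim=-1)
    proto_sim = protos @ protos.t()
    n = proto_sim.size(0)
    inter = -(proto_sim.sum() - proto_sim.diag().sum()) / (n * (n - 1))

    return intra, inter

In training, such terms would presumably be added to the ID and attribute classification objectives with tuned weights; the exact formulation used by ATPR may differ.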

Related Material


[bibtex]
@InProceedings{Huang_2024_CVPR,
    author    = {Huang, Yan and Zhang, Zhang and Wu, Qiang and Zhong, Yi and Wang, Liang},
    title     = {Attribute-Guided Pedestrian Retrieval: Bridging Person Re-ID with Internal Attribute Variability},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {17689-17699}
}