Weighted Contrastive Hashing

Jiaguo Yu, Huming Qiu, Dubing Chen, Haofeng Zhang; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 3861-3876

Abstract


The development of unsupervised hashing has been advanced by the recent popular contrastive learning paradigm. However, previous contrastive learning-based works have been hampered by (1) insufficient data similarity mining based on global-only image representations, and (2) the hash code semantic loss caused by data augmentation. In this paper, we propose a novel method, namely Weighted Contrastive Hashing (WCH), to take a step towards solving these two problems. We introduce a novel mutual attention module to alleviate the information asymmetry in network features caused by the image structure removed during augmentation. Furthermore, we explore the fine-grained semantic relations between images, i.e., we divide images into multiple patches and calculate similarities between patches. The aggregated similarities, which reflect deep image relations, are distilled with a distillation loss to facilitate hash code learning and thus obtain better retrieval performance. Extensive experiments show that the proposed WCH model significantly outperforms existing unsupervised hashing methods on three benchmark datasets.
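
The following is a minimal sketch of the two ideas highlighted in the abstract, patch-level similarity mining and a weighted contrastive objective on hash codes. It assumes a ViT-style backbone that yields per-patch features for two augmented views and a hash head producing continuous codes; the aggregation rule (mean of per-patch maxima), the tanh relaxation, and all function names are illustrative assumptions, not the authors' released implementation.

# Illustrative sketch only -- not the WCH release. Assumes per-patch features
# from two augmented views and continuous hash codes before binarization.
import torch
import torch.nn.functional as F

def patch_similarity_weights(patches_a, patches_b, tau=0.1):
    """Aggregate patch-to-patch cosine similarities between two views into one
    image-level weight per pair (a stand-in for fine-grained similarity mining;
    the mean-of-max aggregation here is an assumption)."""
    # patches_*: (B, P, D) per-patch features for each view
    pa = F.normalize(patches_a, dim=-1)
    pb = F.normalize(patches_b, dim=-1)
    sim = torch.einsum('bpd,bqd->bpq', pa, pb)      # (B, P, P) patch similarities
    agg = sim.max(dim=-1).values.mean(dim=-1)       # (B,) aggregated similarity
    return torch.softmax(agg / tau, dim=0) * agg.numel()  # weights with mean ~1

def weighted_contrastive_loss(codes_a, codes_b, weights, temperature=0.2):
    """InfoNCE-style loss on relaxed hash codes, where per-pair weights
    emphasize pairs judged more similar at the patch level."""
    za = F.normalize(torch.tanh(codes_a), dim=-1)   # tanh as a soft binarization
    zb = F.normalize(torch.tanh(codes_b), dim=-1)
    logits = za @ zb.t() / temperature              # (B, B) code similarities
    targets = torch.arange(za.size(0), device=za.device)
    per_pair = F.cross_entropy(logits, targets, reduction='none')
    return (weights * per_pair).mean()

if __name__ == "__main__":
    B, P, D, K = 8, 49, 256, 64                     # batch, patches, feature dim, code bits
    patches_a, patches_b = torch.randn(B, P, D), torch.randn(B, P, D)
    codes_a, codes_b = torch.randn(B, K), torch.randn(B, K)
    w = patch_similarity_weights(patches_a, patches_b)
    loss = weighted_contrastive_loss(codes_a, codes_b, w.detach())
    print(loss.item())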

Related Material


BibTeX
@InProceedings{Yu_2022_ACCV,
  author    = {Yu, Jiaguo and Qiu, Huming and Chen, Dubing and Zhang, Haofeng},
  title     = {Weighted Contrastive Hashing},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {December},
  year      = {2022},
  pages     = {3861-3876}
}