From Sparse to Dense: Semantic Graph Evolutionary Hashing for Unsupervised Cross-Modal Retrieval

Yang Zhao, Jiaguo Yu, Shengbin Liao, Zheng Zhang, Haofeng Zhang; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 195-211

Abstract


In recent years, cross-modal hashing has attracted increasing attention due to its fast retrieval speed and low storage requirements. However, labeled datasets are scarce in real applications, and existing unsupervised cross-modal hashing algorithms usually rely on heuristic geometric priors as semantics, which introduces serious deviations because similarity scores computed from the original features cannot reliably represent the relationships among instances. In this paper, we study unsupervised deep cross-modal hash retrieval and propose a novel Semantic Graph Evolutionary Hashing (SGEH) method to address this problem. The key novelty of SGEH is its evolutionary affinity graph construction: a sparse similarity graph built from clustering results is fused with the affinity information of a code-driven graph over the intrinsic data and subsequently evolves into a dense hybrid semantic graph, which constrains hash code learning to produce more discriminative codes. Moreover, mini-batches are sampled from the edge set rather than the vertex set, better exploiting the original spatial information in the sparse graph. Experiments on four benchmark datasets demonstrate the superiority of our framework over state-of-the-art unsupervised cross-modal retrieval methods.
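As a rough illustration of the ideas sketched in the abstract only (the paper and its released code define the actual SGEH method), the snippet below shows one plausible way to fuse a sparse feature-based kNN graph with a code-driven affinity graph into a dense hybrid semantic graph, and to sample mini-batches from the edge set of the sparse graph instead of from individual vertices. All function names, the fusion weight alpha, and the toy data are hypothetical choices for illustration, not the authors' implementation.

# Illustrative sketch only; hypothetical names and parameters throughout.
import numpy as np

def knn_graph(features, k=10):
    """Sparse affinity graph: keep only the k nearest neighbours per instance."""
    feats = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = feats @ feats.T                        # cosine similarity
    graph = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, 1:k + 1]   # skip self-similarity
    rows = np.arange(sim.shape[0])[:, None]
    graph[rows, idx] = sim[rows, idx]
    return np.maximum(graph, graph.T)            # symmetrise

def code_affinity(hash_codes):
    """Dense affinity computed from (relaxed) hash codes in {-1, +1}^L."""
    code_len = hash_codes.shape[1]
    return hash_codes @ hash_codes.T / code_len  # values in [-1, 1]

def hybrid_graph(sparse_graph, code_graph, alpha=0.6):
    """Evolve the sparse graph into a dense hybrid semantic graph (alpha is a guess)."""
    return alpha * sparse_graph + (1.0 - alpha) * code_graph

def sample_edge_batch(graph, batch_size, rng):
    """Draw a mini-batch of (i, j, similarity) triples from the edge set."""
    rows, cols = np.nonzero(np.triu(graph, k=1))
    pick = rng.choice(len(rows), size=min(batch_size, len(rows)), replace=False)
    return [(rows[p], cols[p], graph[rows[p], cols[p]]) for p in pick]

# Toy usage with random image/text features and random hash codes.
rng = np.random.default_rng(0)
img_feats = rng.standard_normal((100, 512))
txt_feats = rng.standard_normal((100, 300))
codes = np.sign(rng.standard_normal((100, 64)))

fused_feats = np.concatenate(
    [img_feats / np.linalg.norm(img_feats, axis=1, keepdims=True),
     txt_feats / np.linalg.norm(txt_feats, axis=1, keepdims=True)], axis=1)
sparse = knn_graph(fused_feats)
hybrid = hybrid_graph(sparse, code_affinity(codes))
batch = sample_edge_batch(sparse, batch_size=32, rng=rng)  # edge-based batching

In this sketch the hybrid graph would supervise hash code learning while batches are drawn from edges of the sparse graph, mirroring the sparse-to-dense evolution and edge-based batching described in the abstract.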

Related Material


[pdf] [supp] [code]
[bibtex]
@InProceedings{Zhao_2022_ACCV,
    author    = {Zhao, Yang and Yu, Jiaguo and Liao, Shengbin and Zhang, Zheng and Zhang, Haofeng},
    title     = {From Sparse to Dense: Semantic Graph Evolutionary Hashing for Unsupervised Cross-Modal Retrieval},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {195-211}
}