Deep Position-Aware Hashing for Semantic Continuous Image Retrieval

Ruikui Wang, Ruiping Wang, Shishi Qiao, Shiguang Shan, Xilin Chen; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2020, pp. 2493-2502

Abstract


Preserving semantic similarity is one of the most important goals of hashing. Most existing deep hashing methods employ pairs or triplets of samples during training, which only consider the semantic similarity within a mini-batch and capture only local positional relationships in Hamming space, leading to intermittent preservation of semantic similarity. In this paper, we propose Deep Position-Aware Hashing (DPAH) to ensure continuous semantic similarity in Hamming space by modeling the global positional relationship. Specifically, we introduce a set of learnable class centers as global proxies to represent global information and generate discriminative binary codes by constraining the distances between data points and the class centers. In addition, to reduce the information loss caused by relaxing the binary codes to real values during optimization, we propose a kurtosis loss (KT loss) that encourages the distribution of the real-valued features before thresholding to be double-peaked, making the real-valued features more binary-like. Comprehensive experiments on three datasets show that our DPAH outperforms state-of-the-art methods.
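
The minimal PyTorch sketch below is not taken from the paper; it only illustrates the two ideas described in the abstract, assuming a distance-to-class-center term and a kurtosis-based regularizer whose exact forms, names, and weights are hypothetical.

import torch
import torch.nn.functional as F

def center_loss(features, labels, centers):
    # Pull each relaxed (real-valued) code toward its learnable class center,
    # i.e., the "global proxy" idea; the paper's exact distance constraint is assumed.
    return F.mse_loss(features, centers[labels])

def kurtosis_loss(features, eps=1e-8):
    # Illustrative KT-style term: minimizing the sample kurtosis of the
    # pre-threshold features favors a double-peaked, binary-like distribution.
    x = features.flatten()
    mu = x.mean()
    var = x.var(unbiased=False) + eps
    return ((x - mu) ** 4).mean() / var ** 2

# Hypothetical usage with a batch of 32 relaxed 48-bit codes and 10 classes
codes = torch.tanh(torch.randn(32, 48))
labels = torch.randint(0, 10, (32,))
centers = torch.nn.Parameter(torch.randn(10, 48))
loss = center_loss(codes, labels, centers) + 0.1 * kurtosis_loss(codes)

Here minimizing kurtosis pushes the pre-threshold values away from a single central peak toward two modes near the binarization thresholds, which is one plausible way to make the relaxed codes more binary-like.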

Related Material


[bibtex]
@InProceedings{Wang_2020_WACV,
author = {Wang, Ruikui and Wang, Ruiping and Qiao, Shishi and Shan, Shiguang and Chen, Xilin},
title = {Deep Position-Aware Hashing for Semantic Continuous Image Retrieval},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {March},
year = {2020}
}