Unsupervised Deep Generative Adversarial Hashing Network
Kamran Ghasedi Dizaji, Feng Zheng, Najmeh Sadoughi, Yanhua Yang, Cheng Deng, Heng Huang; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 3664-3673
Abstract
Unsupervised deep hash functions have not shown satisfactory improvements over shallow alternatives, and they usually require supervised pretraining to avoid getting stuck in bad local minima. In this paper, we propose a deep unsupervised hash function, called HashGAN, which outperforms unsupervised hashing models by significant margins without any supervised pretraining. HashGAN consists of three networks: a generator, a discriminator, and an encoder. By sharing the parameters of the encoder and discriminator, we benefit from the adversarial loss as a data-dependent regularizer when training our deep hash function. Moreover, a novel loss function is introduced for hashing real images, encouraging hash bits with minimum entropy, uniform frequency, consistency, and independence. Furthermore, we train the generator conditioned on random binary inputs and also use these binary variables in a triplet ranking loss to improve the hash codes. In our experiments, HashGAN outperforms previous unsupervised hash functions in image retrieval and achieves state-of-the-art performance in image clustering. We also provide an ablation study showing the contribution of each component of our loss function.
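The abstract names the loss components but not their formulas. Below is a minimal PyTorch sketch of one plausible formulation of the minimum-entropy, uniform-frequency, and independence terms, plus a triplet ranking loss driven by the generator's random binary inputs; the function names, the negative-sampling scheme, and the equal weighting of terms are illustrative assumptions, not the paper's exact definitions.

    import torch
    import torch.nn.functional as F

    def hashing_loss(b, eps=1e-8):
        # b: (N, K) relaxed hash bits in (0, 1), e.g. sigmoid outputs of the encoder.
        # Minimum entropy: push each bit toward 0 or 1 so codes are near-binary.
        entropy = -(b * (b + eps).log() + (1 - b) * (1 - b + eps).log()).mean()
        # Uniform frequency: each bit should fire for roughly half the batch.
        freq = ((b.mean(dim=0) - 0.5) ** 2).mean()
        # Independence: drive off-diagonal covariance between bits toward zero.
        centered = b - b.mean(dim=0, keepdim=True)
        cov = centered.t() @ centered / b.size(0)
        indep = ((cov - torch.diag(torch.diag(cov))) ** 2).mean()
        # Equal weighting is an assumption; the paper may balance terms differently.
        return entropy + freq + indep

    def triplet_ranking_loss(code_fake, z, margin=0.5):
        # code_fake: (N, K) encoder codes for generated images;
        # z: (N, K) the random binary vectors that conditioned the generator.
        # Pull each generated image's code toward its own binary input and away
        # from another sample's; rolling the batch to obtain negatives is just
        # one convenient choice.
        negatives = z.roll(shifts=1, dims=0)
        return F.triplet_margin_loss(code_fake, z, negatives, margin=margin)

A consistency term, e.g. matching the codes of an image and a transformed copy of it, would cover the fourth property listed in the abstract but is omitted here for brevity.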
Related Material
[pdf]
[video]
[bibtex]
@InProceedings{Dizaji_2018_CVPR,
author = {Dizaji, Kamran Ghasedi and Zheng, Feng and Sadoughi, Najmeh and Yang, Yanhua and Deng, Cheng and Huang, Heng},
title = {Unsupervised Deep Generative Adversarial Hashing Network},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}