Multi-View Complementary Hash Tables for Nearest Neighbor Search

Xianglong Liu, Lei Huang, Cheng Deng, Jiwen Lu, Bo Lang; The IEEE International Conference on Computer Vision (ICCV), 2015, pp. 1107-1115


Recent years have witnessed the success of hashing techniques in fast nearest neighbor search. In practice, many applications (e.g., visual search, object detection, image matching, etc.) have enjoyed the benefits of complementary hash tables and information fusion over multiple views. However, most prior research has focused on compact hash code learning; little work studies how to build multiple complementary hash tables, let alone how to adaptively integrate information stemming from multiple views. In this paper we present a novel multi-view complementary hash table method that learns complementary hash tables from data with multiple views. For a single multi-view table, using exemplar-based feature fusion, we approximate the inherent data similarities with a low-rank matrix and learn discriminative hash functions efficiently. To build complementary tables while maintaining scalable training and fast out-of-sample extension, an exemplar reweighting scheme is introduced to update the induced low-rank similarity in a sequential table construction framework. This brings mutual benefits between tables by placing greater importance on exemplars shared by mis-separated neighbors. Extensive experiments on three large-scale image datasets demonstrate that the proposed method significantly outperforms various naive solutions and state-of-the-art multi-table methods.
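The pipeline sketched in the abstract (exemplar-based low-rank similarity, per-table hash learning, then exemplar reweighting across sequential tables) can be illustrated with a minimal NumPy toy. This is not the paper's actual algorithm: random projections stand in for the learned discriminative hash functions, the RBF exemplar affinities are an assumed Nystrom-style choice, and the neighbor pairs are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def exemplar_embedding(X, exemplars, sigma=1.0):
    """Low-rank factor Z so that the induced similarity is S ~= Z @ Z.T.
    Rows are normalized affinities of each point to the exemplars
    (an assumed RBF/Nystrom-style fusion, not the paper's exact form)."""
    d2 = ((X[:, None, :] - exemplars[None, :, :]) ** 2).sum(-1)
    Z = np.exp(-d2 / (2.0 * sigma ** 2))
    return Z / Z.sum(axis=1, keepdims=True)

def hash_table(Z, w, n_bits):
    """Build one table's binary codes from the exemplar-reweighted
    embedding (random projections stand in for learned hash functions)."""
    R = rng.standard_normal((Z.shape[1], n_bits))
    return ((Z * w) @ R > 0).astype(np.uint8)

def reweight(Z, codes, neighbor_pairs, w):
    """Place greater importance on exemplars shared by neighbor pairs
    that the current table mis-separates (i.e., assigns different codes)."""
    w = w.copy()
    for i, j in neighbor_pairs:
        if not np.array_equal(codes[i], codes[j]):
            w += Z[i] * Z[j]  # shared exemplar responsibility
    return w / w.sum() * len(w)

# Toy "multi-view" data: two feature blocks concatenated after scaling.
X = np.hstack([rng.standard_normal((200, 8)),
               0.5 * rng.standard_normal((200, 4))])
exemplars = X[rng.choice(200, size=16, replace=False)]
Z = exemplar_embedding(X, exemplars)
pairs = [(i, i + 1) for i in range(0, 198, 2)]  # assumed neighbor pairs

w = np.ones(16)
tables = []
for _ in range(3):  # sequential complementary table construction
    codes = hash_table(Z, w, n_bits=12)
    tables.append(codes)
    w = reweight(Z, codes, pairs, w)
```

Each later table is steered toward exemplars that the earlier tables' codes handled poorly, which is the complementarity mechanism the abstract describes.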

Related Material

@InProceedings{Liu_2015_ICCV,
  author    = {Liu, Xianglong and Huang, Lei and Deng, Cheng and Lu, Jiwen and Lang, Bo},
  title     = {Multi-View Complementary Hash Tables for Nearest Neighbor Search},
  booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
  month     = {December},
  year      = {2015}
}