Syntharch: Interactive Image Search With Attribute-Conditioned Synthesis

Zac Yu, Adriana Kovashka; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 170-171

Abstract


Interactive systems have been found to be a promising approach for content-based image retrieval, the task of retrieving a specific image from a database based on its content. These systems allow the user to refine the set of results iteratively until the target is reached. To proceed with the search efficiently, conventional methods rely on some shared knowledge between the user and the system, such as semantic visual attributes of the images. These approaches require the images to be semantically labeled and introduce a semantic gap between the two parties' understandings. In this paper, we explore an alternative approach to interactive image search in which feedback is elicited exclusively in visual form, thereby eliminating the semantic gap and allowing a generalized version of the method to operate on unlabeled databases. We present Syntharch, a novel interactive image search approach that uses synthesized images as options for feedback, instead of asking textual questions to gain information about the relative attribute values of the target image. We further demonstrate that by using synthesized images rather than real images retrieved from the database as feedback options, Syntharch causes less confusion for the user. Finally, we establish that our proposed search method performs comparably to or better than the conventional approach.

Related Material


[bibtex]
@InProceedings{Yu_2020_CVPR_Workshops,
author = {Yu, Zac and Kovashka, Adriana},
title = {Syntharch: Interactive Image Search With Attribute-Conditioned Synthesis},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2020},
pages = {170-171}
}