Beyond Comparing Image Pairs: Setwise Active Learning for Relative Attributes

Lucy Liang, Kristen Grauman; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014, pp. 208-215

Abstract

It is useful to automatically compare images based on their visual properties---to predict which image is brighter, more feminine, more blurry, etc. However, comparative models are inherently more costly to train than their classification counterparts. Manually labeling all pairwise comparisons is intractable, so which pairs should a human supervisor compare? We explore active learning strategies for training relative attribute ranking functions, with the goal of requesting human comparisons only where they are most informative. We introduce a novel criterion that requests a partial ordering for a set of examples that minimizes the total rank margin in attribute space, subject to a visual diversity constraint. The setwise criterion helps amortize effort by identifying mutually informative comparisons, and the diversity requirement safeguards against requests a human viewer will find ambiguous. We develop an efficient strategy to search for sets that meet this criterion. On three challenging datasets and in experiments with "live" online annotators, the proposed method outperforms both traditional passive learning and existing active rank learning methods.

Related Material

@InProceedings{Liang_2014_CVPR,
author = {Liang, Lucy and Grauman, Kristen},
title = {Beyond Comparing Image Pairs: Setwise Active Learning for Relative Attributes},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2014},
pages = {208-215}
}