Implied Feedback: Learning Nuances of User Behavior in Image Search

Devi Parikh, Kristen Grauman; The IEEE International Conference on Computer Vision (ICCV), 2013, pp. 745-752


User feedback helps an image search system refine its relevance predictions, tailoring the search towards the user's preferences. Existing methods simply take feedback at face value: clicking on an image means the user wants things like it; commenting that an image lacks a specific attribute means the user wants things that have it. However, we expect there is actually more information behind the user's literal feedback. In particular, a user's (possibly subconscious) search strategy leads him to comment on certain images rather than others, based on how any of the visible candidate images compare to the desired content. For example, he may be more likely to give negative feedback on an irrelevant image that is relatively close to his target, as opposed to bothering with one that is altogether different. We introduce novel features to capitalize on such implied feedback cues, and learn a ranking function that uses them to improve the system's relevance estimates. We validate the approach with real users searching for shoes, faces, or scenes using two different modes of feedback: binary relevance feedback and relative attributes-based feedback. The results show that retrieval improves significantly when the system accounts for the learned behaviors. We show that the nuances learned are domain-invariant, and useful for both generic user-independent search as well as personalized user-specific search.
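The abstract does not spell out the learning machinery, but the "learn a ranking function that uses them" step can be sketched as a standard pairwise (RankSVM-style) linear ranker trained on preference pairs. Everything below is an illustrative assumption, not the paper's actual method: the feature vectors, the hinge-loss optimizer, and the synthetic "implied feedback" signal (here, the first feature stands in for a cue like closeness to a commented-on image) are all hypothetical.

```python
import numpy as np

def train_pairwise_ranker(pairs, n_features, lr=0.1, epochs=100, reg=0.01):
    """Learn linear ranking weights w from preference pairs.

    `pairs` holds (x_pos, x_neg) feature vectors where x_pos should
    score higher than x_neg. Uses a pairwise hinge loss
    (RankSVM-style) minimized by plain gradient descent.
    """
    w = np.zeros(n_features)
    for _ in range(epochs):
        for x_pos, x_neg in pairs:
            margin = w @ (x_pos - x_neg)
            grad = reg * w                  # L2 regularization term
            if margin < 1.0:                # violated pair: push scores apart
                grad -= (x_pos - x_neg)
            w -= lr * grad
    return w

# Synthetic preference pairs: the preferred item's first (hypothetical
# implied-feedback) feature is shifted upward on average.
rng = np.random.default_rng(0)
pairs = []
for _ in range(50):
    x_pos = rng.normal(size=2) + np.array([1.0, 0.0])
    x_neg = rng.normal(size=2)
    pairs.append((x_pos, x_neg))

w = train_pairwise_ranker(pairs, n_features=2)
```

At search time, candidate images would be scored by `w @ x` and re-ranked, with the implied-feedback cues folded into `x` alongside the literal feedback features.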

Related Material

@InProceedings{Parikh_2013_ICCV,
author = {Parikh, Devi and Grauman, Kristen},
title = {Implied Feedback: Learning Nuances of User Behavior in Image Search},
booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2013}
}