Learning Attribute Representations With Localization for Flexible Fashion Search

Kenan E. Ak, Ashraf A. Kassim, Joo Hwee Lim, Jo Yew Tham; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2018, pp. 7708-7717

Abstract


In this paper, we investigate ways of conducting a detailed fashion search using query images and attributes. A credible fashion search platform should be able to (1) find images that share the same attributes as the query image, (2) allow users to manipulate certain attributes, e.g., changing the collar attribute from round to v-neck, and (3) handle region-specific attribute manipulations, e.g., replacing the color attribute of the sleeve region without changing the color attribute of other regions. A key challenge is that fashion products have multiple attributes, each of which needs its own representative features. To address these challenges, we propose FashionSearchNet, which uses a weakly supervised localization method to extract attribute regions. By doing so, unrelated features can be ignored, improving the similarity learning. FashionSearchNet also incorporates a new procedure that introduces region awareness, enabling it to handle region-specific requests. FashionSearchNet outperforms the most recent fashion search techniques and is shown to be able to carry out different search scenarios using dynamic queries.
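The weakly supervised localization described above is in the spirit of class activation maps: an attribute classifier's weights re-weight the convolutional feature maps into a per-attribute heatmap, whose high-activation region is then pooled into an attribute-specific representation. The following NumPy sketch illustrates that idea only; it is not the authors' implementation, and the function names and the max-fraction thresholding heuristic are hypothetical.

```python
import numpy as np

def attribute_activation_map(feature_maps, weights):
    """CAM-style heatmap: weighted sum over channels of the conv features.

    feature_maps: (C, H, W) array; weights: (C,) classifier weights
    for one attribute. Returns an (H, W) activation map.
    """
    return np.tensordot(weights, feature_maps, axes=1)

def localize_region(heatmap, threshold=0.5):
    """Bounding box of activations above a fraction of the map's maximum
    (hypothetical heuristic). Returns (x1, y1, x2, y2), inclusive."""
    mask = heatmap >= threshold * heatmap.max()
    ys, xs = np.nonzero(mask)
    return xs.min(), ys.min(), xs.max(), ys.max()

def region_feature(feature_maps, box):
    """Average-pool the conv features inside the localized box,
    yielding a (C,) attribute-specific representation."""
    x1, y1, x2, y2 = box
    return feature_maps[:, y1:y2 + 1, x1:x2 + 1].mean(axis=(1, 2))
```

Pooling only inside the localized box is what lets features unrelated to a given attribute be ignored when learning per-attribute similarity.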

Related Material


@InProceedings{Ak_2018_CVPR,
author = {Ak, Kenan E. and Kassim, Ashraf A. and Lim, Joo Hwee and Tham, Jo Yew},
title = {Learning Attribute Representations With Localization for Flexible Fashion Search},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2018}
}