Area Under the ROC Curve Maximization for Metric Learning

Bojana Gajić, Ariel Amato, Ramon Baldrich, Joost van de Weijer, Carlo Gatta; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 2807-2816

Abstract


Most popular metric learning losses have no direct relation to the evaluation metrics that are subsequently used to assess their performance. We hypothesize that training a metric learning model by maximizing the area under the ROC curve (a typical performance measure of recognition systems) can induce an implicit ranking suitable for retrieval problems. This hypothesis is supported by previous work that proved that a curve dominates in ROC space if and only if it dominates in Precision-Recall space. To test this hypothesis, we design and maximize an approximated, differentiable relaxation of the area under the ROC curve. The proposed AUC loss achieves state-of-the-art results on two large-scale retrieval benchmark datasets (Stanford Online Products and DeepFashion In-Shop). Moreover, the AUC loss achieves performance comparable to more complex, domain-specific, state-of-the-art methods for vehicle re-identification.
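
The relaxation itself is specified in the full paper; purely as an illustration of the general idea, the sketch below shows one common way to make AUC maximization differentiable: AUC equals the probability that a positive pair scores higher than a negative pair, and replacing that 0/1 comparison with a sigmoid yields a surrogate that can be optimized by gradient descent. This is not the authors' exact formulation, and the names and hyperparameters (soft_auc_loss, temperature) are hypothetical.

```python
# Minimal sketch of a differentiable AUC surrogate for metric learning.
# Not the paper's exact relaxation; it only illustrates the principle of
# maximizing a soft AUC: mean sigmoid of (positive-pair score - negative-pair score).
import torch
import torch.nn.functional as F


def soft_auc_loss(embeddings: torch.Tensor, labels: torch.Tensor,
                  temperature: float = 0.05) -> torch.Tensor:
    """Soft AUC relaxation over all positive/negative pairs in a batch.

    embeddings: (N, D) feature vectors.
    labels:     (N,) integer identity labels.
    temperature: sharpness of the sigmoid surrogate (assumed hyperparameter).
    """
    embeddings = F.normalize(embeddings, dim=1)
    sim = embeddings @ embeddings.t()                  # cosine similarities, (N, N)

    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-identity mask, (N, N)
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_scores = sim[same & ~eye]                      # positive pairs (self excluded)
    neg_scores = sim[~same]                            # negative pairs

    if pos_scores.numel() == 0 or neg_scores.numel() == 0:
        return sim.new_zeros(())                       # degenerate batch: nothing to rank

    # Soft AUC: average sigmoid over all (positive, negative) score differences.
    diff = pos_scores.unsqueeze(1) - neg_scores.unsqueeze(0)   # (P, Q)
    soft_auc = torch.sigmoid(diff / temperature).mean()
    return 1.0 - soft_auc                              # minimizing this maximizes soft AUC
```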

Related Material


@InProceedings{Gajic_2022_CVPR,
  author    = {Gaji\'c, Bojana and Amato, Ariel and Baldrich, Ramon and van de Weijer, Joost and Gatta, Carlo},
  title     = {Area Under the ROC Curve Maximization for Metric Learning},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2022},
  pages     = {2807-2816}
}