ARC: Adversarial Robust Cuts for Semi-Supervised and Multi-Label Classification

Sima Behpour; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2018, pp. 1905-1907

Abstract


Many structured prediction tasks arising in computer vision and natural language processing tractably reduce to making minimum cost cuts in graphs with edge weights learned using maximum margin methods. Unfortunately, the hinge loss used to construct these methods often provides a particularly loose bound on the loss function of interest (e.g., the Hamming loss). We develop Adversarial Robust Cuts (ARC), an approach that poses the learning task as a minimax game between predictor and "label approximator" based on minimum cost graph cuts. Unlike maximum margin methods, this game-theoretic perspective always provides meaningful bounds on the Hamming loss. We conduct multi-label and semi-supervised binary prediction experiments that demonstrate the benefits of our approach.
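The abstract's premise is that inference for these structured prediction tasks reduces to a minimum cost cut in a graph whose edge weights are learned. The following is a minimal sketch of that inference step only, not of the ARC learner or its minimax game: exact labeling of a pairwise binary model with non-negative (submodular Potts) pairwise terms via an s-t minimum cut. The graph construction, node names, and use of networkx are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical illustration of min-cut inference for a pairwise binary model.
# Not the paper's ARC method; edge costs here are hand-picked, whereas ARC
# would learn them adversarially.
import networkx as nx

def mincut_labeling(unary, pairwise):
    """unary: {node: (cost_if_0, cost_if_1)}; pairwise: {(i, j): weight >= 0}."""
    G = nx.DiGraph()
    for i, (c0, c1) in unary.items():
        G.add_edge("s", i, capacity=c1)   # this edge is cut when y_i = 1
        G.add_edge(i, "t", capacity=c0)   # this edge is cut when y_i = 0
    for (i, j), w in pairwise.items():
        G.add_edge(i, j, capacity=w)      # cut when y_i = 0 and y_j = 1
        G.add_edge(j, i, capacity=w)      # cut when y_i = 1 and y_j = 0
    cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
    labels = {i: 0 if i in source_side else 1 for i in unary}
    return cut_value, labels

# Toy usage: a three-node chain with made-up unary and pairwise costs.
energy, labels = mincut_labeling(
    unary={0: (0.2, 1.0), 1: (0.6, 0.5), 2: (1.0, 0.1)},
    pairwise={(0, 1): 0.8, (1, 2): 0.8},
)
print(energy, labels)
```

The cut value equals the minimized energy (sum of unary costs for the chosen labels plus the Potts penalty for each disagreeing edge), which is why such models admit exact, tractable inference when the pairwise weights are non-negative.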

Related Material


[bibtex]
@InProceedings{Behpour_2018_CVPR_Workshops,
author = {Behpour, Sima},
title = {ARC: Adversarial Robust Cuts for Semi-Supervised and Multi-Label Classification},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2018}
}