Influence-Balanced Loss for Imbalanced Visual Classification

Seulki Park, Jongin Lim, Younghan Jeon, Jin Young Choi; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 735-744

Abstract


In this paper, we propose a balancing training method to address the problems of imbalanced data learning. To this end, we derive a new loss, used during the balancing training phase, that alleviates the influence of samples that cause an overfitted decision boundary. The proposed loss efficiently improves the performance of any type of imbalanced learning method. In experiments on multiple benchmark datasets, we demonstrate the validity of our method and show that the proposed loss outperforms state-of-the-art cost-sensitive loss methods. Furthermore, since our loss is not restricted to a specific task, model, or training method, it can easily be combined with other recent re-sampling, meta-learning, and cost-sensitive learning methods for class-imbalance problems. Our code is made available.
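The abstract does not spell out the loss itself, but the core idea of down-weighting samples with a large influence on the decision boundary can be illustrated. Below is a minimal PyTorch-style sketch, assuming the influence of a sample is approximated by the product of its output-error magnitude and the L1 norm of its penultimate-layer features; the class name `IBLossSketch` and the hyper-parameters `alpha` and `eps` are illustrative placeholders, not the paper's exact formulation or API.

```python
# Hedged sketch: per-sample re-weighting of cross-entropy by an influence proxy.
# The influence proxy here (output error x feature norm) is an assumption for
# illustration; consult the paper for the exact influence-balanced loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IBLossSketch(nn.Module):
    """Down-weights samples with a large (approximate) influence on the decision boundary."""

    def __init__(self, num_classes: int, alpha: float = 1000.0, eps: float = 1e-3):
        super().__init__()
        self.num_classes = num_classes
        self.alpha = alpha  # assumed scaling constant
        self.eps = eps      # avoids division by zero

    def forward(self, logits: torch.Tensor, targets: torch.Tensor,
                features: torch.Tensor) -> torch.Tensor:
        # Per-sample cross-entropy (no reduction yet).
        ce = F.cross_entropy(logits, targets, reduction="none")

        # Influence proxy: L1 output error times L1 norm of penultimate features.
        one_hot = F.one_hot(targets, self.num_classes).float()
        output_error = (F.softmax(logits, dim=1) - one_hot).abs().sum(dim=1)
        feature_norm = features.abs().sum(dim=1)
        influence = output_error * feature_norm

        # Samples with large influence (near an over-fitted boundary) get smaller weights.
        weights = self.alpha / (influence + self.eps)
        return (weights * ce).mean()


if __name__ == "__main__":
    batch, feat_dim, num_classes = 8, 64, 10
    feats = torch.randn(batch, feat_dim)
    logits = nn.Linear(feat_dim, num_classes)(feats)
    targets = torch.randint(0, num_classes, (batch,))
    print(IBLossSketch(num_classes)(logits, targets, feats).item())
```

In practice, such a re-weighted loss would be applied in a second (balancing) training phase on top of a conventionally pre-trained model, which is consistent with the two-phase training described in the abstract.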

Related Material


@InProceedings{Park_2021_ICCV,
    author    = {Park, Seulki and Lim, Jongin and Jeon, Younghan and Choi, Jin Young},
    title     = {Influence-Balanced Loss for Imbalanced Visual Classification},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {735-744}
}