Margin Contrastive Learning with Learnable-Vector for Continual Learning

Kotaro Nagata, Kazuhiro Hotta; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 3570-3576

Abstract


In continual learning, there is a serious problem called "catastrophic forgetting", in which previously acquired knowledge is forgotten when a new task is learned. Various methods have been proposed to solve this problem. Among them, replay methods, which store a portion of the past training data and replay it in later tasks, have shown excellent performance. In this paper, we propose a new online continual learning method that extends the conventional method, Supervised Contrastive Replay (SCR), with a learnable representative vector for each class and a margin in the similarity computation. Our method aims to mitigate the catastrophic forgetting caused by class imbalance. Experiments on multiple image classification datasets confirm that our method outperforms conventional methods.
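The margin mechanism described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `margin_similarity`, the CosFace-style additive margin on the true-class cosine similarity, and the use of plain NumPy are all illustrative assumptions.

```python
import numpy as np

def margin_similarity(features, class_vectors, labels, margin=0.1):
    """Cosine similarity between L2-normalized features and learnable
    per-class vectors, with a margin subtracted from the true-class
    similarity (an assumed CosFace-style formulation).

    features:      (batch, dim) feature embeddings
    class_vectors: (num_classes, dim) learnable class representatives
    labels:        (batch,) integer class labels
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = class_vectors / np.linalg.norm(class_vectors, axis=1, keepdims=True)
    sim = f @ w.T  # (batch, num_classes) cosine similarities
    # Subtract the margin only at each sample's true class, making the
    # contrastive objective harder for the correct class.
    sim[np.arange(len(labels)), labels] -= margin
    return sim
```

In training, these margin-adjusted similarities would replace the raw similarities inside the supervised contrastive loss, so each feature must exceed its class vector's similarity by the margin to be considered well separated.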

Related Material


[pdf]
[bibtex]
@InProceedings{Nagata_2023_ICCV,
    author    = {Nagata, Kotaro and Hotta, Kazuhiro},
    title     = {Margin Contrastive Learning with Learnable-Vector for Continual Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2023},
    pages     = {3570-3576}
}