Memory-Efficient Continual Learning with Neural Collapse Contrastive

Trung-Anh Dang, Vincent Nguyen, Ngoc-Son Vu, Christel Vrain; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 7939-7948

Abstract


Contrastive learning has significantly improved representation quality, enhancing knowledge transfer across tasks in continual learning (CL). However, catastrophic forgetting remains a key challenge, as contrastive-based methods primarily focus on "soft relationships", or "softness", between samples, which shift with changing data distributions and lead to representation overlap across tasks. Recently, the newly identified Neural Collapse phenomenon has shown promise in CL by focusing on "hard relationships", or "hardness", between samples and fixed prototypes. However, this approach overlooks "softness", which is crucial for capturing intra-class variability, and this rigid focus can also pull old class representations toward current ones, increasing forgetting. Building on these insights, we propose Focal Neural Collapse Contrastive (FNC^2), a novel representation learning loss that effectively balances both soft and hard relationships. Additionally, we introduce the Hardness-Softness Distillation (HSD) loss to progressively preserve the knowledge gained from these relationships across tasks. Our method outperforms state-of-the-art approaches, particularly in minimizing memory reliance. Remarkably, even without the use of memory, our approach rivals rehearsal-based methods, offering a compelling solution for data privacy concerns.
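
To make the hard/soft distinction concrete, the following is a minimal, hypothetical PyTorch sketch of a loss that combines a "hard" sample-to-fixed-prototype term with a "soft" sample-to-sample supervised contrastive term, as described in the abstract. It is not the authors' FNC^2 implementation (in particular, the focal weighting and the HSD distillation loss are omitted), and the function name, parameters, and weighting scheme are illustrative assumptions.

import torch
import torch.nn.functional as F

def hard_soft_contrastive_sketch(features, labels, prototypes, tau=0.1, alpha=0.5):
    # features: (N, D) embeddings; labels: (N,) int64 class ids;
    # prototypes: (C, D) fixed class prototypes (e.g., a simplex ETF).
    features = F.normalize(features, dim=1)
    prototypes = F.normalize(prototypes, dim=1)

    # "Hard" term: align each sample with its fixed class prototype.
    proto_logits = features @ prototypes.t() / tau            # (N, C)
    hard_term = F.cross_entropy(proto_logits, labels)

    # "Soft" term: supervised contrastive relationships between samples.
    sim = features @ features.t() / tau                       # (N, N)
    self_mask = torch.eye(len(features), dtype=torch.bool, device=features.device)
    pos_mask = ((labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask).float()
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(self_mask, float('-inf')), dim=1, keepdim=True)
    soft_term = (-(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)).mean()

    # Weighted combination of hard and soft relationships (alpha is assumed).
    return alpha * hard_term + (1 - alpha) * soft_term

The sketch only illustrates how prototype alignment (hardness) and sample-to-sample structure (softness) can coexist in one objective; the actual FNC^2 and HSD formulations are given in the paper.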

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Dang_2025_WACV,
    author    = {Dang, Trung-Anh and Nguyen, Vincent and Vu, Ngoc-Son and Vrain, Christel},
    title     = {Memory-Efficient Continual Learning with Neural Collapse Contrastive},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {7939-7948}
}