Task-Adaptive Saliency Guidance for Exemplar-free Class Incremental Learning

Xialei Liu, Jiang-Tian Zhai, Andrew D. Bagdanov, Ke Li, Ming-Ming Cheng; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 23954-23963

Abstract


Exemplar-free Class Incremental Learning (EFCIL) aims to sequentially learn tasks with access only to data from the current task. EFCIL is of interest because it mitigates concerns about privacy and long-term storage of data, while at the same time alleviating the problem of catastrophic forgetting in incremental learning. In this work, we introduce task-adaptive saliency for EFCIL and propose a new framework, which we call Task-Adaptive Saliency Supervision (TASS), for mitigating the negative effects of saliency drift between different tasks. We first apply boundary-guided saliency to maintain task adaptivity and plasticity of model attention. In addition, we introduce task-agnostic low-level signals as auxiliary supervision to increase the stability of model attention. Finally, we introduce a module for injecting and recovering saliency noise to increase the robustness of saliency preservation. Our experiments demonstrate that our method better preserves saliency maps across tasks and achieves state-of-the-art results on the CIFAR-100, Tiny-ImageNet, and ImageNet-Subset EFCIL benchmarks. Code is available at https://github.com/scok30/tass.
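To make the idea of saliency preservation concrete, the following is a minimal sketch, not the authors' implementation: it assumes saliency maps are normalized arrays in [0, 1], uses a hypothetical Gaussian noise-injection step standing in for the paper's noise module, and measures saliency drift between tasks with a simple mean-squared-error loss.

```python
import numpy as np

def inject_saliency_noise(saliency, noise_std=0.1, rng=None):
    # Hypothetical noise-injection step: perturb a saliency map with
    # Gaussian noise, then clip back to the valid [0, 1] range.
    rng = np.random.default_rng(0) if rng is None else rng
    noisy = saliency + rng.normal(0.0, noise_std, saliency.shape)
    return np.clip(noisy, 0.0, 1.0)

def saliency_preservation_loss(old_saliency, new_saliency):
    # Illustrative drift penalty: mean-squared difference between the
    # saliency map from a previous task and the current one.
    return float(np.mean((old_saliency - new_saliency) ** 2))

# Toy 8x8 saliency maps in [0, 1].
rng = np.random.default_rng(42)
old_map = rng.random((8, 8))
noisy_map = inject_saliency_noise(old_map, noise_std=0.05, rng=rng)
loss = saliency_preservation_loss(old_map, noisy_map)
```

Under this sketch, a model robust to the injected noise would keep the preservation loss small across tasks; the loss is zero exactly when the saliency map is unchanged.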

Related Material


@InProceedings{Liu_2024_CVPR,
    author    = {Liu, Xialei and Zhai, Jiang-Tian and Bagdanov, Andrew D. and Li, Ke and Cheng, Ming-Ming},
    title     = {Task-Adaptive Saliency Guidance for Exemplar-free Class Incremental Learning},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {23954-23963}
}