Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning

Kai Zhu, Wei Zhai, Yang Cao, Jiebo Luo, Zheng-Jun Zha; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 9296-9305

Abstract


Non-exemplar class-incremental learning requires recognizing both old and new classes when samples of the old classes cannot be stored. The task is challenging because representation optimization and feature retention must be achieved under supervision from the new classes alone. To address this problem, we propose a novel self-sustaining representation expansion scheme. It consists of a structure reorganization strategy that fuses main-branch expansion with side-branch updating to maintain the old features, and a main-branch distillation scheme that transfers invariant knowledge. Furthermore, a prototype selection mechanism enhances the discrimination between old and new classes by selectively incorporating new samples into the distillation process. Extensive experiments on three benchmarks demonstrate significant gains in incremental performance, outperforming state-of-the-art methods by margins of 3%, 3%, and 6%, respectively.
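The abstract's combination of main-branch distillation and prototype-based selection can be loosely sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the selection rule (keep new samples far from every old-class prototype), the margin value, and all function names are assumptions introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_distill_loss(old_feats, new_feats):
    """Mean squared distance between features from the frozen old model
    and the updated model (a generic feature-distillation objective)."""
    if old_feats.size == 0:
        return 0.0
    return float(np.mean((old_feats - new_feats) ** 2))

def select_for_distillation(new_feats, prototypes, margin=0.5):
    """Assumed selection rule: keep only new samples whose distance to every
    old-class prototype exceeds a margin, so samples overlapping old classes
    do not disturb the distillation of old knowledge."""
    # pairwise distances: (num_samples, num_prototypes)
    dists = np.linalg.norm(new_feats[:, None, :] - prototypes[None, :, :], axis=-1)
    keep = dists.min(axis=1) > margin
    return new_feats[keep], keep

# toy data: 8 samples with 4-d features, 3 old-class prototypes
feats_old = rng.normal(size=(8, 4))                     # frozen old model's features
feats_new = feats_old + 0.1 * rng.normal(size=(8, 4))   # updated model's features
prototypes = rng.normal(size=(3, 4))                    # stored old-class prototypes

selected, mask = select_for_distillation(feats_new, prototypes, margin=0.5)
loss = feature_distill_loss(feats_old[mask], selected)
print(int(mask.sum()), "samples kept; distill loss =", round(loss, 4))
```

Distilling only on selected samples, rather than all new data, is one plausible way to realize the abstract's claim that selective incorporation enhances old/new class discrimination.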

Related Material


@InProceedings{Zhu_2022_CVPR,
  author    = {Zhu, Kai and Zhai, Wei and Cao, Yang and Luo, Jiebo and Zha, Zheng-Jun},
  title     = {Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2022},
  pages     = {9296-9305}
}