Striking a Balance Between Stability and Plasticity for Class-Incremental Learning

Guile Wu, Shaogang Gong, Pan Li; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 1124-1133

Abstract


Class-incremental learning (CIL) aims to continuously update a trained model with new classes (plasticity) without forgetting previously learned old ones (stability). Contemporary studies resort to storing representative exemplars for rehearsal or to preventing consolidated model parameters from drifting, but the former requires additional storage for exemplars at every incremental phase, while the latter usually shows poor model generalization. In this paper, we focus on resolving the stability-plasticity dilemma in class-incremental learning where no exemplars from old classes are stored. To make a trade-off between learning new information and maintaining old knowledge, we reformulate a simple yet effective baseline method based on a cosine classifier framework and reciprocal adaptive weights. With the reformulated baseline, we present two new approaches to CIL by learning class-independent knowledge and multi-perspective knowledge, respectively. The former exploits class-independent knowledge to bridge learning new and old classes, while the latter learns knowledge from different perspectives to facilitate CIL. Extensive experiments on several widely used CIL benchmark datasets show the superiority of our approaches over the state-of-the-art methods.
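The baseline mentioned above is built on a cosine classifier, in which class scores are cosine similarities between an L2-normalized feature and L2-normalized class weight vectors rather than raw dot products. The sketch below illustrates that general mechanism only; the `scale` factor and the toy vectors are illustrative assumptions, not the paper's actual configuration or its reciprocal adaptive weights.

```python
import math

def l2_normalize(v):
    """Scale a vector to unit L2 norm (leave zero vectors unchanged)."""
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def cosine_logits(feature, class_weights, scale=10.0):
    """Cosine-classifier logits: the scaled dot product of the
    L2-normalized feature with each L2-normalized class weight,
    so every logit lies in [-scale, scale] regardless of magnitudes."""
    f = l2_normalize(feature)
    return [scale * sum(a * b for a, b in zip(f, l2_normalize(w)))
            for w in class_weights]

# Hypothetical 2-D example: the feature is collinear with class 0's
# weight and orthogonal to class 1's, so the logits are 10.0 and 0.0.
logits = cosine_logits([1.0, 0.0], [[2.0, 0.0], [0.0, 3.0]])
```

Normalizing both sides removes the bias toward classes with large weight norms, which is one reason cosine classifiers are popular in incremental learning: newly added class weights cannot dominate old ones simply by growing in magnitude.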

Related Material


[bibtex]
@InProceedings{Wu_2021_ICCV,
    author    = {Wu, Guile and Gong, Shaogang and Li, Pan},
    title     = {Striking a Balance Between Stability and Plasticity for Class-Incremental Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {1124-1133}
}