TCP: Triplet Contrastive-Relationship Preserving for Class-Incremental Learning
In class-incremental learning (CIL), when deep neural networks learn new classes, their recognition performance on old classes drops significantly. This phenomenon is widely known as catastrophic forgetting. To alleviate catastrophic forgetting, existing methods store a small portion of old-class data in a memory buffer and replay it while learning new classes. These methods suffer from a severe imbalance between old and new classes. In this paper, we find that the imbalance problem in CIL makes it difficult both to preserve the feature relations of old classes and to learn the feature relations between old and new classes. To mitigate these two issues, we design a triplet contrastive preserving (TCP) loss to preserve old knowledge, and propose an asymmetric augmented contrastive learning (A2CL) method to learn new classes. Comprehensive experiments demonstrate the effectiveness of our method, which increases the average accuracy by 1.26% on CIFAR-100 and by 0.95% on ImageNet. Under smaller memory-buffer settings, where the imbalance problem is more severe, our method surpasses the baselines by a large margin (up to 3.2%). We also show that TCP can be easily plugged into other methods to further improve their performance.
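The abstract does not give the exact form of the TCP loss, so the following is only a rough, hypothetical sketch of the general idea of preserving feature relations: a generic relation-preserving distillation term that encourages the new model's pairwise feature similarities on buffered old-class samples to match those of the frozen old model. The function name and the use of cosine-similarity matrices are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def relation_preserving_loss(feats_old: np.ndarray, feats_new: np.ndarray) -> float:
    """Hypothetical relation-preserving penalty (NOT the paper's TCP loss).

    feats_old: features of buffered old-class samples from the frozen old model,
               shape (batch, dim).
    feats_new: features of the same samples from the current model, same shape.
    Penalizes differences between the two pairwise cosine-similarity matrices,
    so the new model keeps the old model's feature relations.
    """
    def cosine_sim_matrix(f: np.ndarray) -> np.ndarray:
        # L2-normalize rows, then the Gram matrix holds pairwise cosine similarities.
        f = f / np.linalg.norm(f, axis=1, keepdims=True)
        return f @ f.T

    s_old = cosine_sim_matrix(feats_old)
    s_new = cosine_sim_matrix(feats_new)
    return float(np.mean((s_old - s_new) ** 2))
```

If the current model's features keep exactly the same pairwise structure as the old model's, the penalty is zero; the more the relations drift, the larger it grows, which is the qualitative behavior a relation-preserving term needs.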