SEC-Prompt: SEmantic Complementary Prompting for Few-Shot Class-Incremental Learning

Ye Liu, Meng Yang; Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR), 2025, pp. 25643-25656

Abstract


Few-shot class-incremental learning (FSCIL) presents a significant challenge in machine learning, requiring models to integrate new classes from limited examples while preserving performance on previously learned classes. Recently, prompt-based CIL approaches have leveraged ample data to train prompts, effectively mitigating catastrophic forgetting. However, these methods do not account for the semantic features embedded in prompts, exacerbating the plasticity-stability dilemma in few-shot incremental learning. In this paper, we propose a novel and simple framework named SEmantic Complementary Prompt (SEC-Prompt), which learns two sets of semantically complementary prompts based on an adaptive query: discriminative prompts (D-Prompt) and non-discriminative prompts (ND-Prompt). D-Prompt enhances the separation of class-specific feature distributions by strengthening key discriminative features, while ND-Prompt balances non-discriminative information to promote generalization to novel classes. To efficiently learn high-quality knowledge from limited samples, we leverage ND-Prompt for data augmentation to increase sample diversity and introduce a Prompt Clustering Loss to prevent noise contamination in D-Prompt, ensuring robust discriminative feature learning and improved generalization. Our experimental results showcase state-of-the-art performance across three benchmark datasets: CIFAR100, ImageNet-R, and CUB.
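The abstract's core mechanism, selecting two complementary prompt sets from a shared pool via an adaptive query, can be illustrated with a toy query-key matching sketch. This is a minimal illustration only, not the paper's implementation: the function name, the plain top-k split into "discriminative" and "non-discriminative" indices, and the cosine-similarity scoring are all assumptions made for exposition.

```python
import math

def cosine(u, v):
    # cosine similarity between two vectors given as plain lists
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def select_complementary_prompts(query, prompt_keys, k):
    """Toy split of a prompt pool into two complementary sets.

    Ranks prompt keys by similarity to the adaptive query; the top-k
    indices stand in for a discriminative set (D-Prompt-like) and the
    remainder for a non-discriminative set (ND-Prompt-like).
    Hypothetical sketch -- not the SEC-Prompt algorithm itself.
    """
    order = sorted(range(len(prompt_keys)),
                   key=lambda i: cosine(query, prompt_keys[i]),
                   reverse=True)
    return sorted(order[:k]), sorted(order[k:])

# Example: query aligned with keys 0 and 2, so they form the top-k set.
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0]]
d_idx, nd_idx = select_complementary_prompts([1.0, 0.1], keys, k=2)
print(d_idx, nd_idx)  # → [0, 2] [1, 3]
```

In the paper the two sets play complementary roles (class separation vs. generalization to novel classes); the sketch only shows the query-conditioned partition of the pool, which is the structural idea shared with earlier prompt-pool CIL methods.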

Related Material


[pdf] [supp]
[bibtex]
@InProceedings{Liu_2025_CVPR,
  author    = {Liu, Ye and Yang, Meng},
  title     = {SEC-Prompt: SEmantic Complementary Prompting for Few-Shot Class-Incremental Learning},
  booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
  month     = {June},
  year      = {2025},
  pages     = {25643-25656}
}