Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning
Abstract
Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Existing pre-trained model-based CIL methods often freeze the pre-trained network and adapt to incremental tasks using additional lightweight modules such as adapters. However, incorrect module selection during inference hurts performance, and task-specific modules often overlook shared general knowledge, leading to errors when distinguishing similar classes across tasks. To address these challenges, we propose integrating Task-Specific and Universal Adapters (TUNA) in this paper. Specifically, we train task-specific adapters to capture the most crucial features relevant to their respective tasks and introduce an entropy-based selection mechanism to choose the most suitable adapter. Furthermore, we leverage an adapter fusion strategy to construct a universal adapter, which encodes the most discriminative features shared across tasks. During inference, we combine the predictions of the task-specific and universal adapters to harness both specialized and general knowledge. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of our approach. Code is available at https://github.com/LAMDA-CL/ICCV2025-TUNA.
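For readers who want a concrete picture of the inference pipeline described above, the following PyTorch sketch shows how an entropy-based adapter selector and a weight-averaged universal adapter could be combined. It is a minimal illustration under stated assumptions: the `Adapter` architecture, the fusion-by-averaging, the batch-level entropy selection, and the `alpha` mixing weight are all illustrative choices, not the paper's exact design, and each classifier head is assumed to score the full set of seen classes so that logits are comparable.

```python
# Minimal sketch of TUNA-style inference, assuming a frozen backbone that
# produces `feat` and one (adapter, head) pair per task. All shapes and
# hyperparameters below are illustrative, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adapter(nn.Module):
    """Lightweight residual bottleneck applied to frozen backbone features."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, hidden)
        self.up = nn.Linear(hidden, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(F.relu(self.down(x)))


def fuse_adapters(adapters: list[Adapter], dim: int) -> Adapter:
    """One simple instance of adapter fusion: average the weights of
    identically shaped task adapters into a single 'universal' adapter."""
    fused = Adapter(dim)
    with torch.no_grad():
        for name, p in fused.named_parameters():
            p.copy_(torch.stack(
                [dict(a.named_parameters())[name] for a in adapters]).mean(0))
    return fused


@torch.no_grad()
def predict(feat, adapters, heads, universal, uni_head, alpha=0.5):
    """Select the task adapter whose prediction has the lowest entropy
    (done per batch here for brevity; per-sample selection is the more
    faithful choice), then mix its logits with the universal adapter's."""
    best_logits, best_entropy = None, float("inf")
    for adapter, head in zip(adapters, heads):
        logits = head(adapter(feat))
        probs = F.softmax(logits, dim=-1)
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean().item()
        if entropy < best_entropy:  # lower entropy = more confident adapter
            best_entropy, best_logits = entropy, logits
    uni_logits = uni_head(universal(feat))
    return alpha * best_logits + (1 - alpha) * uni_logits
```

Here `alpha` trades the specialized knowledge of the selected task adapter against the general knowledge encoded by the fused universal adapter.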
Related Material

[pdf] [arXiv] [bibtex]

@InProceedings{Wang_2025_ICCV,
  author    = {Wang, Yan and Zhou, Da-Wei and Ye, Han-Jia},
  title     = {Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2025},
  pages     = {806-816}
}