AdaPrefix++: Integrating Adapters, Prefixes and Hypernetwork for Continual Learning

Sayanta Adhikari, Dupati Srikar Chandra, P. K. Srijith, Pankaj Wasnik, Naoyuki Onoe; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 7298-7307

Abstract


Continual learning allows systems to continuously learn and adapt to new tasks in an evolving real-world environment without forgetting previous tasks. Developing deep learning models that can continually learn over a sequence of tasks is challenging. We propose a novel method, AdaPrefix, which addresses this challenge and equips pretrained large models (PLMs) with continual learning capability. AdaPrefix provides a continual learning method for transformer-based deep learning models by appropriately integrating the parameter-efficient modules, adapters and prefixes. AdaPrefix is effective for smaller PLMs and achieves better results than state-of-the-art approaches. We further improve upon AdaPrefix by proposing AdaPrefix++, which enables knowledge transfer across tasks. It leverages hypernetworks to generate prefixes and continually learns the hypernetwork parameters to facilitate this transfer. AdaPrefix++ has smaller parameter growth than AdaPrefix and is more effective and practical for continual learning in PLMs. We perform extensive experiments on several benchmark datasets to demonstrate the performance of our approach across different PLMs and continual learning scenarios.
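To make the high-level idea concrete, the sketch below illustrates one plausible reading of the architecture described in the abstract: a hypernetwork maps a learnable per-task embedding to per-layer prefix key/value tensors, while bottleneck adapters sit inside each transformer block. All module names, dimensions, and hyperparameters here are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class PrefixHypernetwork(nn.Module):
    """Sketch: task-conditioned hypernetwork that emits per-layer
    prefix key/value vectors for a transformer. Dimensions are assumed."""

    def __init__(self, num_tasks=10, task_emb_dim=64, num_layers=12,
                 num_heads=12, head_dim=64, prefix_len=16, hidden=256):
        super().__init__()
        self.num_layers = num_layers
        self.num_heads = num_heads
        self.head_dim = head_dim
        self.prefix_len = prefix_len
        # one learnable embedding per task in the sequence
        self.task_emb = nn.Embedding(num_tasks, task_emb_dim)
        out_dim = num_layers * 2 * prefix_len * num_heads * head_dim
        # shared MLP; its parameters are what would be learned continually
        self.mlp = nn.Sequential(
            nn.Linear(task_emb_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, task_id: torch.Tensor) -> torch.Tensor:
        # returns prefixes shaped (layers, 2, prefix_len, heads, head_dim),
        # i.e. key and value prefixes for every attention layer
        e = self.task_emb(task_id)
        flat = self.mlp(e)
        return flat.view(self.num_layers, 2, self.prefix_len,
                         self.num_heads, self.head_dim)

class Adapter(nn.Module):
    """Sketch of a bottleneck adapter with a residual connection."""

    def __init__(self, d_model=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

if __name__ == "__main__":
    hyper = PrefixHypernetwork()
    prefixes = hyper(torch.tensor(0))      # prefixes for task 0
    print(prefixes.shape)                  # torch.Size([12, 2, 16, 12, 64])

In this reading, per-task parameter growth is limited to the small task embedding (plus any task-specific adapters), since the hypernetwork body is shared across tasks; this is consistent with, but not a reproduction of, the paper's stated design.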

Related Material


@InProceedings{Adhikari_2025_WACV,
    author    = {Adhikari, Sayanta and Chandra, Dupati Srikar and Srijith, P. K. and Wasnik, Pankaj and Onoe, Naoyuki},
    title     = {AdaPrefix++: Integrating Adapters, Prefixes and Hypernetwork for Continual Learning},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {7298-7307}
}