DePT: Decoupled Prompt Tuning

Ji Zhang, Shihan Wu, Lianli Gao, Heng Tao Shen, Jingkuan Song; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 12924-12933

Abstract


This work breaks through the Base-New Tradeoff (BNT) dilemma in prompt tuning, i.e., the better the tuned model generalizes to the base (or target) task, the worse it generalizes to new tasks, and vice versa. Specifically, through an in-depth analysis of the learned features of the base and new tasks, we observe that the BNT stems from a channel bias issue: the vast majority of feature channels are occupied by base-specific knowledge, leading to the collapse of task-shared knowledge important to new tasks. To address this, we propose the Decoupled Prompt Tuning (DePT) framework, which decouples base-specific knowledge from feature channels into an isolated feature space during prompt tuning, so as to maximally preserve task-shared knowledge in the original feature space and achieve better zero-shot generalization on new tasks. Importantly, DePT is orthogonal to existing prompt tuning approaches and can enhance them with negligible additional computational cost. Extensive experiments on several datasets show the flexibility and effectiveness of DePT.
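The decoupling idea described above can be pictured as attaching a lightweight head that absorbs base-specific knowledge in its own feature space, while the original prompt-based text-image matching head is left untouched for new tasks. The sketch below is a minimal PyTorch illustration of that split under assumed CLIP-like features; the class name `DecoupledHeads`, the feature dimension, and the use of a simple linear projection for the isolated space are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of decoupled feature spaces, assuming a CLIP-like setup:
# image/text features of dimension D and a cosine-similarity zero-shot head.
# Names (DecoupledHeads, base_head) and the linear projection are illustrative
# assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledHeads(nn.Module):
    def __init__(self, feat_dim: int, num_base_classes: int):
        super().__init__()
        # Isolated space for base-specific knowledge: a learnable projection
        # followed by a base-class classifier, trained only on the base task.
        self.base_head = nn.Sequential(
            nn.Linear(feat_dim, feat_dim),
            nn.Linear(feat_dim, num_base_classes),
        )

    def forward(self, image_feats, text_feats, logit_scale=100.0):
        # Base-task logits come from the isolated feature space.
        base_logits = self.base_head(image_feats)
        # New-task (zero-shot) logits keep the original text-image matching,
        # preserving task-shared knowledge in the original feature space.
        img = F.normalize(image_feats, dim=-1)
        txt = F.normalize(text_feats, dim=-1)
        zs_logits = logit_scale * img @ txt.t()
        return base_logits, zs_logits

# Usage sketch: during prompt tuning on the base task, the prompts are trained
# through the zero-shot logits while base_logits absorb base-specific signal;
# at test time, base classes use base_logits and new classes use zs_logits.
feats = torch.randn(4, 512)   # image features from a frozen encoder (assumed D=512)
text = torch.randn(10, 512)   # text features from prompted class names
heads = DecoupledHeads(feat_dim=512, num_base_classes=10)
base_logits, zs_logits = heads(feats, text)
```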

Related Material


[pdf] [arXiv]
[bibtex]
@InProceedings{Zhang_2024_CVPR,
    author    = {Zhang, Ji and Wu, Shihan and Gao, Lianli and Shen, Heng Tao and Song, Jingkuan},
    title     = {DePT: Decoupled Prompt Tuning},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {12924-12933}
}