Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning

Songlin Dong, Haoyu Luo, Yuhang He, Xing Wei, Jie Cheng, Yihong Gong; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 18711-18720

Abstract


Current class-incremental learning research focuses mainly on single-label classification, while multi-label class-incremental learning (MLCIL), which covers more practical application scenarios, is rarely studied. Although many anti-forgetting methods address catastrophic forgetting in single-label class-incremental learning, they struggle with the MLCIL problem because of label absence and information dilution. To address these issues, we propose a Knowledge Restore and Transfer (KRT) framework comprising two modules: a dynamic pseudo-label (DPL) module that solves the label absence problem by restoring knowledge of old classes to the new data, and an incremental cross-attention (ICA) module that solves the information dilution problem with session-specific knowledge retention tokens that store knowledge and a unified knowledge transfer token that transfers it. Comprehensive experiments on the MS-COCO and PASCAL VOC datasets demonstrate the effectiveness of our method in improving recognition performance and mitigating forgetting on multi-label class-incremental learning tasks.
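The ICA mechanism described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the token shapes, session count, and the choice of a single fused query are illustrative assumptions; only the overall idea (a unified transfer token cross-attending over image features and per-session retention tokens) follows the abstract.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, keys, values):
    # scaled dot-product attention: the query attends over keys/values
    d = query.shape[-1]
    scores = query @ keys.T / np.sqrt(d)   # (1, N)
    return softmax(scores) @ values        # (1, d)

rng = np.random.default_rng(0)
d = 8  # illustrative embedding dimension

# hypothetical: session-specific knowledge retention tokens, one set per
# incremental session (kept to preserve that session's class knowledge)
retention_tokens = [rng.standard_normal((4, d)) for _ in range(3)]

# hypothetical: a single unified knowledge transfer token acting as the query
transfer_token = rng.standard_normal((1, d))

# current image features (e.g. backbone patch embeddings)
image_feats = rng.standard_normal((16, d))

# the transfer token attends over the image features together with every
# session's retention tokens, aggregating old- and new-class knowledge
memory = np.concatenate([image_feats] + retention_tokens, axis=0)
fused = cross_attention(transfer_token, memory, memory)
print(fused.shape)  # (1, 8)
```

In this sketch the fused embedding would feed a multi-label classification head; freezing old sessions' retention tokens is what would let earlier knowledge survive later training.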

Related Material


@InProceedings{Dong_2023_ICCV,
  author    = {Dong, Songlin and Luo, Haoyu and He, Yuhang and Wei, Xing and Cheng, Jie and Gong, Yihong},
  title     = {Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {18711-18720}
}