CLUE: Consolidating Learned and Undergoing Experience in Domain-Incremental Classification

Chengyi Cai, Jiaxin Liu, Wendi Yu, Yuchen Guo; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 125-141

Abstract


Deep neural networks are vulnerable to catastrophic forgetting when learning new tasks, and continual learning has become a promising and popular research field for addressing this problem. Existing research predominantly focuses on class-incremental (CI) settings. However, another practical setting, domain-incremental (DI) learning, where the data distribution shifts across new tasks, also suffers from forgetting and deserves attention. Focusing on the DI setting, in which the learned model is overwritten by new domains and is no longer valid for former tasks, this paper proposes a novel method named Consolidating Learned and Undergoing Experience (CLUE). In particular, CLUE consolidates former and current experiences by penalizing both feature-extractor distortion and the alteration of sample outputs. CLUE is readily applicable to classification models, as it introduces neither extra parameters nor extra processing steps. Extensive experiments show that CLUE achieves significant performance improvements over other baselines on three benchmarks. In addition, CLUE remains robust even with fewer replay samples. Moreover, its feasibility is supported by both theoretical derivation and model-interpretability visualization.
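To make the consolidation idea concrete, the following is a minimal sketch of how penalties on feature-extractor distortion and on output alteration might be combined into one training loss. It assumes a PyTorch model exposing hypothetical `features(x)` and `classifier(feats)` methods; the loss weights `lambda_feat` and `lambda_out`, the replay batch `x_replay`, and the overall structure are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def clue_style_loss(model, old_model, x_new, y_new, x_replay,
                    lambda_feat=1.0, lambda_out=1.0):
    """Illustrative consolidation loss in the spirit of CLUE (assumed form).

    model:     classifier being trained on the new domain
    old_model: frozen snapshot of the model learned on former domains
    Both are assumed to expose .features(x) and .classifier(feats).
    """
    # Standard classification loss on the new-domain batch.
    feats_new = model.features(x_new)
    logits_new = model.classifier(feats_new)
    loss_cls = F.cross_entropy(logits_new, y_new)

    # Targets from the frozen snapshot of the learned experience.
    with torch.no_grad():
        old_feats = old_model.features(x_replay)
        old_logits = old_model.classifier(old_feats)

    # Penalty on feature-extractor distortion: keep features of
    # replayed samples close to those of the former model.
    feats_replay = model.features(x_replay)
    loss_feat = F.mse_loss(feats_replay, old_feats)

    # Penalty on output alteration: keep predictions on replayed
    # samples consistent with the former model's outputs.
    logits_replay = model.classifier(feats_replay)
    loss_out = F.kl_div(F.log_softmax(logits_replay, dim=1),
                        F.softmax(old_logits, dim=1),
                        reduction="batchmean")

    return loss_cls + lambda_feat * loss_feat + lambda_out * loss_out
```

In such a setup, the previous model would be snapshotted before each new domain, e.g. `old_model = copy.deepcopy(model).eval()`, so that its features and outputs serve as the "learned" experience while the live model acquires the "undergoing" one.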

Related Material


[pdf] [supp] [code]
[bibtex]
@InProceedings{Cai_2022_ACCV,
    author    = {Cai, Chengyi and Liu, Jiaxin and Yu, Wendi and Guo, Yuchen},
    title     = {CLUE: Consolidating Learned and Undergoing Experience in Domain-Incremental Classification},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {125-141}
}