Heterogeneous Forgetting Compensation for Class-Incremental Learning

Jiahua Dong, Wenqi Liang, Yang Cong, Gan Sun; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 11742-11751

Abstract


Class-incremental learning (CIL) has achieved remarkable success in learning new classes consecutively while overcoming catastrophic forgetting of old categories. However, most existing CIL methods unreasonably assume that all old categories are forgotten at the same pace, and neglect the negative influence that forgetting heterogeneity among different old classes has on forgetting compensation. To surmount these challenges, we develop a novel Heterogeneous Forgetting Compensation (HFC) model, which resolves the heterogeneous forgetting of easy-to-forget and hard-to-forget old categories from both the representation and gradient aspects. Specifically, we design a task-semantic aggregation block to alleviate heterogeneous forgetting from the representation aspect. It aggregates local category information within each task to learn task-shared global representations. Moreover, we develop two novel plug-and-play losses, a gradient-balanced forgetting compensation loss and a gradient-balanced relation distillation loss, to alleviate forgetting from the gradient aspect. They apply gradient-balanced compensation to rectify the forgetting heterogeneity of old categories and to enforce heterogeneous relation consistency. Experiments on several representative datasets illustrate the effectiveness of our HFC model. The code is available at https://github.com/JiahuaDong/HFC.
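The core idea of gradient-balanced compensation can be sketched as reweighting per-class loss terms in inverse proportion to their gradient magnitudes, so that old classes updated with weaker gradients (and thus forgotten faster) receive stronger compensation. The following is a minimal illustrative sketch under that assumption; the function names and the exact weighting scheme are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of gradient-balanced loss reweighting (not the HFC code):
# classes whose gradient norms are small receive larger weights, so all old
# classes are compensated at a comparable pace.

def gradient_balanced_weights(grad_norms):
    """Per-class weights inversely proportional to gradient norms,
    normalized so the weights average to 1."""
    eps = 1e-8  # guard against division by zero
    inv = [1.0 / (g + eps) for g in grad_norms]
    scale = len(inv) / sum(inv)
    return [w * scale for w in inv]

def balanced_loss(per_class_losses, grad_norms):
    """Weighted sum of per-class losses using gradient-balanced weights."""
    weights = gradient_balanced_weights(grad_norms)
    return sum(w * l for w, l in zip(weights, per_class_losses))
```

With gradient norms `[1.0, 2.0, 4.0]`, the first class (weakest gradient) gets the largest weight, counteracting its faster forgetting.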

Related Material


@InProceedings{Dong_2023_ICCV,
    author    = {Dong, Jiahua and Liang, Wenqi and Cong, Yang and Sun, Gan},
    title     = {Heterogeneous Forgetting Compensation for Class-Incremental Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {11742-11751}
}