Causes of Catastrophic Forgetting in Class-Incremental Semantic Segmentation

Tobias Kalb, Jürgen Beyerer; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 56-73

Abstract


Class-incremental learning for semantic segmentation (CiSS) is a highly active research field that aims to update a semantic segmentation model by sequentially learning new semantic classes. A major challenge in CiSS is overcoming the effects of catastrophic forgetting, which describes the sudden drop in accuracy on previously learned classes after the model is trained on a new set of classes. Despite the latest advances in mitigating catastrophic forgetting, the underlying causes of forgetting specifically in CiSS are not well understood. Therefore, in a set of experiments and representational analyses, we demonstrate that the semantic shift of the background class and a bias towards new classes are the major causes of forgetting in CiSS. Furthermore, we show that both causes mostly manifest themselves in the deeper classification layers of the network, while the early layers of the model are not affected. Finally, we demonstrate how both causes are effectively mitigated by utilizing the information contained in the background class, with the help of knowledge distillation and an unbiased cross-entropy loss.
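To make the abstract's last point concrete, below is a minimal sketch of an unbiased cross-entropy loss of the kind referred to here (in the spirit of Modeling the Background by Cermelli et al., which the paper evaluates): background pixels are not forced towards the background logit alone, but towards the aggregated probability mass of the background plus all previously learned classes. This is an illustrative PyTorch implementation under assumed conventions (background at channel index 0, old classes at indices 1..num_old_classes, ignore label 255), not the authors' exact code.

```python
import torch
import torch.nn.functional as F

def unbiased_cross_entropy(logits, targets, num_old_classes, ignore_index=255):
    """Unbiased cross-entropy sketch for class-incremental segmentation.

    logits:  (N, C, H, W) raw scores over background (index 0), old and new classes.
    targets: (N, H, W) labels of the current step; pixels of old classes are
             labelled as background (0) due to the background semantic shift.
    num_old_classes: number of previously learned classes (excluding background).
    """
    log_probs = F.log_softmax(logits, dim=1)  # per-class log-probabilities

    # log p(background OR any old class): aggregate the probability mass that
    # may actually belong to an old class hidden inside the background label.
    log_p_bg_or_old = (
        torch.logsumexp(logits[:, : num_old_classes + 1], dim=1)
        - torch.logsumexp(logits, dim=1)
    )

    # Replace the plain background log-probability with the aggregated one,
    # so background pixels do not penalize predictions of old classes.
    log_probs = log_probs.clone()
    log_probs[:, 0] = log_p_bg_or_old

    return F.nll_loss(log_probs, targets, ignore_index=ignore_index)
```

In a training loop this would simply replace the standard cross-entropy term for the current step, while a separate knowledge-distillation term on the old model's outputs addresses the bias towards new classes.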

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Kalb_2022_ACCV,
    author    = {Kalb, Tobias and Beyerer, J\"urgen},
    title     = {Causes of Catastrophic Forgetting in Class-Incremental Semantic Segmentation},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {56-73}
}