The Expanding Scope of the Stability Gap: Unveiling its Presence in Joint Incremental Learning of Homogeneous Tasks

Sandesh Kamath, Albin Soutif-Cormerais, Joost Van De Weijer, Bogdan Raducanu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2024, pp. 4182-4186

Abstract


Recent research identified a temporary performance drop on previously learned tasks when transitioning to a new one. This drop, called the stability gap, has significant consequences for continual learning: it complicates the direct deployment of continual learning, since the worst-case performance at task boundaries is dramatic; it limits its potential as an energy-efficient training paradigm; and, finally, the stability drop could result in reduced final performance of the algorithm. In this paper we show that the stability gap also occurs when applying joint incremental training of homogeneous tasks. In this scenario the learner continues training on the same data distribution and has access to all data from previous tasks. In addition, we show that in this scenario there exists a low-loss linear path to the next minimum, but that SGD optimization does not choose this path. We perform further analysis, including a finer batch-wise analysis, which could provide insight into potential solution directions.
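To make the linear-path observation concrete, the following is a minimal sketch (not the authors' code) of how one might probe the loss along the straight line in parameter space between two checkpoints, e.g. the minimum reached after one task and the minimum reached after the next. The model, data loader, and loss criterion are placeholders to be supplied by the reader; if the loss stays low for all interpolation coefficients, a low-loss linear path exists even though SGD may not follow it.

```python
# Hypothetical sketch: loss along the linear path between two checkpoints.
import copy
import torch

def loss_on_linear_path(model_a, model_b, data_loader, criterion,
                        num_points=11, device="cpu"):
    """Evaluate theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b
    at evenly spaced alphas and return (alpha, mean loss) pairs."""
    state_a, state_b = model_a.state_dict(), model_b.state_dict()
    probe = copy.deepcopy(model_a).to(device).eval()
    results = []
    for i in range(num_points):
        alpha = i / (num_points - 1)
        # Interpolate only floating-point tensors; keep buffers such as
        # BatchNorm's num_batches_tracked from the first checkpoint.
        interpolated = {
            k: ((1 - alpha) * state_a[k] + alpha * state_b[k])
            if state_a[k].is_floating_point() else state_a[k]
            for k in state_a
        }
        probe.load_state_dict(interpolated)
        total, n = 0.0, 0
        with torch.no_grad():
            for x, y in data_loader:
                x, y = x.to(device), y.to(device)
                total += criterion(probe(x), y).item() * x.size(0)
                n += x.size(0)
        results.append((alpha, total / n))
    return results
```

A flat, low loss curve across alphas indicates linear mode connectivity between the two minima, whereas a pronounced bump would rule out such a path; this is only an illustrative probe under the stated assumptions, not the evaluation protocol of the paper.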

Related Material


[bibtex]
@InProceedings{Kamath_2024_CVPR,
  author    = {Kamath, Sandesh and Soutif-Cormerais, Albin and Van De Weijer, Joost and Raducanu, Bogdan},
  title     = {The Expanding Scope of the Stability Gap: Unveiling its Presence in Joint Incremental Learning of Homogeneous Tasks},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2024},
  pages     = {4182-4186}
}