Selective Freezing for Efficient Continual Learning

Amelia Sorrenti, Giovanni Bellitto, Federica Proietto Salanitri, Matteo Pennisi, Concetto Spampinato, Simone Palazzo; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 3550-3559

Abstract

This paper tackles the challenges of continual learning, where sequential learning from a stream of tasks can lead to catastrophic forgetting. At the same time, it addresses the need to reduce the computational demands of large-scale deep learning models and thereby mitigate their environmental impact. To achieve this twofold objective, we propose a method that combines selective layer freezing with fast adaptation in a continual learning context. We begin with an extensive analysis of layer freezing in continual learning, which reveals that certain configurations allow a substantial portion of the model to be frozen without significant accuracy degradation. Leveraging this insight, we introduce a novel approach that optimizes plasticity on new tasks while preserving stability on previous tasks, by dynamically identifying a subset of layers to freeze during training. Experimental results demonstrate that our approach achieves performance competitive with manually tuned freezing strategies. Moreover, we quantitatively estimate the reduction in computation and energy requirements achieved by our freezing strategy, based on the number of parameters and updates required for model training.
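To make the idea concrete, the following is a minimal PyTorch sketch of selective layer freezing in this spirit. The selection criterion used here (freezing the layers with the smallest mean gradient magnitude after a warm-up step on the new task), as well as the model and helper names, are illustrative assumptions for exposition only; the paper's actual dynamic selection procedure may differ.

    # Minimal sketch of selective layer freezing in PyTorch.
    # The criterion below (mean |grad| per layer) is an illustrative
    # assumption, not necessarily the criterion proposed in the paper.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(784, 256), nn.ReLU(),
        nn.Linear(256, 128), nn.ReLU(),
        nn.Linear(128, 10),
    )

    def layer_scores(model):
        """Score each Linear layer by the mean |grad| of its parameters."""
        scores = {}
        for name, module in model.named_modules():
            if isinstance(module, nn.Linear):
                grads = [p.grad.abs().mean() for p in module.parameters()
                         if p.grad is not None]
                if grads:
                    scores[name] = torch.stack(grads).mean().item()
        return scores

    def freeze_lowest(model, scores, k=1):
        """Freeze the k layers with the smallest scores (least plastic)."""
        frozen = sorted(scores, key=scores.get)[:k]
        for name in frozen:
            for p in model.get_submodule(name).parameters():
                p.requires_grad_(False)
        return frozen

    # One warm-up step on the new task to populate gradients.
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()

    frozen = freeze_lowest(model, layer_scores(model), k=1)
    print("Frozen layers:", frozen)
    # In practice, rebuild the optimizer over the remaining trainable
    # parameters so frozen layers incur no update cost at all.

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"Trainable parameters: {trainable}/{total}")

Since frozen parameters receive no gradient updates, the fraction of trainable parameters, multiplied by the number of training updates, gives a rough proxy for the savings in computation, in the spirit of the parameter-and-update accounting described in the abstract.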

Related Material

[pdf]
[bibtex]
@InProceedings{Sorrenti_2023_ICCV,
    author    = {Sorrenti, Amelia and Bellitto, Giovanni and Salanitri, Federica Proietto and Pennisi, Matteo and Spampinato, Concetto and Palazzo, Simone},
    title     = {Selective Freezing for Efficient Continual Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2023},
    pages     = {3550-3559}
}