[pdf]
[arXiv]
[bibtex]
@InProceedings{Araujo_2022_CVPR,
  author    = {Araujo, Vladimir and Hurtado, Julio and Soto, Alvaro and Moens, Marie-Francine},
  title     = {Entropy-Based Stability-Plasticity for Lifelong Learning},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2022},
  pages     = {3721-3728}
}
Entropy-Based Stability-Plasticity for Lifelong Learning
Abstract
The ability to continuously learn remains elusive for deep learning models. Unlike humans, models cannot accumulate knowledge in their weights when learning new tasks, mainly due to an excess of plasticity and a low incentive to reuse weights when training on a new task. To address the stability-plasticity dilemma in neural networks, we propose a novel method called Entropy-based Stability-Plasticity (ESP). Our approach dynamically decides how much each model layer should be modified via a plasticity factor. We incorporate branch layers and an entropy-based criterion into the model to find such a factor. Our experiments in the domains of natural language and vision show the effectiveness of our approach in leveraging prior knowledge by reducing interference. Also, in some cases, it is possible to freeze layers during training, leading to a speed-up in training.
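The abstract describes mapping an entropy-based criterion to a per-layer plasticity factor. The following is a minimal sketch of one way such a factor could be computed, assuming a branch classifier attached to a layer produces logits for the current input; the function name and the normalization to [0, 1] are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def plasticity_factor(branch_logits: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Map the entropy of a branch classifier's prediction to a
    plasticity factor in [0, 1] (hypothetical helper).

    Intuition: high entropy means the branch is uncertain about the
    current input, so the layer's features may need to change more
    (more plasticity); low entropy means the layer already encodes
    useful knowledge and should change less (more stability).
    """
    probs = F.softmax(branch_logits, dim=-1)
    # Shannon entropy per example; clamp avoids log(0).
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    # Normalize by the maximum possible entropy, log(num_classes).
    max_entropy = torch.log(torch.tensor(float(num_classes)))
    return (entropy / max_entropy).mean()

# A factor like this could then scale a layer's gradient updates,
# e.g. p.grad.mul_(factor) for each parameter p of the layer
# (usage shown only as an assumption about how the factor is applied).
```

Under this sketch, a uniform branch prediction yields a factor near 1 (fully plastic), while a confident prediction yields a factor near 0, which would effectively freeze the layer, consistent with the training speed-up the abstract mentions.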
Related Material