A Simple Recipe to Meta-Learn Forward and Backward Transfer

Edoardo Cetin, Antonio Carta, Oya Celiktutan; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 18732-18742

Abstract


Meta-learning holds the potential to provide a general and explicit solution to tackle interference and forgetting in continual learning. However, many popular algorithms introduce expensive and unstable optimization processes with new key hyper-parameters and requirements, hindering their applicability. We propose a new, general, and simple meta-learning algorithm for continual learning (SiM4C) that explicitly optimizes to minimize forgetting and facilitate forward transfer. We show our method is stable, introduces only minimal computational overhead, and can be integrated with any memory-based continual learning algorithm in only a few lines of code. SiM4C meta-learns how to effectively continually learn even on very long task sequences, largely outperforming prior meta-approaches. Naively integrating with existing memory-based algorithms, we also record universal performance benefits and state-of-the-art results across different visual classification benchmarks without introducing new hyper-parameters.
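The abstract states that the method meta-learns to minimize forgetting and can be added to any memory-based continual learner in a few lines. A minimal, first-order sketch of this general idea (not the authors' actual SiM4C implementation): take a lookahead gradient step on the current task, measure the replay-memory loss at the lookahead parameters as a forgetting signal, and fold its gradient into the update. All names, the linear/MSE model, and the first-order simplification are illustrative assumptions.

```python
import numpy as np

def mse_grad(w, X, y):
    # Gradient of mean squared error for a linear model y ~ X @ w.
    return 2.0 / len(X) * X.T @ (X @ w - y)

def meta_replay_step(w, cur, mem, inner_lr=0.01, outer_lr=0.01):
    """One first-order meta-update (illustrative, not the paper's exact rule):
    take a lookahead step on the current-task batch, then penalize the
    replay-memory loss evaluated at the lookahead parameters."""
    Xc, yc = cur
    Xm, ym = mem
    g_cur = mse_grad(w, Xc, yc)
    w_look = w - inner_lr * g_cur           # candidate update on the new task
    g_mem = mse_grad(w_look, Xm, ym)        # forgetting signal at the lookahead point
    return w - outer_lr * (g_cur + g_mem)   # combined first-order update

# Toy usage: synthetic current-task and memory batches.
rng = np.random.default_rng(0)
w = np.zeros(3)
X_cur, y_cur = rng.normal(size=(32, 3)), rng.normal(size=32)
X_mem, y_mem = rng.normal(size=(32, 3)), rng.normal(size=32)
for _ in range(100):
    w = meta_replay_step(w, (X_cur, y_cur), (X_mem, y_mem))
```

The only change relative to plain experience replay is the lookahead evaluation of the memory gradient, which is what makes the update "meta": it accounts for how the current-task step would affect performance on past data.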

Related Material


@InProceedings{Cetin_2023_ICCV,
  author    = {Cetin, Edoardo and Carta, Antonio and Celiktutan, Oya},
  title     = {A Simple Recipe to Meta-Learn Forward and Backward Transfer},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
  pages     = {18732-18742}
}