FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning

Athanasios Psaltis, Christos Chatzikonstantinou, Charalampos Z. Patrikakis, Petros Daras; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2023, pp. 3463-3472

Abstract


This work proposes a holistic approach to addressing catastrophic forgetting in computer vision during incremental learning. Specifically, it presents a series of steps for effectively training models in distributed environments: extracting meaningful representations, modeling them into transferable knowledge, and propagating that knowledge through a continual distillation mechanism. It also introduces a federated learning algorithm tailored to the problem that eliminates the need for central model transfer, relying instead on multi-scale representation learning coupled with a Knowledge Distillation technique. Finally, inspired by current trends in contrastive representation learning, it modifies a contrastive learning technique to combine existing knowledge with previous model states, preserving previously learned knowledge while incorporating new knowledge. Thorough experimentation provides a comprehensive analysis of the problem and highlights the potential of the proposed method, which achieves strong results in a federated environment with reduced communication cost and robust performance in highly distributed incremental scenarios.
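As a rough illustration of the loss structure the abstract describes, the sketch below combines a classification term, a knowledge distillation term against a frozen snapshot of the model from the previous incremental task, and a contrastive term that aligns current representations with those of the previous state. It is a minimal sketch under assumptions: the model is assumed to return both logits and a representation vector, and every name and hyperparameter here (distillation_loss, contrastive_loss, T, temperature, lambda_kd, lambda_con) is hypothetical rather than taken from the paper.

```python
# A minimal, hypothetical sketch (not the authors' code) of a local training
# step that combines classification, knowledge distillation from a frozen
# previous model state, and a contrastive term over representations.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soften both distributions with temperature T and match them via KL."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)


def contrastive_loss(z_current, z_previous, temperature=0.5):
    """InfoNCE-style loss: each current representation should be most similar
    to the same sample's representation under the previous model state."""
    z_current = F.normalize(z_current, dim=1)
    z_previous = F.normalize(z_previous, dim=1)
    logits = z_current @ z_previous.t() / temperature      # (B, B) similarities
    targets = torch.arange(z_current.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)                # positives on diagonal


def local_training_step(model, prev_model, x, y, lambda_kd=1.0, lambda_con=0.5):
    # Assumption: model(x) returns (logits, representation); prev_model is a
    # frozen snapshot taken before the current incremental task.
    logits, z = model(x)
    with torch.no_grad():
        prev_logits, prev_z = prev_model(x)
    return (F.cross_entropy(logits, y)
            + lambda_kd * distillation_loss(logits, prev_logits)
            + lambda_con * contrastive_loss(z, prev_z))


if __name__ == "__main__":
    # Smoke test on random tensors (no real model needed).
    B, C, D = 8, 10, 32
    logits, prev_logits = torch.randn(B, C), torch.randn(B, C)
    z, prev_z = torch.randn(B, D), torch.randn(B, D)
    y = torch.randint(0, C, (B,))
    total = (F.cross_entropy(logits, y)
             + distillation_loss(logits, prev_logits)
             + 0.5 * contrastive_loss(z, prev_z))
    print(f"combined loss: {total.item():.4f}")
```

In the federated setting the abstract outlines, clients would exchange such representations, or knowledge distilled from them, rather than full model weights, which is the mechanism credited for the reduced communication cost; the sketch above covers only the local, incremental-learning side of that pipeline.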

Related Material


@InProceedings{Psaltis_2023_ICCV,
  author    = {Psaltis, Athanasios and Chatzikonstantinou, Christos and Patrikakis, Charalampos Z. and Daras, Petros},
  title     = {FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2023},
  pages     = {3463-3472}
}