Traceable Federated Continual Learning

Qiang Wang, Bingyan Liu, Yawen Li; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 12872-12881

Abstract


Federated continual learning (FCL) is a typical mechanism to achieve collaborative model training among clients that own dynamic data. While traditional FCL methods have been proven effective, they do not consider task repeatability and fail to achieve good performance under this practical scenario. In this paper, we propose a new paradigm, namely Traceable Federated Continual Learning (TFCL), aiming to cope with repetitive tasks by tracing and augmenting them. Following the new paradigm, we develop TagFed, a framework that enables accurate and effective Tracing, augmentation, and Federation for TFCL. The key idea is to decompose the whole model into a series of marked sub-models for optimizing each client task before conducting group-wise knowledge aggregation, such that repetitive tasks can be located precisely and federated selectively for improved performance. Extensive experiments on our constructed benchmark demonstrate the effectiveness and efficiency of the proposed framework. We will release our code at: https://github.com/P0werWeirdo/TagFCL.
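For intuition only, the following is a minimal sketch of the tag-based, group-wise aggregation idea described in the abstract: clients upload per-task "marked" sub-models, and the server groups them by task tag so that only repeated tasks are federated together. It is not the authors' implementation; all names (MarkedSubModel, group_wise_aggregate, the tag format) are hypothetical, and plain FedAvg-style weighted averaging is assumed within each group.

```python
# Hypothetical sketch of tag-based, group-wise aggregation (not the official TagFed code).
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List

import numpy as np


@dataclass
class MarkedSubModel:
    """A sub-model's parameters together with the task tag used for tracing."""
    task_tag: str       # identifies the (possibly repeated) task
    params: np.ndarray  # flattened parameter vector of the sub-model
    num_samples: int    # local sample count, used for weighted averaging


def group_wise_aggregate(submodels: List[MarkedSubModel]) -> Dict[str, np.ndarray]:
    """Group uploaded sub-models by task tag and average each group separately."""
    groups: Dict[str, List[MarkedSubModel]] = defaultdict(list)
    for sm in submodels:
        groups[sm.task_tag].append(sm)

    aggregated: Dict[str, np.ndarray] = {}
    for tag, members in groups.items():
        total = sum(m.num_samples for m in members)
        # Sample-weighted average within the group only, so a repetitive task is
        # federated selectively across just the clients that own it.
        aggregated[tag] = sum(m.params * (m.num_samples / total) for m in members)
    return aggregated


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two clients both encounter task "A" (a repetitive task); only one client has task "B".
    uploads = [
        MarkedSubModel("task-A", rng.normal(size=4), num_samples=100),
        MarkedSubModel("task-A", rng.normal(size=4), num_samples=300),
        MarkedSubModel("task-B", rng.normal(size=4), num_samples=200),
    ]
    for tag, params in group_wise_aggregate(uploads).items():
        print(tag, params.round(3))
```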

Related Material


[bibtex]
@InProceedings{Wang_2024_CVPR,
    author    = {Wang, Qiang and Liu, Bingyan and Li, Yawen},
    title     = {Traceable Federated Continual Learning},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2024},
    pages     = {12872-12881}
}