@InProceedings{Zhang_2025_CVPR,
  author    = {Zhang, Yifei and Zhu, Hao and Tan, Alysa Ziying and Yu, Dianzhi and Huang, Longtao and Yu, Han},
  title     = {pFedMxF: Personalized Federated Class-Incremental Learning with Mixture of Frequency Aggregation},
  booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
  month     = {June},
  year      = {2025},
  pages     = {30640-30650}
}
pFedMxF: Personalized Federated Class-Incremental Learning with Mixture of Frequency Aggregation
Abstract
Federated learning (FL) has emerged as a promising paradigm for privacy-preserving collaborative machine learning. However, extending FL to class-incremental learning settings introduces three key challenges: 1) spatial heterogeneity due to non-IID data distributions across clients, 2) temporal heterogeneity due to the sequential arrival of tasks, and 3) resource heterogeneity due to diverse client capabilities. Existing approaches generally address these challenges in isolation, potentially leading to interference between updates, catastrophic forgetting, or excessive communication overhead. In this paper, we propose personalized Federated class-incremental parameter-efficient fine-tuning with Mixture of Frequency aggregation (pFedMxF), a novel framework that simultaneously addresses all three heterogeneity challenges through frequency-domain decomposition. Our key insight is that assigning orthogonal frequency components to different clients and tasks enables interference-free learning with minimal communication cost. We further design an Auto-Task Agnostic Classifier that automatically routes samples to task-specific classifiers while adapting to heterogeneous class distributions. We conduct extensive experiments on three benchmark datasets, comparing our approach with eight state-of-the-art methods. The results demonstrate that pFedMxF achieves comparable test accuracy while requiring only 25% of the full model's parameters and incurring significantly lower communication costs than baseline methods.
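The core idea, assigning disjoint (orthogonal) frequency components to different clients so their updates cannot interfere during aggregation, can be illustrated with a minimal toy sketch. This is not the paper's implementation: the band-assignment rule, the 1-D parameter vector, and the helper names (`assign_bands`, `client_update`) are hypothetical simplifications used only to show why disjoint frequency supports make summation interference-free.

```python
import numpy as np

def assign_bands(n_freqs, n_clients):
    """Partition frequency indices into disjoint bands, one per client
    (hypothetical round-robin rule; the paper's assignment may differ)."""
    idx = np.arange(n_freqs)
    return [idx[c::n_clients] for c in range(n_clients)]

def client_update(global_freq, band, local_delta_freq):
    """A client contributes an update only on its own frequency band."""
    upd = np.zeros_like(global_freq)
    upd[band] = local_delta_freq[band]
    return upd

# Toy setup: a length-16 "parameter vector" shared by 4 clients.
rng = np.random.default_rng(0)
theta = rng.standard_normal(16)
F = np.fft.rfft(theta)                  # frequency-domain parameters
bands = assign_bands(F.size, 4)

# Each client proposes a local update; the server simply sums the
# band-limited contributions.
deltas = [np.fft.rfft(rng.standard_normal(16)) for _ in range(4)]
agg = sum(client_update(F, b, d) for b, d in zip(bands, deltas))

# Disjoint supports => no interference: within each client's band the
# aggregate equals exactly that client's update.
for b, d in zip(bands, deltas):
    assert np.allclose(agg[b], d[b])

theta_new = np.fft.irfft(F + agg, n=16)  # back to parameter space
```

Because each client transmits only the coefficients of its own band, this also hints at why such a scheme can cut communication cost relative to exchanging full-model updates.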