MetaVers: Meta-Learned Versatile Representations for Personalized Federated Learning

Jin Hyuk Lim, SeungBum Ha, Sung Whan Yoon; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024, pp. 2587-2596

Abstract


One of the daunting challenges in federated learning (FL) is the heterogeneity across clients, which hinders the successful federation of a global model. When the heterogeneity worsens, personalized federated learning (PFL) sidesteps the difficulty of capturing commonality across clients by allowing the personalization of models built upon the federation. Within PFL for visual models, however, recent efforts toward aggregating an effective global representation, rather than chasing further personalization, have drawn great attention. Along the same lines, we aim to train a large-margin global representation with strong generalization across clients by adopting a meta-learning framework and a margin-based loss, both widely accepted as effective in handling multiple visual tasks. Our method, called MetaVers, achieves state-of-the-art accuracies on the PFL benchmarks with the CIFAR-10, CIFAR-100, and CINIC-10 datasets while showing robustness against data reconstruction attacks. Notably, the versatile representation of MetaVers exhibits strong generalization when tested on new clients with novel classes.
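To illustrate the margin-based loss the abstract refers to, below is a minimal sketch, not the paper's actual implementation: an additive-margin cross-entropy over cosine similarities between L2-normalized embeddings and class prototypes. All names (`margin_cross_entropy`, the `margin` and `scale` parameters) are illustrative assumptions; the paper's exact formulation may differ.

```python
import numpy as np

def margin_cross_entropy(features, prototypes, labels, margin=0.2, scale=10.0):
    """Illustrative large-margin loss (assumed form, not the paper's exact loss):
    cosine similarity between normalized features and class prototypes, with an
    additive margin subtracted from the true-class logit before the softmax."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    logits = scale * (f @ p.T)  # shape: (batch, n_classes)
    # Penalize the true-class logit by the margin: the sample must beat
    # other classes by at least `margin` in cosine similarity.
    rows = np.arange(len(labels))
    logits[rows, labels] -= scale * margin
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()
```

Setting `margin=0` recovers plain cosine-softmax cross-entropy; a positive margin strictly increases the loss for correctly classified samples, pushing embeddings farther from decision boundaries.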

Related Material


@InProceedings{Lim_2024_WACV,
  author    = {Lim, Jin Hyuk and Ha, SeungBum and Yoon, Sung Whan},
  title     = {MetaVers: Meta-Learned Versatile Representations for Personalized Federated Learning},
  booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
  month     = {January},
  year      = {2024},
  pages     = {2587-2596}
}