PGFed: Personalize Each Client's Global Objective for Federated Learning

Jun Luo, Matias Mendieta, Chen Chen, Shandong Wu; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 3946-3956

Abstract

Personalized federated learning has received an upsurge of attention due to the mediocre performance of conventional federated learning (FL) over heterogeneous data. Unlike conventional FL, which trains a single global consensus model, personalized FL allows different models for different clients. However, existing personalized FL algorithms only implicitly transfer the collaborative knowledge across the federation by embedding the knowledge into the aggregated model or regularization. We observed that this implicit knowledge transfer fails to maximize the potential of each client's empirical risk toward benefiting other clients. Based on this observation, in this work we propose Personalized Global Federated Learning (PGFed), a novel personalized FL framework that enables each client to personalize its own global objective by explicitly and adaptively aggregating the empirical risks of itself and other clients. To achieve this while avoiding massive O(N^2) communication overhead and potential privacy leakage, each client's risk is estimated through a first-order approximation for use in other clients' adaptive risk aggregation. On top of PGFed, we develop a momentum upgrade, dubbed PGFedMo, to more efficiently utilize clients' empirical risks. Our extensive experiments on four datasets under different federated settings show consistent improvements of PGFed over previous state-of-the-art methods. The code is publicly available at https://github.com/ljaiverson/pgfed.
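For intuition, the following is a minimal sketch of the objective described above; the notation (mixing weights \alpha_{ij}, client models \theta_i, local empirical risks f_i, coefficient \lambda) is ours and does not come from the paper itself. Each client i would optimize a personalized global objective that explicitly mixes other clients' empirical risks,

    F_i(\theta_i) = f_i(\theta_i) + \lambda \sum_{j \neq i} \alpha_{ij} \, f_j(\theta_i),

and, to avoid every client evaluating every other client's risk (the O(N^2) exchange mentioned above), each nonlocal risk could be replaced by a first-order approximation around that client's own model,

    f_j(\theta_i) \approx f_j(\theta_j) + \nabla f_j(\theta_j)^{\top} (\theta_i - \theta_j),

so that only scalar risk values and gradients, rather than all pairwise evaluations, need to pass through the server. Presumably, the momentum variant (PGFedMo) maintains a running average of these aggregated quantities across rounds instead of relying only on the latest values.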

Related Material

[pdf] [supp]
[bibtex]
@InProceedings{Luo_2023_ICCV,
    author    = {Luo, Jun and Mendieta, Matias and Chen, Chen and Wu, Shandong},
    title     = {PGFed: Personalize Each Client's Global Objective for Federated Learning},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {3946-3956}
}