Navigating Heterogeneity and Privacy in One-Shot Federated Learning with Diffusion Models

Matias Mendieta, Guangyu Sun, Chen Chen; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 2601-2610

Abstract


Federated learning (FL) enables multiple clients to train models collectively while preserving data privacy. However, FL faces challenges in terms of communication cost and data heterogeneity. One-shot federated learning has emerged as a solution that reduces communication rounds, improves efficiency, and provides better security against eavesdropping attacks. Nevertheless, data heterogeneity remains a significant challenge that impacts performance. This work explores the effectiveness of diffusion models in one-shot FL, demonstrating their applicability in addressing data heterogeneity and improving FL performance. Additionally, we investigate the utility of our diffusion-model approach, FedDiff, compared to other one-shot FL methods under differential privacy (DP). Furthermore, to improve generated sample quality under DP settings, we propose a pragmatic Fourier Magnitude Filtering (FMF) method, enhancing the effectiveness of the generated data for global model training. Code is available at https://github.com/mmendiet/FedDiff.
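
The abstract does not detail how Fourier Magnitude Filtering operates, so the following is only a minimal sketch of one plausible reading: generated samples are scored by how far their log-magnitude Fourier spectra deviate from a reference spectrum, and the farthest samples are discarded before global model training. The function names, the grayscale reduction, and the keep_ratio parameter are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def log_magnitude_spectrum(image):
    """Centered log-magnitude Fourier spectrum of an (H, W) or (H, W, C) image."""
    gray = image.mean(axis=-1) if image.ndim == 3 else image
    spectrum = np.fft.fftshift(np.fft.fft2(gray))
    return np.log1p(np.abs(spectrum))

def filter_generated_samples(samples, reference_images, keep_ratio=0.8):
    """Keep the keep_ratio fraction of generated samples whose spectra are
    closest (in mean squared error) to the reference magnitude statistics.

    samples, reference_images: lists of float arrays with the same spatial size.
    """
    # Reference spectrum: average log-magnitude over the reference set (assumption).
    reference = np.mean([log_magnitude_spectrum(img) for img in reference_images], axis=0)

    # Score each generated sample by spectral distance to the reference.
    scores = np.array([
        np.mean((log_magnitude_spectrum(s) - reference) ** 2) for s in samples
    ])

    # Retain the lowest-distance samples for global model training.
    keep = np.argsort(scores)[: int(len(samples) * keep_ratio)]
    return [samples[i] for i in keep]
```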

Related Material


[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Mendieta_2025_WACV,
    author    = {Mendieta, Matias and Sun, Guangyu and Chen, Chen},
    title     = {Navigating Heterogeneity and Privacy in One-Shot Federated Learning with Diffusion Models},
    booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
    month     = {February},
    year      = {2025},
    pages     = {2601-2610}
}