Relaxed Contrastive Learning for Federated Learning
Seonguk Seo, Jinkyu Kim, Geeho Kim, Bohyung Han; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 12279-12288
Abstract
We propose a novel contrastive learning framework to effectively address the challenges of data heterogeneity in federated learning. We first analyze the inconsistency of gradient updates across clients during local training and establish its dependence on the distribution of feature representations, which leads to the derivation of a supervised contrastive learning (SCL) objective that mitigates local deviations. In addition, we show that a naive integration of SCL into federated learning incurs representation collapse, resulting in slow convergence and limited performance gains. To address this issue, we introduce a relaxed contrastive learning loss that imposes a divergence penalty on excessively similar sample pairs within each class. This strategy prevents collapsed representations and enhances feature transferability, facilitating collaborative training and leading to significant performance improvements. As demonstrated by extensive experimental results, our framework outperforms all existing federated learning approaches by significant margins on the standard benchmarks. The source code is available at our project page (https://github.com/skynbe/FedRCL).
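The abstract describes the relaxed loss as a standard supervised contrastive term combined with a penalty that pushes apart same-class pairs whose representations are already excessively similar. The following is a minimal PyTorch sketch of that idea; the function and parameter names (relaxed_scl_loss, tau, sim_threshold, penalty_weight) and the margin-style form of the penalty are illustrative assumptions, not the paper's exact formulation, which is given in the paper and the FedRCL repository.

import torch
import torch.nn.functional as F

def relaxed_scl_loss(features, labels, tau=0.1, sim_threshold=0.9, penalty_weight=1.0):
    # features: (N, D) embeddings; labels: (N,) integer class ids.
    z = F.normalize(features, dim=1)                        # unit-norm features
    logits = (z @ z.t()) / tau                              # scaled cosine similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask

    # Supervised contrastive term: attract same-class pairs, repel the rest.
    logits = logits.masked_fill(self_mask, float('-inf'))   # exclude self-pairs
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    scl = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_count

    # Relaxation term (assumed form): penalize same-class pairs whose raw
    # cosine similarity exceeds a threshold, discouraging collapsed features.
    raw_sim = z @ z.t()
    penalty = (F.relu(raw_sim - sim_threshold) * pos_mask).sum(dim=1) / pos_count

    return (scl + penalty_weight * penalty).mean()

# Example usage with random features: 8 samples, two classes.
feats = torch.randn(8, 16, requires_grad=True)
labels = torch.tensor([0, 0, 1, 1, 0, 1, 0, 1])
loss = relaxed_scl_loss(feats, labels)
loss.backward()

In a federated setting, a loss of this shape would be added to each client's local training objective; the penalty keeps within-class representations from collapsing to a single point, which the paper identifies as the failure mode of naively applying SCL.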
Related Material
[pdf] [supp] [arXiv] [bibtex]

@InProceedings{Seo_2024_CVPR,
  author    = {Seo, Seonguk and Kim, Jinkyu and Kim, Geeho and Han, Bohyung},
  title     = {Relaxed Contrastive Learning for Federated Learning},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2024},
  pages     = {12279-12288}
}