FedRepOpt: Gradient Re-parametrized Optimizers in Federated Learning

Kin Wai Lau, Yasar Abbas Ur Rehman, Pedro Porto Buarque de Gusmão, Lai-Man Po, Lan Ma, Yuyang Xie; Proceedings of the Asian Conference on Computer Vision (ACCV), 2024, pp. 1866-1882

Abstract
Federated Learning (FL) has emerged as a privacy-preserving method for training machine learning models in a distributed manner on edge devices. However, on-device models face inherent computational power and memory limitations, potentially resulting in constrained gradient updates. As the model's size increases, the frequency of gradient updates on edge devices decreases, ultimately leading to suboptimal training outcomes during any particular FL round. This limits the feasibility of deploying advanced, large-scale models on edge devices, hindering the potential for performance enhancements. To address this issue, we propose FedRepOpt, a gradient re-parameterized optimizer for FL. The gradient re-parameterization method allows a simple local model to be trained with performance similar to that of a complex model, by modifying the optimizer's gradients according to a set of model-specific hyperparameters obtained from the complex model. In this work, we focus on VGG-style and Ghost-style models in the FL environment. Extensive experiments demonstrate that models using FedRepOpt obtain significant performance boosts of 16.7% and 11.4% over the RepGhost-style and RepVGG-style networks, respectively, while also converging 11.7% and 57.4% faster than their complex counterparts. Code is available at https://github.com/StevenLauHKHK/FedRepOpt.
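To illustrate the core idea of a gradient re-parameterized optimizer, the sketch below shows a plain SGD step in which each layer's gradient is multiplied by a fixed, layer-specific scale derived offline from the complex model's branch hyperparameters. This is a minimal, hypothetical illustration (the function name, the dictionary-based parameter representation, and the scale values are assumptions for exposition, not the paper's actual implementation):

```python
def repopt_sgd_step(params, grads, scales, lr=0.1):
    """One step of a gradient re-parameterized SGD update (sketch).

    Rather than training the multi-branch (e.g. RepVGG-style) model
    directly, the plain model is trained, and the gradient of each
    layer is rescaled by a constant derived from the complex model's
    branch hyperparameters before the usual SGD update is applied.

    params / grads / scales: dicts mapping layer names to values
    (scalars here for simplicity; arrays in a real model).
    """
    return {name: params[name] - lr * scales[name] * grads[name]
            for name in params}


# Hypothetical usage: one layer, gradient scaled by 2.0 before the step.
params = {"conv1.weight": 1.0}
grads = {"conv1.weight": 0.5}
scales = {"conv1.weight": 2.0}  # constant obtained from the complex model
updated = repopt_sgd_step(params, grads, scales, lr=0.1)
```

In an FL setting, each client would apply such rescaled updates to its simple local model, so only the plain architecture is ever stored and communicated.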

Related Material


@InProceedings{Lau_2024_ACCV,
  author    = {Lau, Kin Wai and Rehman, Yasar Abbas Ur and de Gusm\~ao, Pedro Porto Buarque and Po, Lai-Man and Ma, Lan and Xie, Yuyang},
  title     = {FedRepOpt: Gradient Re-parametrized Optimizers in Federated Learning},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {December},
  year      = {2024},
  pages     = {1866-1882}
}