Communication-efficient adaptive federated learning

Y Wang, L Lin, J Chen - International Conference on Machine Learning, 2022 - proceedings.mlr.press
Abstract
Federated learning is a machine learning training paradigm that enables clients to jointly train models without sharing their own localized data. In practice, however, federated learning still faces numerous challenges, such as the large communication overhead caused by repeated server-client synchronization and the lack of adaptivity in SGD-based model updates. Although various methods have been proposed to reduce the communication cost through gradient compression or quantization, and federated versions of adaptive optimizers such as FedAdam have been proposed to add adaptivity, the current federated learning framework still cannot address all of these challenges at once. In this paper, we propose a novel communication-efficient adaptive federated learning method (FedCAMS) with theoretical convergence guarantees. We show that in the nonconvex stochastic optimization setting, our proposed FedCAMS achieves the same convergence rate as its non-compressed counterparts. Extensive experiments on various benchmarks verify our theoretical analysis.
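
To make the idea concrete, the sketch below illustrates, under our own assumptions rather than the authors' released implementation, how a communication-efficient adaptive federated update could be composed: each client transmits a top-k-compressed, error-compensated model delta, and the server applies an AMSGrad-style adaptive step to the aggregated delta. All function names, hyperparameters, and the choice of top-k as the compressor are illustrative.

    import numpy as np

    def top_k_compress(delta, k):
        # Keep only the k largest-magnitude entries of the delta; zero the rest.
        flat = delta.ravel()
        idx = np.argpartition(np.abs(flat), -k)[-k:]
        out = np.zeros_like(flat)
        out[idx] = flat[idx]
        return out.reshape(delta.shape)

    def local_sgd(w, grad_fn, steps=5, lr=0.01):
        # A few local SGD steps on one client; returns the resulting model delta.
        w_local = w.copy()
        for _ in range(steps):
            w_local = w_local - lr * grad_fn(w_local)
        return w_local - w

    def federated_round(w, m, v, v_hat, client_grads, errors,
                        k=10, eta=0.1, beta1=0.9, beta2=0.99, eps=1e-8):
        # One communication round: compressed, error-compensated client deltas,
        # followed by an AMSGrad-style adaptive server update on the aggregate.
        deltas = []
        for i, grad_fn in enumerate(client_grads):
            raw = local_sgd(w, grad_fn) + errors[i]   # re-inject previously dropped mass
            comp = top_k_compress(raw, k)             # what is actually transmitted
            errors[i] = raw - comp                    # error-feedback memory (updated in place)
            deltas.append(comp)
        avg_delta = np.mean(deltas, axis=0)

        m = beta1 * m + (1 - beta1) * avg_delta         # first moment
        v = beta2 * v + (1 - beta2) * avg_delta ** 2    # second moment
        v_hat = np.maximum(v_hat, v)                    # max stabilization (AMSGrad)
        w = w + eta * m / (np.sqrt(v_hat) + eps)        # adaptive server step
        return w, m, v, v_hat

    # Toy usage: two clients with quadratic losses f_i(w) = 0.5 * ||w - c_i||^2.
    dim = 50
    rng = np.random.default_rng(0)
    centers = [rng.normal(size=dim) for _ in range(2)]
    client_grads = [lambda w, c=c: (w - c) for c in centers]
    w = np.zeros(dim)
    m, v, v_hat = np.zeros(dim), np.zeros(dim), np.zeros(dim)
    errors = [np.zeros(dim) for _ in client_grads]
    for _ in range(100):
        w, m, v, v_hat = federated_round(w, m, v, v_hat, client_grads, errors)

The error-feedback memory stores whatever the compressor discards and feeds it back in later rounds; this is the standard mechanism by which compressed updates can match the convergence behavior of their uncompressed counterparts.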