Communication-Efficient Adaptive Federated Learning

05/05/2022
by   Yujia Wang, et al.

Federated learning is a machine learning training paradigm that enables clients to jointly train models without sharing their own localized data. In practice, however, federated learning still faces numerous challenges, such as the large communication overhead caused by repetitive server-client synchronization and the lack of adaptivity in SGD-based model updates. Although various methods have been proposed to reduce the communication cost through gradient compression or quantization, and federated versions of adaptive optimizers such as FedAdam have been proposed to add adaptivity, current federated learning frameworks cannot yet address all of these challenges at once. In this paper, we propose a novel communication-efficient adaptive federated learning method (FedCAMS) with theoretical convergence guarantees. We show that in the nonconvex stochastic optimization setting, FedCAMS achieves the same convergence rate of O(1/√(TKm)) as its non-compressed counterparts. Extensive experiments on various benchmarks verify our theoretical analysis.
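To make the combination of ideas in the abstract concrete, here is a minimal NumPy sketch of the two key ingredients the paper brings together: top-k compression with error feedback applied to the averaged client update, followed by an AMSGrad-style (max-stabilized) adaptive server step. The function names, hyperparameters, and toy setup below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def topk_compress(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

def adaptive_compressed_round(w, client_updates, state, k=2, lr=0.1,
                              beta1=0.9, beta2=0.99, eps=1e-8):
    """One illustrative server round: average the client updates,
    compress with error feedback, then take an AMSGrad-style step."""
    delta = np.mean(client_updates, axis=0)  # averaged client update
    # Error feedback: fold in the residual the compressor dropped last
    # round, compress, and store the new residual for the next round.
    corrected = delta + state["err"]
    compressed = topk_compress(corrected, k)
    state["err"] = corrected - compressed
    # AMSGrad-style moments with a max-stabilized second moment.
    state["m"] = beta1 * state["m"] + (1 - beta1) * compressed
    state["v"] = beta2 * state["v"] + (1 - beta2) * compressed**2
    state["v_hat"] = np.maximum(state["v_hat"], state["v"])
    return w - lr * state["m"] / (np.sqrt(state["v_hat"]) + eps)

# Toy usage: two clients report updates for a 4-dimensional model.
w = np.zeros(4)
state = {"err": np.zeros(4), "m": np.zeros(4),
         "v": np.zeros(4), "v_hat": np.full(4, 1e-8)}
updates = [np.array([1.0, 0.1, -0.5, 0.01]),
           np.array([0.8, 0.2, -0.7, 0.02])]
w = adaptive_compressed_round(w, updates, state)
```

With k=2, only the two largest-magnitude coordinates of the averaged update are transmitted and applied each round; the dropped mass accumulates in `state["err"]` so it is not lost, which is what lets compressed methods retain the convergence rate of their uncompressed counterparts.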


