FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization

by Amirhossein Reisizadeh, et al.

Federated learning is a new distributed machine learning approach in which a model is trained over a set of devices, such as mobile phones, while keeping the data localized. Federated learning faces several systems challenges, including (i) a communication bottleneck, since a large number of devices upload their local updates to a parameter server, and (ii) scalability, as the federated network consists of millions of devices, only a fraction of which are active at any given time. As a result of these system challenges, alongside additional challenges such as the statistical heterogeneity of data and privacy concerns, designing a provably efficient federated learning method is of significant importance and difficulty. In this paper, we present FedPAQ, a communication-efficient Federated Learning method with Periodic Averaging and Quantization. Our method rests on three key features: (1) quantized message-passing, where the edge nodes quantize their updates before uploading them to the parameter server; (2) periodic averaging, where models are updated locally at the devices and only periodically averaged at the server; and (3) partial device participation, where only a fraction of the devices participate in each round of training. These features address the communication and scalability challenges in federated learning. FedPAQ is provably near-optimal in the following sense: under the problem setup of expected risk minimization with independent and identically distributed data points, when the loss function is strongly convex the proposed method converges to the optimal solution at a near-optimal rate, and when the loss function is non-convex it finds a first-order stationary point at a near-optimal rate.
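To make the three ingredients concrete, below is a minimal, self-contained sketch of FedPAQ-style training rounds in Python. It is an illustration under stated assumptions, not the authors' implementation: the QSGD-style stochastic quantizer is just one admissible choice (the method only requires an unbiased random quantizer), the least-squares local losses are a toy stand-in for the devices' objectives, and the names quantize, local_update, tau (number of local steps), and r (devices sampled per round) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, s=256):
    """Unbiased stochastic quantizer (QSGD-style; an assumed choice).
    Each coordinate of v is randomly rounded to one of s levels so that
    the expectation of the output equals v."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v.copy()
    level = np.abs(v) / norm * s                  # position in [0, s]
    lower = np.floor(level)
    round_up = rng.random(v.shape) < (level - lower)
    return norm * np.sign(v) * (lower + round_up) / s

def local_update(x, A, b, tau, lr):
    """tau local SGD steps on one device's least-squares loss
    ||A x - b||^2 (a toy stand-in for the local objective);
    returns the model delta the device uploads."""
    y = x.copy()
    for _ in range(tau):
        i = rng.integers(len(b))                  # sample one data point
        y -= lr * 2.0 * (A[i] @ y - b[i]) * A[i]
    return y - x

# Toy federated setup: n devices, each holding its own local dataset.
d, n, r, tau, lr = 10, 50, 5, 20, 0.05
devices = [(rng.normal(size=(30, d)), rng.normal(size=30)) for _ in range(n)]
x = np.zeros(d)

for rnd in range(100):
    # Partial participation: the server samples r of the n devices.
    active = rng.choice(n, size=r, replace=False)
    # Each active device runs tau local steps, quantizes its delta, uploads.
    deltas = [quantize(local_update(x, *devices[k], tau, lr)) for k in active]
    # Periodic averaging: the server averages deltas once per round.
    x += np.mean(deltas, axis=0)
```

The communication savings in this sketch come from two places: each device uploads only once per tau local iterations (periodic averaging), and the uploaded delta is quantized before transmission; sampling r out of n devices per round is what keeps aggregation scalable when n is very large.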





CPFed: Communication-Efficient and Privacy-Preserving Federated Learning

Federated learning is a machine learning setting where a set of edge dev...

Ternary Compression for Communication-Efficient Federated Learning

Learning over massive data stored in different locations is essential in...

Distributionally Robust Federated Averaging

In this paper, we study communication efficient distributed algorithms f...

Optimizing the Communication-Accuracy Trade-off in Federated Learning with Rate-Distortion Theory

A significant bottleneck in federated learning is the network communicat...

Robust Federated Learning with Noisy Communication

Federated learning is a communication-efficient training process that al...

Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices

Training deep neural networks on large datasets can often be accelerated...

Towards Efficient Scheduling of Federated Mobile Devices under Computational and Statistical Heterogeneity

Originated from distributed learning, federated learning enables privacy...