FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization

09/28/2019
by Amirhossein Reisizadeh, et al.

Federated learning is a new distributed machine learning approach, where a model is trained over a set of devices, such as mobile phones, while keeping data localized. Federated learning faces several systems challenges, including (i) a communication bottleneck, due to the large number of devices uploading their local updates to a parameter server, and (ii) scalability, as the federated network consists of millions of devices, only some of which are active at any given time. As a result of these system challenges, alongside additional challenges such as statistical heterogeneity of data and privacy concerns, designing a provably efficient federated learning method is of significant importance and difficulty. In this paper, we present FedPAQ, a communication-efficient Federated Learning method with Periodic Averaging and Quantization. Our method builds on three key features: (1) quantized message-passing, where the edge nodes quantize their updates before uploading them to the parameter server; (2) periodic averaging, where models are updated locally at the devices and only periodically averaged at the server; and (3) partial device participation, where only a fraction of the devices participate in each round of training. These features address the communication and scalability challenges in federated learning. FedPAQ is provably near-optimal in the following sense: under the problem setup of expected risk minimization with independent and identically distributed data points, when the loss function is strongly convex the proposed method converges to the optimal solution at a near-optimal rate, and when the loss function is non-convex it finds a first-order stationary point at a near-optimal rate.
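The three features described above can be sketched in a short Python example. This is a minimal illustration, not the paper's exact specification: the stochastic uniform quantizer is one common unbiased choice (the method allows any unbiased quantizer), and the least-squares local loss, step counts, learning rate, and function names are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, levels=4):
    # Stochastic uniform quantizer (one common unbiased choice; any unbiased
    # scheme works). Encodes each coordinate with `levels` low-precision
    # values plus the vector norm and signs; rounding up with probability
    # equal to the fractional part makes the encoding unbiased.
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    scaled = np.abs(v) / norm * levels
    lower = np.floor(scaled)
    q = lower + (rng.random(v.shape) < (scaled - lower))
    return norm * np.sign(v) * q / levels

def local_sgd(w0, X, y, steps, lr):
    # `steps` local SGD steps on a least-squares loss (illustrative model).
    w = w0.copy()
    for _ in range(steps):
        i = rng.integers(len(X))
        w = w - lr * (X[i] @ w - y[i]) * X[i]
    return w

def fedpaq_round(model, devices, frac=0.5, local_steps=5, lr=0.05):
    # One FedPAQ round: (3) sample a fraction of the devices, (2) let each
    # run several local updates before communicating, (1) quantize each
    # device's model difference, then average the quantized differences
    # at the server and apply them to the global model.
    chosen = rng.choice(len(devices), size=max(1, int(frac * len(devices))),
                        replace=False)
    updates = [quantize(local_sgd(model, *devices[k], local_steps, lr) - model)
               for k in chosen]
    return model + np.mean(updates, axis=0)
```

Because the quantizer is unbiased, the server's averaged update equals the true averaged model difference in expectation, which is what allows the convergence analysis to go through despite the lossy uplink.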


