Communication Efficient Federated Learning over Multiple Access Channels

01/23/2020
by Wei-Ting Chang, et al.

In this work, we study the problem of federated learning (FL), where distributed users aim to jointly train a machine learning model with the help of a parameter server (PS). In each iteration of FL, users compute local gradients, which are then quantized and transmitted for aggregation and model updates at the PS. One of the challenges of FL is the communication overhead caused by its iterative nature and large model sizes. A recent direction for alleviating this communication bottleneck is to let users communicate simultaneously over a multiple access channel (MAC), which can make better use of the communication resources. In this paper, we consider the problem of FL over a MAC. In particular, we focus on the design of digital gradient transmission schemes over a MAC, where gradients at each user are first quantized and then transmitted over the MAC to be decoded individually at the PS. When designing digital FL schemes over MACs, there are new opportunities to assign different amounts of resources (such as rate or bandwidth) to different users based on a) the informativeness of the gradients at each user, and b) the underlying channel conditions. We propose a stochastic gradient quantization scheme in which the quantization parameters are optimized based on the capacity region of the MAC. We show that such channel-aware quantization for FL outperforms uniform quantization, particularly when users experience different channel conditions and when their gradients have varying levels of informativeness.
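As a rough illustration of the kind of scheme described above, the sketch below implements an unbiased stochastic (randomized-rounding) gradient quantizer whose number of levels is chosen per user from an assumed per-dimension rate. The helper levels_from_rate, the example rate values, and the uniform quantization grid are illustrative assumptions for exposition only; the paper's actual scheme optimizes the quantization parameters jointly over the MAC capacity region.

```python
import numpy as np

def stochastic_quantize(grad, num_levels, rng):
    """Unbiased stochastic quantization of `grad` onto `num_levels`
    uniformly spaced levels spanning [-r, r], with r = max |grad|."""
    r = np.max(np.abs(grad))
    if r == 0.0:
        return np.zeros_like(grad)
    # Map the gradient onto the index range [0, num_levels - 1].
    scaled = (grad / r + 1.0) / 2.0 * (num_levels - 1)
    lower = np.floor(scaled)
    # Randomized rounding: round up with probability equal to the fractional
    # part, which makes the quantizer unbiased (E[quantized] = grad).
    q_index = lower + (rng.random(grad.shape) < (scaled - lower))
    # Map quantization indices back to the original gradient range.
    return (q_index / (num_levels - 1) * 2.0 - 1.0) * r

def levels_from_rate(bits_per_dimension):
    """Hypothetical mapping from a user's allotted rate (bits per gradient
    dimension) to a number of quantization levels."""
    return max(2, int(2 ** bits_per_dimension))

# Toy example: two users quantize the same local gradient, but the user with
# the better channel (higher rate) is assigned a finer quantizer.
rng = np.random.default_rng(0)
grad = rng.standard_normal(10)
for rate in (2.0, 6.0):  # assumed per-dimension rates, not taken from the paper
    quantized = stochastic_quantize(grad, levels_from_rate(rate), rng)
    print(f"rate = {rate} bits/dim, quantization error = "
          f"{np.linalg.norm(grad - quantized):.4f}")
```

In this toy setup, the user assigned the higher rate gets more quantization levels and hence a smaller quantization error, while the aggregation step at the PS is unchanged.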


