Communication-Efficient Federated Learning via Predictive Coding

08/02/2021
by Kai Yue, et al.

Federated learning can enable remote workers to collaboratively train a shared machine learning model while allowing training data to be kept locally. In the use case of wireless mobile devices, the communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has utilized various data compression tools such as quantization and sparsification to reduce the overhead. In this paper, we propose a predictive coding-based communication scheme for federated learning. The scheme shares prediction functions among all devices and allows each worker to transmit a compressed residual vector derived from the reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce the redundancy with entropy coding. Extensive simulations reveal that the communication cost can be reduced by up to 99% while achieving almost the same learning performance when compared with other baseline methods.

Related research

Federated Learning via Plurality Vote (10/06/2021)
Federated learning allows collaborative workers to solve a machine learn...

Toward Communication Efficient Adaptive Gradient Method (09/10/2021)
In recent years, distributed optimization is proven to be an effective a...

Optimizing the Communication-Accuracy Trade-off in Federated Learning with Rate-Distortion Theory (01/07/2022)
A significant bottleneck in federated learning is the network communicat...

ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training (10/11/2021)
Federated learning is a powerful distributed learning scheme that allows...

FedSynth: Gradient Compression via Synthetic Data in Federated Learning (04/04/2022)
Model compression is important in federated learning (FL) with large mod...

RPN: A Residual Pooling Network for Efficient Federated Learning (01/23/2020)
Federated learning is a new machine learning framework which enables dif...

OCTOPUS: Overcoming Performance and Privatization Bottlenecks in Distributed Learning (05/03/2021)
The diversity and quantity of the data warehousing, gathering data from ...
