FedSynth: Gradient Compression via Synthetic Data in Federated Learning

04/04/2022
by Shengyuan Hu, et al.

Model compression is important for reducing communication cost in federated learning (FL) with large models. Prior work has focused on sparsification-based compression, which can drastically degrade the global model's accuracy. In this work, we propose a new scheme for upstream communication in which, instead of transmitting the model update, each client learns and transmits a lightweight synthetic dataset such that a model trained on it performs comparably to one trained on the client's real data. The server then recovers each local model update from the synthetic data and applies standard aggregation. We also provide a new algorithm, FedSynth, for learning the synthetic data locally. Empirically, we find that our method performs comparably to or better than random masking baselines on all three common federated learning benchmark datasets.
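The abstract does not spell out the exact objective FedSynth optimizes, so the following is only a minimal sketch of the upstream scheme it describes, assuming a linear classifier and a few differentiable inner SGD steps on the synthetic data. All names (replay_update, client_learn_synthetic, server_round), step counts, and learning rates are illustrative assumptions, not the paper's algorithm. It requires PyTorch.

```python
# Hypothetical sketch of the scheme described in the abstract: each client
# learns a tiny synthetic dataset whose induced update fits its real data,
# uploads only the synthetic data, and the server replays it to recover the
# local update before standard (FedAvg-style) aggregation.
import torch
import torch.nn.functional as F


def replay_update(w_global, syn_x, syn_y, inner_lr=0.1, inner_steps=5):
    """Run a few gradient steps on the synthetic data, starting from the
    global weights. Client and server share this routine."""
    w = w_global.detach().clone().requires_grad_(True)
    for _ in range(inner_steps):
        loss = F.cross_entropy(syn_x @ w, syn_y)
        # create_graph=True keeps the graph so the outer loss can
        # backpropagate through these steps into syn_x on the client.
        (grad,) = torch.autograd.grad(loss, w, create_graph=True)
        w = w - inner_lr * grad
    return w


def client_learn_synthetic(w_global, real_x, real_y, n_syn=10, n_classes=10,
                           outer_steps=200, outer_lr=0.05):
    """Client side: optimise a tiny synthetic set so that the model obtained
    by replaying it also fits the client's real data."""
    syn_x = torch.randn(n_syn, real_x.shape[1], requires_grad=True)
    syn_y = torch.randint(0, n_classes, (n_syn,))  # labels fixed for simplicity
    opt = torch.optim.Adam([syn_x], lr=outer_lr)
    for _ in range(outer_steps):
        opt.zero_grad()
        w_new = replay_update(w_global, syn_x, syn_y)
        # Outer objective: the replayed model should do well on the real data.
        F.cross_entropy(real_x @ w_new, real_y).backward()
        opt.step()
    return syn_x.detach(), syn_y


def server_round(w_global, client_payloads):
    """Server side: recover each client's update by replaying its synthetic
    data, then aggregate the updates."""
    updates = []
    for syn_x, syn_y in client_payloads:
        w_new = replay_update(w_global, syn_x, syn_y)
        updates.append((w_new - w_global).detach())
    return w_global + torch.stack(updates).mean(dim=0)


if __name__ == "__main__":
    torch.manual_seed(0)
    d, n_classes = 20, 10
    w_global = 0.01 * torch.randn(d, n_classes)
    # Two toy clients with random local data (illustration only).
    clients = [(torch.randn(100, d), torch.randint(0, n_classes, (100,)))
               for _ in range(2)]
    payloads = [client_learn_synthetic(w_global, x, y) for x, y in clients]
    w_global = server_round(w_global, payloads)
    print("updated global weights:", w_global.shape)
```

In this toy setting each client uploads only a handful of synthetic examples; the communication saving the abstract claims arises when the model is much larger than the synthetic set, so that transmitting a few examples is far cheaper than transmitting the full parameter update.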

Related Research

12/18/2018 · Expanding the Reach of Federated Learning by Reducing Client Resource Requirements
Communication on heterogeneous edge networks is a fundamental bottleneck...

08/11/2020 · Federated Learning via Synthetic Data
Federated learning allows for the training of a model using data on mult...

07/17/2020 · Learn distributed GAN with Temporary Discriminators
In this work, we propose a method for training distributed GAN with sequ...

04/26/2021 · Communication-Efficient Federated Learning with Dual-Side Low-Rank Compression
Federated learning (FL) is a promising and powerful approach for trainin...

09/17/2020 · Distilled One-Shot Federated Learning
Current federated learning algorithms take tens of communication rounds ...

11/18/2021 · A Novel Optimized Asynchronous Federated Learning Framework
Federated Learning (FL) since proposed has been applied in many fields, ...

08/02/2021 · Communication-Efficient Federated Learning via Predictive Coding
Federated learning can enable remote workers to collaboratively train a ...