FedSynth: Gradient Compression via Synthetic Data in Federated Learning

by Shengyuan Hu, et al.

Model compression is important in federated learning (FL) with large models to reduce communication cost. Prior work has focused on sparsification-based compression, which can drastically degrade the global model accuracy. In this work, we propose a new scheme for upstream communication in which, instead of transmitting the model update, each client learns and transmits a lightweight synthetic dataset such that a model trained on it performs comparably to one trained on the real data. The server recovers the local model update from the synthetic data and applies standard aggregation. We then present a new algorithm, FedSynth, to learn the synthetic data locally. Empirically, we find our method is comparable to or better than random-masking baselines on all three common federated learning benchmark datasets.
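The abstract's pipeline (client learns a tiny synthetic dataset, server replays a training step on it to recover the update) can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's actual FedSynth objective: the model is linear with a mean-squared-error loss, only the synthetic labels are learned (the synthetic inputs stay fixed), and the "update" is a single gradient step. All function names and hyperparameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_grad(w, X, y):
    """Gradient of the mean-squared-error loss 0.5/n * ||Xw - y||^2."""
    return X.T @ (X @ w - y) / len(y)

def learn_synthetic(w_global, X_real, y_real, n_syn=4,
                    lr_model=0.1, lr_syn=50.0, steps=4000):
    """Client side: learn labels for a tiny synthetic dataset so that one
    model step on it reproduces the client's real-data update.
    (A sketch of the idea only; FedSynth's objective and optimizer differ.)"""
    d = X_real.shape[1]
    # The update the client would normally upload.
    target = w_global - lr_model * mse_grad(w_global, X_real, y_real)
    X_syn = rng.normal(size=(n_syn, d))   # fixed random synthetic inputs
    y_syn = rng.normal(size=n_syn)        # learnable synthetic labels
    for _ in range(steps):
        recovered = w_global - lr_model * mse_grad(w_global, X_syn, y_syn)
        err = recovered - target
        # Analytic gradient of 0.5*||recovered - target||^2 w.r.t. y_syn
        y_syn -= lr_syn * (lr_model / n_syn) * (X_syn @ err)
    return X_syn, y_syn

def server_recover(w_global, X_syn, y_syn, lr_model=0.1):
    """Server side: replay the same deterministic step on the synthetic
    data to recover the client's local update, then aggregate as usual."""
    return w_global - lr_model * mse_grad(w_global, X_syn, y_syn)
```

The communication saving in this sketch comes from `n_syn * (d + 1)` synthetic values replacing the `d`-dimensional model update; the trade-off only pays off when the synthetic dataset is much smaller than the model, which is the large-model regime the abstract targets.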




Expanding the Reach of Federated Learning by Reducing Client Resource Requirements

Communication on heterogeneous edge networks is a fundamental bottleneck...

Federated Learning via Synthetic Data

Federated learning allows for the training of a model using data on mult...

Learn distributed GAN with Temporary Discriminators

In this work, we propose a method for training distributed GAN with sequ...

Communication-Efficient Federated Learning with Dual-Side Low-Rank Compression

Federated learning (FL) is a promising and powerful approach for trainin...

Distilled One-Shot Federated Learning

Current federated learning algorithms take tens of communication rounds ...

A Novel Optimized Asynchronous Federated Learning Framework

Federated Learning (FL) since proposed has been applied in many fields, ...

Communication-Efficient Federated Learning via Predictive Coding

Federated learning can enable remote workers to collaboratively train a ...