FedSynth: Gradient Compression via Synthetic Data in Federated Learning

04/04/2022
by   Shengyuan Hu, et al.

Model compression is important in federated learning (FL) with large models to reduce communication cost. Prior works have focused on sparsification-based compression, which can severely degrade global model accuracy. In this work, we propose a new scheme for upstream communication in which, instead of transmitting the model update, each client learns and transmits a lightweight synthetic dataset such that a model trained on it performs similarly well as one trained on the real data. The server recovers the local model update from the synthetic data and applies standard aggregation. We then provide a new algorithm, FedSynth, to learn the synthetic data locally. Empirically, we find our method performs comparably to or better than random-masking baselines on all three common federated learning benchmark datasets.
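The core idea, transmitting a small synthetic dataset from which the server can recover the client's update, can be illustrated with a minimal sketch. This is not the paper's FedSynth algorithm (which learns the synthetic data by local optimization for general models); it is a toy linear-regression version where, for fixed random synthetic inputs, the synthetic labels are solved in closed form so that the gradient on the synthetic data matches the gradient on the real data. All variable names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, m = 5, 200, 8          # model dim, real samples, synthetic samples (m << n)

# Real client data and the current global model weights (toy setup).
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
w = rng.normal(size=d)

def grad(Xd, yd, w):
    """Gradient of the mean-squared-error loss of a linear model."""
    return Xd.T @ (Xd @ w - yd) / len(yd)

g_real = grad(X, y, w)       # the update the client would normally transmit

# Client side: fix random synthetic inputs X_s, then solve for synthetic
# labels y_s so that the synthetic-data gradient equals g_real:
#   X_s.T @ (X_s @ w - y_s) / m == g_real
#   =>  (X_s.T / m) @ y_s == X_s.T @ (X_s @ w) / m - g_real
X_s = rng.normal(size=(m, d))
y_s = np.linalg.lstsq(X_s.T / m, X_s.T @ (X_s @ w) / m - g_real, rcond=None)[0]

# Server side: recovers the update from the transmitted (X_s, y_s) pair
# by simply recomputing the gradient on the synthetic data.
g_recovered = grad(X_s, y_s, w)

print(np.max(np.abs(g_recovered - g_real)))  # reconstruction error is tiny
```

With m = 8 synthetic samples instead of n = 200 real ones, the client transmits far fewer numbers than the raw data would require while the server still recovers the exact gradient; for nonlinear models no closed form exists, which is why the paper learns the synthetic data by optimization instead.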


Related research

12/18/2018 · Expanding the Reach of Federated Learning by Reducing Client Resource Requirements
Communication on heterogeneous edge networks is a fundamental bottleneck...

08/11/2020 · Federated Learning via Synthetic Data
Federated learning allows for the training of a model using data on mult...

07/17/2020 · Learn distributed GAN with Temporary Discriminators
In this work, we propose a method for training distributed GAN with sequ...

06/11/2022 · Federated Learning with GAN-based Data Synthesis for Non-IID Clients
Federated learning (FL) has recently emerged as a popular privacy-preser...

09/17/2020 · Distilled One-Shot Federated Learning
Current federated learning algorithms take tens of communication rounds ...

08/02/2021 · Communication-Efficient Federated Learning via Predictive Coding
Federated learning can enable remote workers to collaboratively train a ...

09/10/2019 · Gradient Descent with Compressed Iterates
We propose and analyze a new type of stochastic first order method: grad...
