Distilled One-Shot Federated Learning

09/17/2020
by Yanlin Zhou, et al.

Current federated learning algorithms take tens of communication rounds transmitting unwieldy model weights under ideal circumstances, and hundreds when data is poorly distributed. Inspired by recent work on dataset distillation and distributed one-shot learning, we propose Distilled One-Shot Federated Learning (DOSFL), which reduces the number of communication rounds required to train a performant model to only one. Each client distills its private dataset and sends the synthetic data (e.g., images or sentences) to the server. The distilled data look like noise and become useless after model fitting. We empirically show that, in only one round of communication, DOSFL achieves up to 96% test accuracy and nearly matches the centralized baseline on all three evaluated tasks, including federated IMDB with a customized CNN (centralized baseline: 86%) and federated TREC-6 with a Bi-LSTM (centralized baseline: 89%). By avoiding model-wise updates (i.e., weights, gradients, losses), the total communication cost of DOSFL is reduced by over an order of magnitude. We believe that DOSFL represents a new direction, orthogonal to previous work, towards weight-less and gradient-less federated learning.
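To make the protocol concrete, here is a deliberately simplified sketch of the idea, not the paper's implementation: each client learns a handful of synthetic points such that a single gradient step from a shared initialization on those points minimizes the loss on the client's real data (the dataset distillation idea), then sends only the synthetic points; the server takes one gradient step on the pooled synthetic data. It assumes a linear model with squared loss so the gradients through the one-step training can be written analytically; all names (`make_client`, `distill`, the constants) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

D, N, K = 3, 200, 5     # feature dim, real points per client, synthetic points
LR = 1.0                # server learning rate (the single gradient step)
ETA, STEPS = 0.1, 2000  # distillation step size / iterations

w_true = np.array([1.0, -2.0, 0.5])

def make_client():
    """Toy private dataset: linear targets plus small noise."""
    X = rng.normal(size=(N, D))
    y = X @ w_true + 0.01 * rng.normal(size=N)
    return X, y

def grad(X, y, w):
    """Mean squared-error gradient of the linear model at w."""
    return X.T @ (X @ w - y) / len(y)

def distill(X, y):
    """Learn K synthetic points so that ONE gradient step from w0 = 0
    on the synthetic data minimizes the loss on the real data."""
    Xs = rng.normal(size=(K, D))
    ys = rng.normal(size=K)
    for _ in range(STEPS):
        w1 = -LR * grad(Xs, ys, np.zeros(D))  # model after one synthetic step
        r = grad(X, y, w1)                    # real-data gradient at that model
        # Analytic gradients of the real loss w.r.t. the synthetic data
        ys -= ETA * (LR / K) * (Xs @ r)
        Xs -= ETA * (LR / K) * np.outer(ys, r)
    return Xs, ys

# Each client distills locally; only synthetic data crosses the network.
clients = [make_client() for _ in range(2)]
synthetic = [distill(X, y) for X, y in clients]

# Server, one round: pool synthetic data, take one gradient step from w0 = 0.
Xs_all = np.vstack([Xs for Xs, _ in synthetic])
ys_all = np.concatenate([ys for _, ys in synthetic])
w_server = -LR * grad(Xs_all, ys_all, np.zeros(D))
print(w_server)  # close to w_true
```

Note the communication pattern this is meant to illustrate: no weights or gradients are exchanged, only K synthetic points per client, and the synthetic points are tuned to the shared initialization, so they are of little use once the model has moved on.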


Related research

06/26/2023: Federated Learning on Non-iid Data via Local and Global Distillation
02/28/2019: One-Shot Federated Learning
08/30/2021: FedKD: Communication Efficient Federated Learning via Knowledge Distillation
04/04/2022: FedSynth: Gradient Compression via Synthetic Data in Federated Learning
02/01/2022: Recycling Model Updates in Federated Learning: Are Gradient Subspaces Low-Rank?
09/03/2022: Suppressing Noise from Built Environment Datasets to Reduce Communication Rounds for Convergence of Federated Learning
02/13/2023: One-Shot Federated Conformal Prediction
