One-shot Federated Learning without Server-side Training

04/26/2022
by Shangchao Su et al.

Federated Learning (FL) has recently made significant progress as a new machine learning paradigm for privacy protection. Because traditional FL incurs high communication cost, one-shot federated learning is gaining popularity as a way to reduce communication between clients and the server. Most existing one-shot FL methods are based on knowledge distillation; however, distillation-based approaches require an extra training phase and depend on publicly available datasets. In this work, we consider a novel and challenging setting: performing a single round of parameter aggregation over the local models, without any server-side training on a public dataset. For this setting, we propose an effective algorithm for Model Aggregation via Exploring Common Harmonized Optima (MA-Echo), which iteratively updates the parameters of all local models to bring them close to a common low-loss region of the loss surface without harming their performance on their own datasets. Compared with existing methods, MA-Echo works well even under extremely non-identical data distributions, where the support categories of each local model share no labels with those of the others. We conduct extensive experiments on two popular image classification datasets, comparing the proposed method with existing approaches; the results demonstrate the effectiveness of MA-Echo, which clearly outperforms the state of the art.
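As a point of reference for the setting described above, the sketch below shows what a single round of server-side parameter aggregation looks like when the server performs no further training: a FedAvg-style, data-size-weighted average of the clients' weights. This is only a baseline illustration of the one-shot setting, not the MA-Echo update itself (the abstract does not spell out its iterative harmonization step); the function name and inputs (one_shot_aggregate, client_states, client_sizes) are hypothetical, and PyTorch is assumed.

# Minimal sketch of one-shot aggregation: each client trains locally, then the
# server combines the parameters in a single round with no server-side training.
# This weighted average is a FedAvg-style baseline, not the MA-Echo rule.
from typing import Dict, List

import torch


def one_shot_aggregate(client_states: List[Dict[str, torch.Tensor]],
                       client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    # Data-size-weighted average over identically structured client parameters.
    total = float(sum(client_sizes))
    weights = [n / total for n in client_sizes]
    aggregated: Dict[str, torch.Tensor] = {}
    for name in client_states[0]:
        aggregated[name] = torch.stack(
            [w * state[name].float() for w, state in zip(weights, client_states)]
        ).sum(dim=0)
    return aggregated


# Usage (hypothetical names): aggregate locally trained classifiers in one round.
# global_state = one_shot_aggregate([m.state_dict() for m in local_models],
#                                   [len(d) for d in local_datasets])
# global_model.load_state_dict(global_state)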

Related research

02/24/2023  FedPDC: Federated Learning for Public Dataset Correction
  As people pay more and more attention to privacy protection, Federated L...

09/10/2022  Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation
  Federated Learning (FL) is a machine learning paradigm where local nodes...

04/14/2022  Exploring the Distributed Knowledge Congruence in Proxy-data-free Federated Distillation
  Federated learning (FL) is a distributed machine learning paradigm in wh...

07/23/2023  ProtoFL: Unsupervised Federated Learning via Prototypical Distillation
  Federated learning (FL) is a promising approach for enhancing data priva...

05/21/2023  One-Shot Federated Learning for LEO Constellations that Reduces Convergence Time from Days to 90 Minutes
  A Low Earth orbit (LEO) satellite constellation consists of a large numb...

12/01/2020  Communication-Efficient Federated Distillation
  Communication constraints are one of the major challenges preventing the...

02/03/2021  A Bayesian Federated Learning Framework with Multivariate Gaussian Product
  Federated learning (FL) allows multiple clients to collaboratively learn...
