FedDistill: Making Bayesian Model Ensemble Applicable to Federated Learning

09/04/2020
by Hong-You Chen, et al.

Federated learning aims to leverage users' own data and computational resources to learn a strong global model without directly accessing their data; the server sees only their locally trained models. It usually requires multiple rounds of communication, in which aggregating local models into a global model plays an important role. In this paper, we propose a novel aggregation scenario and algorithm named FedDistill, which enjoys the robustness of a Bayesian model ensemble in aggregating users' predictions and employs knowledge distillation to summarize the ensemble predictions into a global model, with the help of unlabeled data collected at the server. Our empirical studies validate FedDistill's superior performance, especially when users' data are not i.i.d. and when deeper neural networks are used. Moreover, FedDistill is compatible with recent efforts in regularizing users' model training, making it an easily applicable module: you only need to replace the aggregation step and can leave the other parts of your federated learning algorithm intact.
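The two-stage aggregation the abstract describes (Bayesian model ensemble over users' models, then distillation into one global model on server-side unlabeled data) can be sketched on a toy problem. Everything below is a simplifying assumption rather than the paper's actual procedure: linear softmax classifiers stand in for neural networks, extra ensemble members are sampled around the mean and standard deviation of the users' weights, and distillation is plain gradient descent on cross-entropy against the ensemble's soft predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: three users each hold a linear softmax classifier
# (4 features -> 3 classes), and the server has a small unlabeled set.
local_weights = [rng.normal(size=(4, 3)) for _ in range(3)]
X_unlabeled = rng.normal(size=(50, 4))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_kl(q, p):
    # Average KL divergence KL(q || p) over the unlabeled samples.
    return np.mean(np.sum(q * (np.log(q) - np.log(p)), axis=1))

# Step 1 -- Bayesian model ensemble (simplified): sample additional models
# around the element-wise mean/std of the users' weights, then average the
# predictions of the local and sampled models on the unlabeled data.
mean_w = np.mean(local_weights, axis=0)
std_w = np.std(local_weights, axis=0)
sampled = [rng.normal(mean_w, std_w) for _ in range(10)]
ensemble_probs = np.mean(
    [softmax(X_unlabeled @ w) for w in local_weights + sampled], axis=0
)

# Step 2 -- knowledge distillation: fit a single global model to the
# ensemble's soft predictions by gradient descent on cross-entropy
# (a convex problem for a linear softmax model).
global_w = mean_w.copy()
kl_before = mean_kl(ensemble_probs, softmax(X_unlabeled @ global_w))
for _ in range(500):
    probs = softmax(X_unlabeled @ global_w)
    grad = X_unlabeled.T @ (probs - ensemble_probs) / len(X_unlabeled)
    global_w -= 0.5 * grad
kl_after = mean_kl(ensemble_probs, softmax(X_unlabeled @ global_w))
```

After distillation, `global_w` is a single model that approximates the ensemble's predictions (`kl_after` is much smaller than `kl_before`), which is the point of the aggregation step: the server broadcasts one model, not an ensemble.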

05/02/2022

FedDKD: Federated Learning with Decentralized Knowledge Distillation

The performance of federated learning in neural networks is generally in...
05/20/2021

Data-Free Knowledge Distillation for Heterogeneous Federated Learning

Federated Learning (FL) is a decentralized machine-learning paradigm, in...
05/03/2020

Multi-Center Federated Learning

Federated learning has received great attention for its capability to tr...
07/19/2022

FedX: Unsupervised Federated Learning with Cross Knowledge Distillation

This paper presents FedX, an unsupervised federated learning framework. ...
07/14/2022

Multi-Level Branched Regularization for Federated Learning

A critical challenge of federated learning is data heterogeneity and imb...
07/21/2021

Fed-ensemble: Improving Generalization through Model Ensembling in Federated Learning

In this paper we propose Fed-ensemble: a simple approach that brings mode...
10/28/2021

Towards Model Agnostic Federated Learning Using Knowledge Distillation

An often unquestioned assumption underlying most current federated learn...