
FedDistill: Making Bayesian Model Ensemble Applicable to Federated Learning

by Hong-You Chen, et al.

Federated learning aims to leverage users' own data and computational resources to learn a strong global model, without directly accessing their data but only their local models. It usually requires multiple rounds of communication, in which aggregating local models into a global model plays an important role. In this paper, we propose a novel aggregation scenario and algorithm named FedDistill, which enjoys the robustness of Bayesian model ensembles in aggregating users' predictions and employs knowledge distillation to summarize the ensemble predictions into a global model, with the help of unlabeled data collected at the server. Our empirical studies validate FedDistill's superior performance, especially when users' data are not i.i.d. and the neural networks grow deeper. Moreover, FedDistill is compatible with recent efforts to regularize users' model training, making it an easily applicable module: you only need to replace the aggregation method, leaving the other parts of your federated learning algorithm intact.
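The two-step aggregation described above (ensemble users' predictions on unlabeled server data, then distill the ensemble into a single global model) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear classifiers, the elementwise Gaussian fit over client weights, and all names (`client_weights`, `X_server`, `W_global`, the learning rate and step counts) are hypothetical stand-ins chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical setup: 3 clients each hold a linear classifier W (d -> k
# logits), and the server holds a pool of unlabeled examples.
d, k, n_unlabeled = 5, 3, 100
client_weights = [rng.normal(size=(d, k)) for _ in range(3)]
X_server = rng.normal(size=(n_unlabeled, d))  # unlabeled data at the server

# Step 1 (Bayesian model ensemble, sketched): fit an elementwise Gaussian
# over client weights, draw extra models from it, and average the softmax
# predictions of all models on the unlabeled data.
mean = np.mean(client_weights, axis=0)
std = np.std(client_weights, axis=0)
sampled = [rng.normal(mean, std) for _ in range(10)]
ensemble = client_weights + [mean] + sampled
soft_labels = np.mean([softmax(X_server @ W) for W in ensemble], axis=0)

# Step 2 (knowledge distillation, sketched): train a global model to match
# the ensemble's soft labels by gradient descent on the cross-entropy
# between its softmax outputs and the soft labels.
W_global = mean.copy()
for _ in range(200):
    probs = softmax(X_server @ W_global)
    grad = X_server.T @ (probs - soft_labels) / n_unlabeled
    W_global -= 0.5 * grad

# Fraction of unlabeled points on which the distilled global model agrees
# with the ensemble's top-1 prediction.
agreement = np.mean(
    softmax(X_server @ W_global).argmax(1) == soft_labels.argmax(1)
)
```

Only `W_global` needs to be sent back to clients for the next communication round, which is what makes the aggregation a drop-in replacement: local training is untouched, and only the server-side combination step changes.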




Related papers:

- Federated Learning on Non-iid Data via Local and Global Distillation
- Multi-Center Federated Learning
- Multi-Level Branched Regularization for Federated Learning
- Fed-ensemble: Improving Generalization through Model Ensembling in Federated Learning
- Analysis and Optimal Edge Assignment For Hierarchical Federated Learning on Non-IID Data
- A Bayesian Federated Learning Framework with Multivariate Gaussian Product
- Towards Model Agnostic Federated Learning Using Knowledge Distillation