Ensemble Distillation for Robust Model Fusion in Federated Learning

06/12/2020
by Tao Lin, et al.

Federated Learning (FL) is a machine learning setting in which many devices collaboratively train a model while keeping the training data decentralized. In most current training schemes, the central model is refined by averaging the parameters of the server model with the updated parameters from the clients. However, directly averaging model parameters is only possible if all models share the same structure and size, which can be a restrictive constraint in many scenarios. In this work we investigate more powerful and more flexible aggregation schemes for FL. Specifically, we propose ensemble distillation for model fusion, i.e., training the central classifier on unlabeled data using the outputs of the client models. This knowledge distillation technique mitigates privacy risk and cost to the same extent as the baseline FL algorithms, but allows flexible aggregation over heterogeneous client models that can differ, e.g., in size, numerical precision, or structure. We show in extensive empirical experiments on various CV/NLP datasets (CIFAR-10/100, ImageNet, AG News, SST2) and settings (heterogeneous models/data) that the server model can be trained much faster, requiring fewer communication rounds than any existing FL technique.
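The fusion step described above can be summarized as follows: the server lets each received client model score a batch of unlabeled data, averages the per-model logits into an ensemble teacher, and trains the central (student) model to match the teacher's output distribution. Below is a minimal PyTorch sketch of this idea, not the authors' exact implementation; the names server_model, client_models, and unlabeled_loader, as well as the hyperparameters, are illustrative placeholders.

```python
import torch
import torch.nn.functional as F

def fuse_by_distillation(server_model, client_models, unlabeled_loader,
                         steps=500, lr=1e-3, temperature=1.0):
    """Refine the server model by distilling the averaged client logits
    on unlabeled data (a sketch of ensemble-distillation fusion)."""
    optimizer = torch.optim.Adam(server_model.parameters(), lr=lr)
    for m in client_models:  # client models act as frozen teachers
        m.eval()
    server_model.train()
    step = 0
    while step < steps:
        for x in unlabeled_loader:  # unlabeled inputs only, no labels needed
            with torch.no_grad():
                # Ensemble teacher: average the logits of all client models.
                # Only outputs are combined, so the clients may differ in
                # architecture, size, or numerical precision.
                teacher_logits = torch.stack([m(x) for m in client_models]).mean(0)
                teacher_probs = F.softmax(teacher_logits / temperature, dim=1)
            student_log_probs = F.log_softmax(server_model(x) / temperature, dim=1)
            # KL divergence between teacher and student distributions,
            # rescaled by T^2 as is standard in distillation.
            loss = F.kl_div(student_log_probs, teacher_probs,
                            reduction="batchmean") * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            step += 1
            if step >= steps:
                break
    return server_model
```

Because fusion only consumes the client models' outputs on the distillation data, rather than averaging their parameters, this step is what enables the flexible aggregation over heterogeneous client models described in the abstract.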


