Towards Model Agnostic Federated Learning Using Knowledge Distillation

10/28/2021
by Andrei Afonin, et al.

An often unquestioned assumption underlying most current federated learning algorithms is that all participants use identical model architectures. In this work, we initiate a theoretical study of model agnostic communication protocols that would allow data holders (agents) using different models to collaborate with each other and perform federated learning. We focus on the setting where two agents attempt to perform kernel regression using different kernels (and hence have different models). Our study yields a surprising result: the most natural algorithm, alternating knowledge distillation (AKD), imposes overly strong regularization and may lead to severe under-fitting. Our theory also shows an interesting connection between AKD and the alternating projection algorithm for finding the intersection of sets. Leveraging this connection, we propose new algorithms that improve upon AKD. Our theoretical predictions also closely match real-world experiments using neural networks. Thus, our work proposes a rich yet tractable framework for analyzing and developing new practical model agnostic federated learning algorithms.
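To make the over-regularization claim concrete, here is a minimal sketch of AKD between two kernel ridge regressors. In kernel ridge regression, predictions on the training points are a linear smoothing of the targets, ŷ = K(K + λI)⁻¹y, so one AKD round composes the two agents' smoother matrices. Since each smoother has spectral norm below one for λ > 0, repeated rounds contract the predictions toward zero, which is the under-fitting effect described above; in the unregularized limit where the smoothers become orthogonal projections, the composition instead converges to the projection onto the intersection of the kernels' ranges (von Neumann's alternating projection theorem), matching the connection the abstract mentions. The RBF kernels, bandwidths, and regularization strength below are illustrative assumptions, not the paper's exact experimental configuration.

```python
# Sketch of alternating knowledge distillation (AKD) between two kernel
# ridge regressors. All hyperparameters here are illustrative assumptions.
import numpy as np

def kernel_matrix(X, gamma):
    # RBF kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def smoother(K, lam):
    # Hat matrix of kernel ridge regression: predictions = H @ targets
    n = K.shape[0]
    return K @ np.linalg.inv(K + lam * np.eye(n))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)

H_a = smoother(kernel_matrix(X, gamma=5.0), lam=1e-2)  # agent A's model
H_b = smoother(kernel_matrix(X, gamma=0.5), lam=1e-2)  # agent B's model

# One AKD round: A fits the current targets, then B distills from A's
# predictions. Each smoother shrinks the targets (spectral norm < 1), so
# the composed map contracts the predictions round after round.
targets = y.copy()
for round_ in range(10):
    targets = H_b @ (H_a @ targets)
    print(f"round {round_ + 1}: ||predictions|| = {np.linalg.norm(targets):.4f}")
```

Running this prints a monotonically shrinking prediction norm, illustrating how naive AKD over-regularizes rather than converging to a useful shared model.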


Related research

05/02/2022 | FedDKD: Federated Learning with Decentralized Knowledge Distillation
The performance of federated learning in neural networks is generally in...

07/19/2022 | FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
This paper presents FedX, an unsupervised federated learning framework. ...

10/08/2019 | FedMD: Heterogenous Federated Learning via Model Distillation
Federated learning enables the creation of a powerful centralized model ...

02/25/2021 | Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning
Federated learning is a new learning paradigm that decouples data collec...

08/16/2022 | Knowledge-Injected Federated Learning
Federated learning is an emerging technique for training models from dec...

08/07/2023 | Adapter-based Selective Knowledge Distillation for Federated Multi-domain Meeting Summarization
Meeting summarization has emerged as a promising technique for providing...

09/04/2020 | FedDistill: Making Bayesian Model Ensemble Applicable to Federated Learning
Federated learning aims to leverage users' own data and computational re...
