Adaptive Distillation for Decentralized Learning from Heterogeneous Clients

08/18/2020, by Jiaxin Ma, et al.

This paper addresses the problem of decentralized learning: building a high-performance global model by asking a group of clients to share local models pre-trained on their own data. We are particularly interested in the case where both the client model architectures and the data distributions are diverse, which makes it nontrivial to adopt conventional approaches such as Federated Learning or network co-distillation. To this end, we propose a new decentralized learning method called Decentralized Learning via Adaptive Distillation (DLAD). Given a collection of client models and a large number of unlabeled distillation samples, DLAD 1) aggregates the outputs of the client models while adaptively emphasizing those with higher confidence on the given distillation samples, and 2) trains the global model to imitate the aggregated outputs. Our extensive experimental evaluation on multiple public datasets (MNIST, CIFAR-10, and CINIC-10) demonstrates the effectiveness of the proposed method.
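The confidence-weighted aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's actual mechanism: the function name `aggregate_adaptive` and the use of the maximum softmax probability as each client's per-sample confidence are assumptions made for the sketch (the paper learns its adaptive weighting rather than using a fixed heuristic).

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def aggregate_adaptive(client_logits):
    """Confidence-weighted aggregation of client predictions.

    client_logits: list of arrays, one per client, each of shape
    (batch, num_classes). Client architectures may differ; only the
    output space must match. Returns (batch, num_classes) soft targets
    that a global model could then be trained to imitate.
    """
    # (clients, batch, classes): per-client predictive distributions.
    probs = np.stack([softmax(l) for l in client_logits])
    # Max softmax probability as a per-sample confidence proxy (assumption).
    conf = probs.max(axis=-1)                      # (clients, batch)
    # Normalize confidences per sample so weights sum to 1 across clients.
    w = conf / conf.sum(axis=0, keepdims=True)     # (clients, batch)
    # Weighted mixture of client distributions, per sample.
    return (w[..., None] * probs).sum(axis=0)
```

For example, a client that is highly confident on a sample (peaked softmax) dominates the mixture for that sample, while a near-uniform client contributes little; the resulting soft targets then serve as distillation labels for the global model.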


Related research

- Personalized Decentralized Federated Learning with Knowledge Distillation (02/23/2023): Personalization in federated learning (FL) functions as a coordinator fo...
- Decentralized Learning with Multi-Headed Distillation (11/28/2022): Decentralized learning with private data is a central problem in machine...
- Decentralized adaptive clustering of deep nets is beneficial for client collaboration (06/17/2022): We study the problem of training personalized deep learning models in a ...
- Heterogeneous Decentralized Machine Unlearning with Seed Model Distillation (08/25/2023): As some recent information security legislation endowed users with uncon...
- Cross-domain Federated Object Detection (06/30/2022): Detection models trained by one party (server) may face severe performan...
- Federated Two Stage Decoupling With Adaptive Personalization Layers (08/30/2023): Federated learning has gained significant attention due to its groundbre...
- FewFedWeight: Few-shot Federated Learning Framework across Multiple NLP Tasks (12/16/2022): Massively multi-task learning with large language models has recently ma...
