Personalized Decentralized Federated Learning with Knowledge Distillation

02/23/2023
by Eunjeong Jeong, et al.

Personalization in federated learning (FL) serves to coordinate clients whose data or behavior varies widely. Whether these clients' models converge depends on how closely each user collaborates with others who share similar patterns or preferences. Quantifying similarity is difficult, however, in a decentralized network where each user is given only limited knowledge of other users' models. To address this issue, we propose a personalized, fully decentralized FL algorithm that leverages knowledge distillation to let each device discern the statistical distances between local models. As in knowledge distillation, each client device estimates the similarity between two models by comparing their intermediate outputs on the same local samples, so it can enhance its performance without sharing any local data. Our empirical studies demonstrate that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data distributions, and that it benefits agents with small datasets, even without a central server.
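To make the mechanism concrete, here is a minimal sketch of how a client might score a neighbor's model via distillation-style output comparison and weight its contribution to the local aggregation. It assumes PyTorch; the function names (`distillation_distance`, `aggregate`), the use of KL divergence as the statistical distance, and the exponential weighting are illustrative assumptions, since the abstract does not pin down these details.

```python
# Minimal sketch, not the authors' implementation: a client compares
# its own softened predictions with a neighbor's on the SAME local
# batch, then mixes parameters with weights that favor nearby models.
import torch
import torch.nn.functional as F

def distillation_distance(own_model, neighbor_model, local_batch, T=1.0):
    """Compare two models' softened outputs on the same local samples;
    no raw data ever leaves the client."""
    own_model.eval()
    neighbor_model.eval()
    with torch.no_grad():
        own_probs = F.softmax(own_model(local_batch) / T, dim=1)
        nbr_log_probs = F.log_softmax(neighbor_model(local_batch) / T, dim=1)
        # KL(own || neighbor), averaged over the batch: a small value
        # means the neighbor behaves like us on our own data.
        return F.kl_div(nbr_log_probs, own_probs, reduction="batchmean").item()

def aggregate(states, distances, T=1.0):
    """Average the received state dicts with weights softmax(-d/T), so
    statistically closer models contribute more; the client's own state
    is included with distance 0."""
    weights = torch.softmax(-torch.tensor(distances) / T, dim=0)
    merged = {}
    for key in states[0]:
        stacked = torch.stack([s[key].float() for s in states])
        shape = (-1,) + (1,) * (stacked.dim() - 1)
        merged[key] = (weights.view(shape) * stacked).sum(dim=0)
    return merged
```

In use, a client would call `distillation_distance` once per neighbor on a held-out local batch, then pass the neighbors' state dicts (together with its own) to `aggregate` to form its next personalized model, with no central server involved.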

Related research

01/21/2023
The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation
Heterogeneity of data distributed across clients limits the performance ...

08/18/2020
Adaptive Distillation for Decentralized Learning from Heterogeneous Clients
This paper addresses the problem of decentralized learning to achieve a ...

07/29/2021
QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning
Traditionally, federated learning (FL) aims to train a single global mod...

09/29/2022
Label driven Knowledge Distillation for Federated Learning with non-IID Data
In real-world applications, Federated Learning (FL) meets two challenges...

06/17/2022
MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare
Federated learning has attracted increasing attention to building models...

10/04/2022
Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning
Knowledge distillation has recently become popular as a method of model ...

05/04/2022
FedSPLIT: One-Shot Federated Recommendation System Based on Non-negative Joint Matrix Factorization and Knowledge Distillation
Non-negative matrix factorization (NMF) with missing-value completion is...
