A Decentralized Collaborative Learning Framework Across Heterogeneous Devices for Personalized Predictive Analytics

05/27/2022
by Guanhua Ye, et al.

In this paper, we propose a Similarity-based Decentralized Knowledge Distillation (SD-Dist) framework for collaboratively learning heterogeneous deep models on decentralized devices. By introducing a preloaded reference dataset, SD-Dist enables every participant device to identify similar users and distil knowledge from them without assuming a fixed model architecture. Moreover, none of these operations reveals sensitive information such as raw personal data or model parameters. Extensive experiments on three real-life datasets show that SD-Dist achieves competitive performance with fewer computational resources while preserving model heterogeneity and privacy. Our experiments also show that the framework improves the robustness of the resultant models when users' data is sparse and diverse.
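The abstract does not disclose implementation details, but the mechanism it describes (exchanging outputs on a shared reference set, weighting peers by similarity, distilling from similar users) can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration: the function names, the cosine-similarity measure, and the similarity-weighted KL distillation objective are hypothetical stand-ins for the paper's actual protocol.

    import torch
    import torch.nn.functional as F

    def reference_logits(model, reference_batch):
        # Soft predictions on the shared, preloaded reference set.
        # In this sketch, only these outputs leave the device; raw
        # user data and model parameters stay local.
        model.eval()
        with torch.no_grad():
            return model(reference_batch)

    def peer_similarity(own_logits, peer_logits):
        # Cosine similarity between flattened reference-set outputs,
        # used to identify "similar users" without sharing data.
        a = own_logits.flatten().unsqueeze(0)
        b = peer_logits.flatten().unsqueeze(0)
        return F.cosine_similarity(a, b).item()

    def distillation_loss(student_logits, peer_logits_list, sims, T=2.0):
        # Distil toward a similarity-weighted average of peer soft
        # labels; architectures may differ from device to device
        # because only reference-set outputs are ever aligned.
        weights = torch.softmax(torch.tensor(sims), dim=0)
        teacher = sum(w * F.softmax(p / T, dim=-1)
                      for w, p in zip(weights, peer_logits_list))
        return F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                        teacher, reduction="batchmean") * T * T

In such a setup, a device would combine this distillation term with its ordinary supervised loss on local data. Because only reference-set outputs cross device boundaries, peers never see each other's raw data or parameters, which is what permits heterogeneous model architectures.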

Related research:

05/02/2022  FedDKD: Federated Learning with Decentralized Knowledge Distillation
The performance of federated learning in neural networks is generally in...

08/25/2023  Heterogeneous Decentralized Machine Unlearning with Seed Model Distillation
As some recent information security legislation endowed users with uncon...

12/17/2021  Personalized On-Device E-health Analytics with Decentralized Block Coordinate Descent
Actuated by the growing attention to personal healthcare and the pandemi...

04/08/2023  Model-Agnostic Decentralized Collaborative Learning for On-Device POI Recommendation
As an indispensable personalized service in Location-based Social Networ...

03/11/2022  Deep Class Incremental Learning from Decentralized Data
In this paper, we focus on a new and challenging decentralized machine l...

11/28/2022  Decentralized Learning with Multi-Headed Distillation
Decentralized learning with private data is a central problem in machine...

01/24/2019  Communication-Efficient and Decentralized Multi-Task Boosting while Learning the Collaboration Graph
We study the decentralized machine learning scenario where many users co...
