Decentralized Learning with Multi-Headed Distillation

11/28/2022
by Andrey Zhmoginov, et al.

Decentralized learning with private data is a central problem in machine learning. We propose a novel distillation-based decentralized learning technique that allows multiple agents with private non-IID data to learn from each other without having to share their data, weights, or weight updates. Our approach is communication-efficient, utilizes an unlabeled public dataset, and uses multiple auxiliary heads for each client, greatly improving training efficiency in the case of heterogeneous data. It allows individual models to preserve and enhance performance on their private tasks while also dramatically improving performance on the global aggregated data distribution. We study the effects of data and model-architecture heterogeneity and the impact of the underlying communication graph topology on learning efficiency, and show that our agents can significantly improve their performance compared to learning in isolation.
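The mechanics suggested by the abstract can be sketched in a few lines of PyTorch. Everything below is an illustrative assumption rather than the authors' implementation: the names (MultiHeadClient, train_step), the choice of one auxiliary head per peer, and the temperature-scaled KL distillation loss are hypothetical. Only the overall shape is taken from the abstract: clients exchange predictions on a shared unlabeled public dataset, never data or weights, and each client trains a private head alongside multiple auxiliary heads.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadClient(nn.Module):
    # Shared trunk, one private head for the client's own task, and one
    # auxiliary head per peer it distills from (an illustrative choice).
    def __init__(self, in_dim, feat_dim, num_classes, num_peers):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.private_head = nn.Linear(feat_dim, num_classes)
        self.aux_heads = nn.ModuleList(
            [nn.Linear(feat_dim, num_classes) for _ in range(num_peers)])

    def forward(self, x):
        z = self.trunk(x)
        return self.private_head(z), [h(z) for h in self.aux_heads]

def train_step(client, peers, public_x, private_x, private_y, opt,
               temperature=2.0, alpha=1.0):
    # One update: supervised loss on the client's private batch, plus a
    # distillation loss pulling each auxiliary head toward the matching
    # peer's soft predictions on a shared unlabeled public batch.
    opt.zero_grad()
    logits, _ = client(private_x)
    loss = F.cross_entropy(logits, private_y)       # private-task loss
    _, aux_logits = client(public_x)
    with torch.no_grad():                           # peers share predictions
        targets = [F.softmax(p(public_x)[0] / temperature, dim=-1)
                   for p in peers]                  # only, never weights/data
    for a, q in zip(aux_logits, targets):
        loss = loss + alpha * F.kl_div(
            F.log_softmax(a / temperature, dim=-1), q, reduction='batchmean')
    loss.backward()
    opt.step()
    return loss.item()

In a full system, each client would run this step against only the peers adjacent to it in the communication graph, which is how the graph topology studied in the paper enters the training dynamics.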
