Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation

09/10/2022
by   Xuan Gong, et al.

Federated Learning (FL) is a machine learning paradigm in which local nodes collaboratively train a central model while the training data remains decentralized. Existing FL methods typically share model parameters or employ co-distillation to address the issue of unbalanced data distributions, but they suffer from communication bottlenecks and, more importantly, risk privacy leakage. In this work, we develop a privacy-preserving and communication-efficient method within an FL framework based on one-shot offline knowledge distillation over unlabeled, cross-domain public data. We propose a quantized and noisy ensemble of local predictions from fully trained local models, which provides stronger privacy guarantees without sacrificing accuracy. Through extensive experiments on image and text classification tasks, we show that our privacy-preserving method outperforms baseline FL algorithms in both accuracy and communication efficiency.
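To make the aggregation step concrete, below is a minimal sketch of a quantized, noised ensemble of local predictions on unlabeled public data, of the kind the abstract describes. It assumes NumPy; the function name, the number of quantization levels (`num_bins`), and the Gaussian noise with scale `noise_scale` are illustrative assumptions, and the paper's exact quantization scheme and noise calibration may differ.

```python
import numpy as np

def private_ensemble_predictions(local_logits, num_bins=16, noise_scale=0.1, rng=None):
    """Aggregate per-client predictions on public data into a single
    distillation target, with quantization and additive noise for privacy.

    local_logits: array of shape (num_clients, num_samples, num_classes),
    holding each fully trained local model's logits on the public set.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Convert each client's logits to probabilities (softmax).
    exp = np.exp(local_logits - local_logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    # Quantize each client's predictions to num_bins levels, capping how
    # much fine-grained information any single prediction can leak.
    quantized = np.round(probs * (num_bins - 1)) / (num_bins - 1)
    # Ensemble across clients, then add noise (Gaussian here, as an
    # assumption; the actual mechanism in the paper may differ).
    ensemble = quantized.mean(axis=0)
    noisy = ensemble + rng.normal(0.0, noise_scale, size=ensemble.shape)
    # Renormalize to a valid distribution to use as a soft teacher label.
    noisy = np.clip(noisy, 1e-8, None)
    return noisy / noisy.sum(axis=-1, keepdims=True)

# Example: 3 clients, 5 public samples, 10 classes (random stand-ins).
logits = np.random.default_rng(0).normal(size=(3, 5, 10))
soft_targets = private_ensemble_predictions(logits, num_bins=16, noise_scale=0.05)
```

Because only these aggregated, perturbed predictions leave the clients, and only once (one-shot), the communication cost and the exposure of any individual model are both far smaller than in parameter-sharing schemes.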


Related research:

- Federated Learning with Privacy-Preserving Ensemble Attention Distillation (10/16/2022)
  Federated Learning (FL) is a machine learning paradigm where many local ...

- FedPDD: A Privacy-preserving Double Distillation Framework for Cross-silo Federated Recommendation (05/09/2023)
  Cross-platform recommendation aims to improve recommendation accuracy by...

- FSAR: Federated Skeleton-based Action Recognition with Adaptive Topology Structure and Knowledge Distillation (06/19/2023)
  Existing skeleton-based action recognition methods typically follow a ce...

- UNIDEAL: Curriculum Knowledge Distillation Federated Learning (09/16/2023)
  Federated Learning (FL) has emerged as a promising approach to enable co...

- One-shot Federated Learning without Server-side Training (04/26/2022)
  Federated Learning (FL) has recently made significant progress as a new ...

- Knowledge Distillation For Wireless Edge Learning (04/03/2021)
  In this paper, we propose a framework for predicting frame errors in the...

- Federated Knowledge Distillation (11/04/2020)
  Distributed learning frameworks often rely on exchanging model parameter...
