FedRAD: Federated Robust Adaptive Distillation

12/02/2021
by Stefán Páll Sturluson, et al.

The robustness of federated learning (FL) is vital for the distributed training of an accurate global model shared among a large number of clients. The typical collaborative learning framework, which simply aggregates model updates, is vulnerable to model poisoning attacks from adversarial clients. Since the information shared between the global server and the participants is limited to model parameters, detecting bad model updates is challenging. Moreover, real-world datasets are usually heterogeneous and not independent and identically distributed (non-IID) across participants, which makes the design of such a robust FL pipeline even more difficult. In this work, we propose a novel robust aggregation method, Federated Robust Adaptive Distillation (FedRAD), which detects adversaries and robustly aggregates local models based on properties of the median statistic, and then performs an adapted version of ensemble knowledge distillation. We run extensive experiments to evaluate the proposed method against recently published works. The results show that FedRAD outperforms all other aggregators in the presence of adversaries as well as under heterogeneous data distributions.
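The abstract describes two ingredients: a median-based score that weights each client's contribution, and an ensemble knowledge-distillation step that transfers the weighted ensemble's predictions into the global model. The full paper specifies the exact procedure; the snippet below is only a minimal, hypothetical sketch of that general idea, not the authors' implementation. It assumes a scoring rule that counts how often a client's logits coincide with the coordinate-wise median across clients, and it assumes access to an unlabeled public dataset (`public_inputs`) for distillation; the function names and hyperparameters are illustrative.

```python
# Hypothetical sketch (not the authors' code): median-based client scoring
# followed by distillation of the score-weighted ensemble into a global model.
import torch
import torch.nn.functional as F


def median_scores(client_logits: torch.Tensor) -> torch.Tensor:
    """client_logits: (num_clients, num_samples, num_classes).

    Assumed scoring rule: score each client by how often its logit equals
    the coordinate-wise median over all clients, then normalize to sum to 1.
    """
    median = client_logits.median(dim=0).values              # (samples, classes)
    hits = (client_logits == median.unsqueeze(0)).float()    # (clients, samples, classes)
    scores = hits.mean(dim=(1, 2))                           # (clients,)
    return scores / scores.sum().clamp_min(1e-12)


def distill_ensemble(global_model, client_logits, public_inputs, scores,
                     lr=1e-3, temperature=2.0, steps=100):
    """Distill the score-weighted ensemble of client predictions into the
    global model using KL divergence on an unlabeled public dataset."""
    # Weighted ensemble logits act as the teacher.
    teacher_logits = torch.einsum("c,csk->sk", scores, client_logits)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for _ in range(steps):
        student_logp = F.log_softmax(global_model(public_inputs) / temperature, dim=-1)
        loss = F.kl_div(student_logp, teacher_probs,
                        reduction="batchmean") * temperature ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```

Because poisoned clients rarely agree with the coordinate-wise median, their scores shrink and their influence on the distilled teacher is reduced, which is the intuition the abstract points to.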


