PerDoor: Persistent Non-Uniform Backdoors in Federated Learning using Adversarial Perturbations

05/26/2022 · by Manaar Alam, et al.

Federated Learning (FL) enables numerous participants to collaboratively train deep learning models without exposing their personal, potentially sensitive data, making it a promising solution for data privacy in collaborative training. The distributed nature of FL and the lack of data vetting, however, make it inherently vulnerable to backdoor attacks, in which an adversary injects backdoor functionality into the centralized model during training that can later be triggered to cause a desired misclassification for specific adversary-chosen inputs. A range of prior work establishes successful backdoor injection in FL systems; however, these backdoors are not demonstrated to be long-lasting: because the centralized model parameters continuously mutate over successive FL training rounds, the backdoor functionality fades once the adversary is removed from the training process. In this work, we therefore propose PerDoor, a persistent-by-construction backdoor injection technique for FL. It is driven by adversarial perturbation and targets parameters of the centralized model that deviate less across successive FL rounds and contribute the least to the main task accuracy. An exhaustive evaluation on an image classification scenario shows, on average, 10.5× longer persistence over multiple FL rounds compared to traditional backdoor attacks. We further demonstrate the potency of PerDoor in the presence of state-of-the-art backdoor prevention techniques in an FL system. In addition, the use of adversarial perturbation lets PerDoor generate non-uniform trigger patterns for backdoor inputs, in contrast to the uniform triggers (with fixed patterns and locations) of existing backdoor techniques, which are easier to mitigate.
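To make the two ingredients described above concrete, the following is a minimal PyTorch-style sketch, not the authors' implementation: it (1) scores global-model parameters by their deviation across recent FL rounds and their gradient-based importance on the main task, keeping only the least deviating and least important ones as injection targets, and (2) crafts a non-uniform, input-specific trigger via a PGD-style adversarial perturbation. Names such as global_snapshots, clean_loader, keep_ratio, and the perturbation hyper-parameters are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of PerDoor-style target selection and trigger crafting.
# Assumes `global_snapshots` is a list of recent global-model state_dicts
# (keyed by parameter name) and `clean_loader` yields main-task data.
import torch
import torch.nn.functional as F


def low_deviation_low_importance_mask(global_snapshots, model, clean_loader,
                                      device, keep_ratio=0.05):
    """Keep the fraction of parameters that deviate least across FL rounds
    and contribute least (by gradient magnitude) to main-task accuracy."""
    # Deviation: per-parameter standard deviation over recent global rounds.
    deviation = {name: torch.stack([snap[name] for snap in global_snapshots]).std(dim=0)
                 for name in global_snapshots[0]}

    # Importance: accumulated absolute gradient on clean (main-task) data.
    model.to(device).train()
    importance = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    for x, y in clean_loader:
        model.zero_grad()
        F.cross_entropy(model(x.to(device)), y.to(device)).backward()
        for name, p in model.named_parameters():
            if p.grad is not None:
                importance[name] += p.grad.detach().abs()

    # Combine the two normalized scores and keep the lowest-scoring fraction.
    mask = {}
    for name, p in model.named_parameters():
        dev = deviation[name].to(p.device)
        score = dev / (dev.max() + 1e-12) \
              + importance[name] / (importance[name].max() + 1e-12)
        k = max(1, int(keep_ratio * score.numel()))
        thresh = score.flatten().kthvalue(k).values
        mask[name] = (score <= thresh).float()   # 1 = candidate for backdoor injection
    return mask


def adversarial_trigger(model, x, target_class, eps=8 / 255, alpha=2 / 255, steps=20):
    """PGD-style targeted perturbation: the trigger is computed per input,
    so its pattern and location are non-uniform across backdoor samples."""
    model.eval()
    delta = torch.zeros_like(x, requires_grad=True)
    target = torch.full((x.size(0),), target_class, dtype=torch.long, device=x.device)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= alpha * delta.grad.sign()              # move towards the target class
            delta.clamp_(-eps, eps)                         # bound the perturbation
            delta.copy_((x + delta).clamp(0, 1) - x)        # keep x + delta a valid image
        delta.grad.zero_()
    return delta.detach()
```

In this sketch, the mask confines the adversary's model update to parameters that benign aggregation is unlikely to overwrite (supporting persistence), while the per-input perturbation yields triggers without a fixed pattern or location (supporting non-uniformity); the actual optimization details in PerDoor may differ.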

Related research

04/20/2023 · Get Rid Of Your Trail: Remotely Erasing Backdoors in Federated Learning
07/21/2023 · MAS: Towards Resource-Efficient Federated Multiple-Task Learning
10/20/2022 · Analyzing the Robustness of Decentralized Horizontal and Vertical Federated Learning Architectures in a Non-IID Scenario
07/07/2020 · Defending Against Backdoors in Federated Learning with Robust Learning Rate
09/21/2021 · DeSMP: Differential Privacy-exploited Stealthy Model Poisoning Attacks in Federated Learning
07/09/2020 · Attack of the Tails: Yes, You Really Can Backdoor Federated Learning
