Hidden Poison: Machine Unlearning Enables Camouflaged Poisoning Attacks

12/21/2022
by Jimmy Z. Di, et al.

We introduce camouflaged data poisoning attacks, a new attack vector that arises in the context of machine unlearning and other settings in which model retraining may be induced. An adversary first adds a few carefully crafted points to the training dataset such that the impact on the model's predictions is minimal. The adversary subsequently triggers a request to remove a subset of the introduced points, at which point the attack is unleashed and the model's predictions are negatively affected. In particular, we consider clean-label targeted attacks (in which the goal is to cause the model to misclassify a specific test point) on datasets including CIFAR-10, Imagenette, and Imagewoof. This attack is realized by constructing camouflage datapoints that mask the effect of a poisoned dataset.
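The abstract describes a three-stage workflow: plant poison points, mask them with camouflage points so the deployed model behaves normally, then file an unlearning request for the camouflage so retraining activates the poison. The snippet below is a minimal toy sketch of that workflow only, under heavy assumptions: it uses a nearest-neighbor classifier on synthetic 2D blobs and places the poison and camouflage points with a naive heuristic around a hypothetical target `x_target`. The paper instead crafts clean-label poisons and camouflages with gradient-based methods on image datasets (CIFAR-10, Imagenette, Imagewoof); nothing here reproduces that construction.

```python
# Toy sketch of the camouflaged-poisoning workflow (not the paper's method).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Clean training data: two Gaussian blobs, class 0 and class 1.
n = 200
X_clean = np.vstack([
    rng.normal(loc=[-2.0, 0.0], scale=1.0, size=(n, 2)),   # class 0
    rng.normal(loc=[+2.0, 0.0], scale=1.0, size=(n, 2)),   # class 1
])
y_clean = np.array([0] * n + [1] * n)

# Hypothetical target point the adversary wants misclassified (true class 1).
x_target = np.array([[1.5, 0.0]])

def predict_target(X, y):
    """Retrain from scratch on (X, y) and report the target's predicted class."""
    return KNeighborsClassifier(n_neighbors=5).fit(X, y).predict(x_target)[0]

# Stage 1: poison points (label 0) crowded around the target.
X_poison = x_target + rng.normal(scale=0.2, size=(30, 2))
y_poison = np.zeros(30, dtype=int)

# Stage 2: camouflage points (label 1) placed even closer, so the model
# trained on the full dataset still classifies the target correctly.
X_camo = x_target + rng.normal(scale=0.05, size=(30, 2))
y_camo = np.ones(30, dtype=int)

print("clean model:              ", predict_target(X_clean, y_clean))

X_full = np.vstack([X_clean, X_poison, X_camo])
y_full = np.concatenate([y_clean, y_poison, y_camo])
print("poison + camouflage model:", predict_target(X_full, y_full))   # attack dormant

# Stage 3: adversary requests unlearning of the camouflage points;
# retraining on the remaining data unleashes the poison.
X_after = np.vstack([X_clean, X_poison])
y_after = np.concatenate([y_clean, y_poison])
print("after unlearning camo:    ", predict_target(X_after, y_after))
```

The key point the sketch tries to convey is that the poisoned-plus-camouflaged dataset looks harmless under retraining, and the damage only appears once the camouflage subset is removed via an unlearning request.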


