Lethean Attack: An Online Data Poisoning Technique

11/24/2020
by Eyal Perry, et al.

Data poisoning is an adversarial scenario in which an attacker feeds a specially crafted sequence of samples to an online model in order to subvert learning. We introduce the Lethean Attack, a novel data poisoning technique that induces catastrophic forgetting in an online model. We apply the attack in the context of Test-Time Training, a modern online learning framework aimed at generalization under distribution shifts. We present the theoretical rationale and empirically compare the attack against other sample sequences that naturally induce forgetting. Our results demonstrate that, using Lethean Attacks, an adversary could revert a test-time training model back to coin-flip accuracy with a short sample sequence.
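To make the threat model concrete, the sketch below illustrates online data poisoning in its simplest form. It is not the paper's Lethean Attack (which targets Test-Time Training models); the dataset, the sklearn SGDClassifier victim, the label-flipping strategy, and the length of the poisoned sequence are all illustrative assumptions. It only shows how a short crafted sequence fed through an online update rule can degrade a trained model's accuracy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)

# Synthetic binary classification task standing in for the victim's data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
classes = np.unique(y)

# "Online" victim model: updated one sample at a time via partial_fit.
model = SGDClassifier(random_state=0)
for xi, yi in zip(X_train, y_train):
    model.partial_fit(xi.reshape(1, -1), [yi], classes=classes)
print("accuracy before poisoning:", model.score(X_test, y_test))

# Adversary: a short crafted sequence (here, simply label-flipped training
# points) fed through the same online update rule to subvert learning.
poison_idx = rng.choice(len(X_train), size=200, replace=False)
for i in poison_idx:
    model.partial_fit(X_train[i].reshape(1, -1), [1 - y_train[i]], classes=classes)
print("accuracy after poisoning:", model.score(X_test, y_test))
```

How far accuracy actually drops in this toy setup depends on the learning-rate schedule and the length of the poisoned sequence; the paper's contribution is a principled way to craft such sequences so that a short one suffices to induce catastrophic forgetting in the Test-Time Training setting.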
