Backdoor Attack in the Physical World

04/06/2021
by Yiming Li, et al.

A backdoor attack intends to inject a hidden backdoor into deep neural networks (DNNs), such that the predictions of the infected model are maliciously changed whenever the hidden backdoor is activated by an attacker-defined trigger. Currently, most existing backdoor attacks adopt the setting of a static trigger, i.e., the trigger has the same appearance and is placed in the same area across training and testing images. In this paper, we revisit this attack paradigm by analyzing the characteristics of the trigger. We demonstrate that the paradigm is vulnerable when the trigger in testing images is not consistent with the one used for training. As such, those attacks are far less effective in the physical world, where the location and appearance of the trigger in the digitized image may differ from those of the trigger used for training. Moreover, we also discuss how to alleviate this vulnerability. We hope that this work can inspire more exploration of backdoor properties, to help the design of more advanced backdoor attack and defense methods.
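To make the static-trigger setting concrete, the sketch below stamps a small patch onto an image at a fixed corner for poisoning, then at a shifted position to mimic how physical-world digitization displaces the trigger; it also randomizes the trigger placement when building the poisoned training set, one illustrative way to reduce the sensitivity described above. The `stamp_trigger` helper, the white-square pattern, and all sizes are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def stamp_trigger(image, trigger, x, y):
    """Return a copy of `image` with the trigger patch pasted at (x, y)."""
    poisoned = image.copy()
    h, w = trigger.shape[:2]
    poisoned[y:y + h, x:x + w] = trigger
    return poisoned

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)  # stand-in sample
trigger = np.full((4, 4, 3), 255, dtype=np.uint8)                # white-square pattern

# Static-trigger setting: the same patch, at the same corner, is used for
# every poisoned training image and every test-time query.
train_poisoned = stamp_trigger(image, trigger, x=28, y=28)

# Physical-world digitization typically shifts (and rescales) the trigger,
# so the test-time query no longer matches the training-time placement.
dx, dy = rng.integers(0, 28, size=2)
physical_query = stamp_trigger(image, trigger, x=int(dx), y=int(dy))

# Illustrative mitigation (an assumption, not necessarily the paper's exact
# method): randomize the trigger location across poisoned training samples
# so the learned backdoor tolerates displacement at test time.
poisoned_training_set = [
    stamp_trigger(image, trigger, *rng.integers(0, 28, size=2))
    for _ in range(8)
]
```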
