Physically Adversarial Attacks and Defenses in Computer Vision: A Survey

11/03/2022
by Xingxing Wei, et al.

Although Deep Neural Networks (DNNs) have been widely applied in various real-world scenarios, they are vulnerable to adversarial examples. Current adversarial attacks in computer vision can be divided into digital attacks and physical attacks according to their attack form. Compared with digital attacks, which generate perturbations in the digital pixels, physical attacks are more practical in the real world. Owing to the serious security problems caused by physical adversarial examples, many works have been proposed in the past years to evaluate the physical adversarial robustness of DNNs. In this paper, we present a survey of current physical adversarial attacks and physical adversarial defenses in computer vision. To establish a taxonomy, we organize the current physical attacks by attack task, attack form, and attack method, so that readers can gain systematic knowledge of this topic from different aspects. For the physical defenses, we establish the taxonomy over pre-processing, in-processing, and post-processing of DNN models to achieve full coverage of adversarial defenses. Based on this survey, we finally discuss the challenges of this research field and outline future directions.
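To make the digital/physical distinction concrete, the "perturbations in the digital pixels" that the abstract contrasts with physical attacks can be sketched with a minimal FGSM-style example. Everything here (the toy linear classifier, the image, the label, and the epsilon budget) is a hypothetical placeholder, not the survey's method:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 3072))    # toy linear "classifier" weights (assumed)
x = rng.uniform(0, 1, size=3072)   # a flattened 32x32x3 "image" in [0, 1]
y = 3                              # assumed true class index

def loss_grad(W, x, y):
    """Gradient of softmax cross-entropy loss w.r.t. the input pixels."""
    logits = W @ x
    p = np.exp(logits - logits.max())
    p /= p.sum()
    onehot = np.zeros_like(p)
    onehot[y] = 1.0
    return W.T @ (p - onehot)      # dL/dx for a linear model

# A typical L_inf pixel budget for digital attacks (8/255 is a common choice).
eps = 8 / 255
# FGSM step: move each pixel by +/- eps in the direction that increases loss,
# then clip back to the valid image range.
x_adv = np.clip(x + eps * np.sign(loss_grad(W, x, y)), 0.0, 1.0)
```

A physical attack, by contrast, cannot edit pixels directly; it must realize the perturbation as a printable patch, sticker, or texture that survives camera capture, which is why the survey treats the two forms separately.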


