A Survey on Physical Adversarial Attack in Computer Vision

09/28/2022
by Donghua Wang, et al.

In the past decade, deep learning, with its strong feature-learning capability, has largely displaced traditional hand-crafted features, leading to tremendous improvements on conventional tasks. However, deep neural networks (DNNs) have recently been shown to be vulnerable to adversarial examples: malicious inputs crafted by adding small, elaborately designed perturbations that mislead DNNs into making wrong decisions while remaining imperceptible to humans. Adversarial attacks can be divided into digital attacks and physical attacks. Digital adversarial attacks are mostly performed in laboratory environments and focus on improving the performance of the attack algorithms themselves. In contrast, physical adversarial attacks target DNN systems deployed in the physical world, a more challenging task because of the complexity of the physical environment (e.g., changing brightness, occlusion). Although the discrepancy between digital and physical adversarial examples is small, physical adversarial examples require specific designs to overcome the effects of the complex physical environment. In this paper, we review the development of physical adversarial attacks on DNN-based computer vision tasks, including image recognition, object detection, and semantic segmentation. For completeness of the algorithmic evolution, we also briefly introduce works that do not involve physical adversarial attacks. We first present a categorization scheme to summarize current physical adversarial attacks. We then discuss the advantages and disadvantages of existing physical adversarial attacks, focusing on the techniques used to maintain adversarial effectiveness when an attack is applied in the physical environment. Finally, we point out the open issues of current physical adversarial attacks and provide promising research directions.
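To make the notion of a digital adversarial example concrete, below is a minimal numpy sketch of the canonical fast gradient sign method (FGSM) applied to a hypothetical toy linear classifier. The model, the function names (`predict`, `fgsm`), and the toy data are illustrative assumptions for exposition, not a method proposed by this survey; for a linear model with logistic loss the FGSM step happens to be analytic.

```python
import numpy as np

def predict(w, b, x):
    """Toy linear classifier: returns +1 if w.x + b > 0, else -1."""
    return 1 if w @ x + b > 0 else -1

def fgsm(w, b, x, y, eps):
    # FGSM perturbs each input dimension by eps in the direction that
    # increases the loss. For a linear model with logistic loss and label
    # y in {-1, +1}, the input gradient is a negative multiple of y * w,
    # so sign(grad) = -y * sign(w) and no autodiff is needed here.
    return x + eps * (-y) * np.sign(w)

rng = np.random.default_rng(0)
w = rng.normal(size=16)
b = 0.0
x = 0.05 * np.sign(w)              # clean input, classified as +1
x_adv = fgsm(w, b, x, y=1, eps=0.1)  # small perturbation flips the label
```

The perturbation is bounded by `eps` in the infinity norm, which is what keeps digital adversarial examples imperceptible while still flipping the decision.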

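One widely used design of the kind the abstract alludes to, for surviving the complex physical environment, is to optimize the perturbation against a distribution of environmental transformations rather than a single clean input (expectation over transformation). The sketch below, again on an assumed toy linear model with random brightness scaling as the only transformation, averages the loss gradient over sampled transformations; all names and hyperparameters are illustrative.

```python
import numpy as np

def predict(w, b, x):
    """Toy linear classifier: returns +1 if w.x + b > 0, else -1."""
    return 1 if w @ x + b > 0 else -1

def eot_attack(w, b, x, y, eps, n_samples=64, steps=20, lr=0.05, rng=None):
    """Maximize the logistic loss averaged over random brightness scales."""
    if rng is None:
        rng = np.random.default_rng(0)
    delta = np.zeros_like(x)
    for _ in range(steps):
        grad = np.zeros_like(x)
        for _ in range(n_samples):
            c = rng.uniform(0.5, 1.5)   # random brightness factor
            z = c * (x + delta)         # transformed adversarial input
            margin = y * (w @ z + b)
            # gradient of logistic loss log(1 + exp(-margin)) w.r.t. delta
            grad += -y * c * w / (1.0 + np.exp(margin))
        # signed ascent step on the expected loss, clipped to the eps-ball
        delta = np.clip(delta + lr * np.sign(grad / n_samples), -eps, eps)
    return x + delta

rng = np.random.default_rng(1)
w = rng.normal(size=16)
b = 0.0
x = 0.05 * np.sign(w)   # clean input, classified +1 under any brightness
x_adv = eot_attack(w, b, x, y=1, eps=0.1)
```

Because the optimization sees many brightness factors during crafting, the resulting perturbation stays adversarial across that range, which is the essence of the robustness-to-environment designs the surveyed physical attacks build on.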
