Robust and Natural Physical Adversarial Examples for Object Detectors

11/27/2020
by Mingfu Xue, et al.

Many recent studies show that deep neural networks (DNNs) are susceptible to adversarial examples. However, to establish that adversarial examples are real threats in the physical world, it is necessary to study and evaluate them under real-world conditions. In this paper, we propose a robust and natural physical adversarial example attack targeting object detectors under real-world conditions, which is more challenging than targeting image classifiers. The generated adversarial examples are robust to various physical constraints and visually similar to the original images, so they appear natural to humans and do not arouse suspicion. First, to ensure the robustness of the adversarial examples under real-world conditions, the proposed method exploits different image transformation functions (distance, angle, illumination, printing, and photographing) to simulate various physical changes during the iterative optimization of the adversarial examples. Second, to construct natural adversarial examples, the proposed method uses an adaptive mask to constrain the area and intensity of the added perturbations, and utilizes a real-world perturbation score (RPS) to make the perturbations resemble real noise in the physical world. Compared with existing studies, our generated adversarial examples achieve a high success rate with less conspicuous perturbations. Experimental results demonstrate that the generated adversarial examples are robust under various indoor and outdoor physical conditions. Finally, the proposed physical adversarial attack method is universal and works in black-box scenarios: the generated adversarial examples generalize well across different models.
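The core idea of optimizing over an ensemble of simulated physical transformations can be sketched as follows. This is a minimal, hypothetical toy (a fixed linear "detection score" in place of a real object detector, and a single illumination transform in place of the paper's full transformation set); the mask, step sizes, and perturbation budget are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a detector's objectness score: a fixed linear model.
# (Hypothetical; the paper attacks real object detectors such as YOLO/Faster R-CNN.)
D = 32
w = rng.normal(size=D)

def score(x):
    return float(w @ x)

def illumination(x, factor):
    """Simulate an illumination change, one of the physical transforms."""
    return np.clip(x * factor, 0.0, 1.0)

x_orig = rng.uniform(0.2, 0.8, size=D)               # "image" as a flat vector
mask = (rng.uniform(size=D) < 0.3).astype(float)     # adaptive mask (fixed here)
delta = np.zeros(D)                                  # the adversarial perturbation
eps, lr, steps = 0.15, 0.05, 200

for _ in range(steps):
    # Expectation over transformations: average the gradient across randomly
    # sampled illumination factors so the perturbation survives physical changes.
    grad = np.zeros(D)
    for _ in range(8):
        f = rng.uniform(0.7, 1.3)
        # For this linear toy, d score(f * (x + mask*delta)) / d delta = f * w * mask
        # (ignoring the clipping nonlinearity for simplicity).
        grad += f * w * mask
    delta -= lr * np.sign(grad)        # descend to suppress the detection score
    delta = np.clip(delta, -eps, eps)  # keep the perturbation inconspicuous

x_adv = np.clip(x_orig + mask * delta, 0.0, 1.0)
```

The perturbation stays inside the masked region and within the intensity budget, which mirrors the paper's naturalness constraints; in the real method the RPS term would additionally push the perturbation toward the statistics of real-world noise.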


