Adversarial Zoom Lens: A Novel Physical-World Attack to DNNs

06/23/2022
by Chengyin Hu, et al.

Although deep neural networks (DNNs) are known to be fragile, the effect of zooming in and out of physical-world images on DNN performance has not been studied. In this paper, we propose a novel physical adversarial attack called Adversarial Zoom Lens (AdvZL), which uses a zoom lens to zoom in and out of pictures of the physical world, fooling DNNs without changing the characteristics of the target object. To date, the proposed method is the only adversarial attack that fools DNNs without adding any physical adversarial perturbation. In the digital environment, we construct a dataset based on AdvZL to verify the adversarial effect of equal-scale enlarged images on DNNs. In the physical environment, we manipulate a zoom lens to zoom in and out of the target object and generate adversarial samples. The experimental results demonstrate the effectiveness of AdvZL in both digital and physical environments. We further analyze the adversarial effect of the proposed dataset on improved DNNs, and we provide a guideline for defending against AdvZL by means of adversarial training. Finally, we discuss the threat the proposed approach may pose to future autonomous driving, as well as variant attack ideas similar to the proposed attack.
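In the digital environment, the attack amounts to equal-scale enlargement of the input image. The following is a minimal sketch of that idea, not the authors' code: it simulates a zoom-in by center-cropping an image, resizing it back to its original resolution, and checking whether a pretrained classifier's prediction flips. The ResNet-50 model, the file name target.jpg, and the zoom factors are illustrative assumptions, and torchvision >= 0.13 is assumed for the pretrained-weights API.

```python
# Sketch only: simulate a digital "zoom-in" (equal-scale enlargement) and
# test whether it changes a pretrained classifier's prediction.
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()  # standard ImageNet preprocessing

def zoom_in(img: Image.Image, factor: float) -> Image.Image:
    """Crop the central 1/factor region and resize it back to the original size."""
    w, h = img.size
    cw, ch = int(w / factor), int(h / factor)
    left, top = (w - cw) // 2, (h - ch) // 2
    return img.crop((left, top, left + cw, top + ch)).resize((w, h), Image.BICUBIC)

@torch.no_grad()
def predict(img: Image.Image) -> int:
    logits = model(preprocess(img).unsqueeze(0))
    return int(logits.argmax(dim=1))

img = Image.open("target.jpg").convert("RGB")  # hypothetical input image
base_label = predict(img)
for factor in (1.2, 1.5, 2.0, 3.0):            # illustrative zoom factors
    zoomed_label = predict(zoom_in(img, factor))
    if zoomed_label != base_label:
        print(f"zoom x{factor}: prediction changed {base_label} -> {zoomed_label}")
```

The physical-world version of the attack replaces the crop-and-resize step with an actual zoom lens pointed at the target object, so no perturbation is added to the scene itself.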

Related research

Adversarial Color Film: Effective Physical-World Attack to DNNs (09/02/2022)
It is well known that the performance of deep neural networks (DNNs) is ...

Adversarial Color Projection: A Projector-Based Physical Attack to DNNs (09/19/2022)
Recent advances have shown that deep neural networks (DNNs) are suscepti...

Adversarial Laser Beam: Effective Physical-World Attack to DNNs in a Blink (03/11/2021)
Though it is well known that the performance of deep neural networks (DN...

Adversarial Neon Beam: Robust Physical-World Adversarial Attack to DNNs (04/02/2022)
In the physical world, light affects the performance of deep neural netw...

Adversarial Laser Spot: Robust and Covert Physical Adversarial Attack to DNNs (06/02/2022)
Most existing deep neural networks (DNNs) are easily disturbed by slight...

RFLA: A Stealthy Reflected Light Adversarial Attack in the Physical World (07/14/2023)
Physical adversarial attacks against deep neural networks (DNNs) have re...

Novel DNNs for Stiff ODEs with Applications to Chemically Reacting Flows (04/01/2021)
Chemically reacting flows are common in engineering, such as hypersonic ...
