Differential Evolution based Dual Adversarial Camouflage: Fooling Human Eyes and Object Detectors

10/17/2022
by Jialiang Sun, et al.

Recent studies reveal that deep neural network (DNN) based object detectors are vulnerable to adversarial attacks in the form of perturbations added to images, which lead the detectors to produce wrong outputs. Most existing works focus on generating perturbed images, also called adversarial examples, to fool object detectors. Although the generated adversarial examples can retain a certain degree of naturalness, most of them are still easily noticed by human eyes, which limits their application in the real world. To alleviate this problem, we propose a differential evolution based dual adversarial camouflage (DE_DAC) method, composed of two stages, to fool human eyes and object detectors simultaneously. Specifically, we aim to obtain a camouflage texture that can be rendered over the surface of the object. In the first stage, we optimize the global texture to minimize the discrepancy between the rendered object and the scene images, making the object difficult for human eyes to distinguish. In the second stage, we design three loss functions to optimize the local texture, rendering object detectors ineffective. In addition, we introduce a differential evolution algorithm to search for near-optimal areas of the object to attack, improving adversarial performance under attack-area constraints. We also study an adaptive variant of DE_DAC that can be adapted to the environment. Experiments show that the proposed method achieves a good trade-off between fooling human eyes and fooling object detectors across multiple scenes and objects.
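The abstract does not spell out how the differential evolution search over attack areas is implemented. The following is a minimal sketch, assuming a standard DE/rand/1/bin loop over a real-valued encoding of candidate attack areas; the objective `detector_confidence`, the area encoding, and all hyper-parameters are hypothetical placeholders for illustration, not details taken from the paper.

```python
import numpy as np

def detector_confidence(area_params):
    """Hypothetical stand-in for the detector's confidence on the rendered
    object when only the regions encoded by `area_params` carry the
    adversarial local texture. Lower is better for the attacker."""
    # Placeholder: a smooth toy function so the sketch runs end to end.
    return float(np.sum((area_params - 0.3) ** 2))

def differential_evolution(obj, dim, pop_size=20, gens=50, F=0.5, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, dim))            # candidate area encodings in [0, 1]
    fitness = np.array([obj(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct individuals other than i for the mutation step.
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # DE/rand/1 mutation
            cross = rng.random(dim) < CR                   # binomial crossover mask
            cross[rng.integers(dim)] = True                # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = obj(trial)
            if f_trial < fitness[i]:                       # greedy selection
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

best_areas, best_score = differential_evolution(detector_confidence, dim=8)
print(best_areas, best_score)
```

In practice, the objective would evaluate the detector on images of the object rendered with the adversarial texture restricted to the decoded areas, and the area budget would be enforced either in the encoding or as a penalty term.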


