Physical Adversarial Attack on Vehicle Detector in the Carla Simulator

07/31/2020
by Tong Wu, et al.

In this paper, we tackle the problem of physical adversarial examples for object detectors in the wild. Specifically, we propose to generate adversarial patterns that, when applied to a vehicle's surface, make it unrecognizable to detectors in the photo-realistic Carla simulator. Our approach combines two main techniques, an Enlarge-and-Repeat process and a Discrete Searching method, to craft mosaic-like adversarial vehicle textures without access to either the detector's model weights or a differentiable rendering procedure. Experimental results demonstrate the effectiveness of our approach in the simulator.
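The abstract does not give implementation details, but the two named techniques can be sketched in a minimal form. The sketch below is hypothetical: the palette, grid size, tiling parameters, greedy one-cell mutation rule, and the `score_fn` interface (a black-box detector confidence query) are all assumptions, not the paper's actual procedure.

```python
import random

# Hypothetical sketch of a black-box mosaic-texture attack.
# Assumptions (not from the paper): a small H x W grid of cells, each cell
# colored from a discrete palette; Enlarge-and-Repeat tiles the grid into a
# full-size texture; Discrete Searching greedily mutates one cell at a time
# and keeps the change only if the detector's confidence score drops.

PALETTE = [(0, 0, 0), (255, 255, 255), (255, 0, 0), (0, 255, 0), (0, 0, 255)]

def enlarge_and_repeat(grid, scale, reps):
    """Enlarge each cell by `scale` pixels, then tile the result `reps` times."""
    enlarged = []
    for row in grid:
        big_row = [c for c in row for _ in range(scale)]
        enlarged.extend([big_row] * scale)
    return [row * reps for row in enlarged] * reps

def discrete_search(score_fn, h=4, w=4, iters=200, seed=0):
    """Greedy discrete search over cell colors against a black-box score.

    `score_fn` takes a full texture and returns the detector's confidence;
    lower is better for the attacker. No gradients or weights are needed.
    """
    rng = random.Random(seed)
    grid = [[rng.choice(PALETTE) for _ in range(w)] for _ in range(h)]
    best = score_fn(enlarge_and_repeat(grid, scale=8, reps=4))
    for _ in range(iters):
        i, j = rng.randrange(h), rng.randrange(w)
        old = grid[i][j]
        grid[i][j] = rng.choice(PALETTE)  # propose a single-cell mutation
        s = score_fn(enlarge_and_repeat(grid, scale=8, reps=4))
        if s < best:
            best = s          # keep the mutation
        else:
            grid[i][j] = old  # revert
    return grid, best
```

In the paper's setting, `score_fn` would render the texture onto the vehicle in Carla and query the target detector; here it is left abstract so the search loop itself stays self-contained.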


Related research

09/15/2021
FCA: Learning a 3D Full-coverage Vehicle Camouflage for Multi-view Physical Adversarial Attack
Physical adversarial attacks in object detection have attracted increasi...

12/21/2017
Note on Attacking Object Detectors with Adversarial Stickers
Deep learning has proven to be a powerful tool for computer vision and h...

12/07/2017
Adversarial Examples that Fool Detectors
An adversarial example is an example that has been adjusted to produce a...

10/09/2017
Standard detectors aren't (currently) fooled by physical adversarial stop signs
An adversarial example is an example that has been adjusted to produce t...

12/26/2018
Practical Adversarial Attack Against Object Detector
In this paper, we proposed the first practical adversarial attacks again...

08/14/2023
ACTIVE: Towards Highly Transferable 3D Physical Camouflage for Universal and Robust Vehicle Evasion
Adversarial camouflage has garnered attention for its ability to attack ...

05/18/2010
Dynamical issues in interactive representation of physical objects
The quality of a simulator equipped with a haptic interface is given by ...
