UPC: Learning Universal Physical Camouflage Attacks on Object Detectors

09/10/2019
by   Lifeng Huang, et al.

In this paper, we study physical adversarial attacks on object detectors in the wild. Prior work on this problem mostly crafts instance-dependent perturbations that apply only to rigid, planar objects. In contrast, we propose to learn an adversarial pattern that effectively attacks all instances of the same object category (e.g., person, car), referred to as the Universal Physical Camouflage Attack (UPC). Concretely, UPC crafts camouflage by jointly fooling the region proposal network and misleading the classifier and the regressor into producing erroneous outputs. To make UPC effective for articulated non-rigid or non-planar objects, we introduce a set of transformations for the generated camouflage patterns that mimic their deformable properties. We additionally impose an optimization constraint so that the generated patterns look natural to human observers. To fairly evaluate the effectiveness of different physical-world attacks on object detectors, we present the first standardized virtual database, AttackScenes, which simulates the real 3D world in a controllable and reproducible environment. Extensive experiments show that the proposed UPC outperforms existing physical adversarial attacks not only in virtual environments (AttackScenes) but also in real-world physical environments. Code, models, and demos are publicly available at https://mesunhlf.github.io/index_physical.html.
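The attack objective described above can be read as a joint loss over the detector's region proposal network, classifier, and regressor, optimized under random deformations of the pattern and a naturalness constraint. The sketch below is a minimal, self-contained illustration of that recipe, not the authors' implementation: the detector heads (rpn_head, cls_head, reg_head), the image, the paste mask, and the loss weights are hypothetical stand-ins for a pretrained two-stage detector's internals, and the random affine warp is only a simple proxy for the deformation transformations described in the paper.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-ins for detector internals (not the paper's code).
# In the real attack these would be the RPN and the RoI classifier/regressor
# of a pretrained two-stage detector; here they are random layers so the
# sketch runs end to end.
torch.manual_seed(0)
rpn_head = torch.nn.Conv2d(3, 1, kernel_size=3, padding=1)   # objectness map
cls_head = torch.nn.Linear(3 * 64 * 64, 21)                  # class logits
reg_head = torch.nn.Linear(3 * 64 * 64, 4)                   # box deltas

def random_affine(pattern):
    """Apply a random affine warp to mimic non-rigid deformation of the pattern."""
    angle = (torch.rand(1) - 0.5) * 0.6                      # roughly +/- 17 degrees
    scale = 0.9 + 0.2 * torch.rand(1)
    cos, sin = torch.cos(angle) * scale, torch.sin(angle) * scale
    theta = torch.stack([torch.cat([cos, -sin, torch.zeros(1)]),
                         torch.cat([sin,  cos, torch.zeros(1)])]).unsqueeze(0)
    grid = F.affine_grid(theta, pattern.shape, align_corners=False)
    return F.grid_sample(pattern, grid, align_corners=False)

# Camouflage pattern initialized from a seed image (noise here) and
# optimized directly in pixel space.
seed = torch.rand(1, 3, 64, 64)
pattern = seed.clone().requires_grad_(True)
image = torch.rand(1, 3, 64, 64)                             # background scene
mask = torch.zeros(1, 1, 64, 64)
mask[..., 16:48, 16:48] = 1.0                                # region where the pattern is pasted

optimizer = torch.optim.Adam([pattern], lr=0.01)
target_class = 0                                             # an incorrect target label

for step in range(200):
    warped = random_affine(pattern)                          # expectation over transforms
    adv_img = image * (1 - mask) + warped * mask             # composite pattern onto the object

    objectness = rpn_head(adv_img)                           # fool the region proposal network
    feats = adv_img.flatten(1)
    logits = cls_head(feats)                                 # mislead the classifier
    deltas = reg_head(feats)                                 # push the regressor off-target

    loss_rpn = objectness.sigmoid().mean()                   # suppress proposal scores
    loss_cls = F.cross_entropy(logits, torch.tensor([target_class]))
    loss_reg = -deltas.abs().mean()                          # enlarge box regression errors
    loss_nat = F.mse_loss(pattern, seed)                     # keep the pattern near the seed

    loss = loss_rpn + loss_cls + loss_reg + 0.5 * loss_nat   # illustrative weights
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    pattern.data.clamp_(0, 1)                                # keep a printable pattern
```

In the full method the expectation over transformations and the naturalness constraint are what make the single learned pattern transfer across instances and survive printing and deformation; the stubbed heads above would be replaced by the frozen detector under attack.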


Related research

09/01/2021 · DPA: Learning Robust Physical Adversarial Camouflages for Object Detectors
Adversarial attacks are feasible in the real world for object detection....

11/27/2020 · 3D Invisible Cloak
In this paper, we propose a novel physical stealth attack against the pe...

10/18/2019 · Evading Real-Time Person Detectors by Adversarial T-shirt
It is known that deep neural networks (DNNs) could be vulnerable to adve...

09/15/2021 · FCA: Learning a 3D Full-coverage Vehicle Camouflage for Multi-view Physical Adversarial Attack
Physical adversarial attacks in object detection have attracted increasi...

12/12/2022 · HOTCOLD Block: Fooling Thermal Infrared Detectors with a Novel Wearable Design
Adversarial attacks on thermal infrared imaging expose the risk of relat...

07/23/2023 · Towards Generic and Controllable Attacks Against Object Detection
Existing adversarial attacks against Object Detectors (ODs) suffer from ...

07/04/2023 · Physically Realizable Natural-Looking Clothing Textures Evade Person Detectors via 3D Modeling
Recent works have proposed to craft adversarial clothes for evading pers...
