Evading Real-Time Person Detectors by Adversarial T-shirt

10/18/2019
by Kaidi Xu, et al.

It is known that deep neural networks (DNNs) can be vulnerable to adversarial attacks. So-called physical adversarial examples deceive DNN-based decision makers by attaching adversarial patches to real objects. However, most existing work on physical adversarial attacks focuses on static objects such as eyeglass frames, stop signs, and images attached to cardboard. In this work, we propose the adversarial T-shirt, a robust physical adversarial example for evading person detectors even as it deforms with a moving person's pose changes. To the best of our knowledge, this is the first work to model the effect of deformation when designing physical adversarial examples for non-rigid objects such as T-shirts. We show that the proposed method achieves a 79% attack success rate in the physical world against YOLOv2, whereas the state-of-the-art physical attack method for fooling a person detector achieves only a 27% success rate. Furthermore, by leveraging min-max optimization, we extend our method to an ensemble attack against the YOLOv2 and Faster R-CNN object detectors simultaneously.
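The min-max ensemble idea can be illustrated with a small sketch, not the authors' implementation: the inner maximization over detectors is smoothed with softmax weights, and the outer minimization updates the patch variable by gradient descent. The function names, the toy quadratic losses standing in for detector losses, and the softmax smoothing are all assumptions made here for illustration.

```python
import math

def ensemble_attack(losses, grads, x, steps=200, lr=0.05, beta=1.0):
    """Minimize a softened worst-case loss over several detectors.

    losses/grads are parallel lists of toy loss functions and their
    derivatives; x is a scalar stand-in for the adversarial patch.
    """
    for _ in range(steps):
        # inner maximization (smoothed): softmax weights favor the
        # detector that is currently hardest to fool
        vals = [f(x) for f in losses]
        m = max(vals)  # subtract max for numerical stability
        e = [math.exp(beta * (v - m)) for v in vals]
        z = sum(e)
        w = [ei / z for ei in e]
        # outer minimization: descend the weighted loss w.r.t. the patch
        g = sum(wi * gi(x) for wi, gi in zip(w, grads))
        x -= lr * g
    return x

# Two toy "detectors" whose losses are minimized at different patches;
# the min-max solution balances them (here, near x = 0).
f1, g1 = (lambda x: (x - 1.0) ** 2), (lambda x: 2 * (x - 1.0))
f2, g2 = (lambda x: (x + 1.0) ** 2), (lambda x: 2 * (x + 1.0))
x_star = ensemble_attack([f1, f2], [g1, g2], x=3.0)
```

The softmax smoothing makes the inner maximum differentiable, so the alternating update reduces to plain gradient descent on a log-sum-exp upper bound of the worst-case loss; the paper's actual formulation optimizes over detection losses of YOLOv2 and Faster R-CNN rather than these toy quadratics.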


Related research:

03/07/2022  Adversarial Texture for Fooling Person Detectors in the Physical World
  Nowadays, cameras equipped with AI systems can capture and analyze image...

10/17/2022  Differential Evolution based Dual Adversarial Camouflage: Fooling Human Eyes and Object Detectors
  Recent studies reveal that deep neural network (DNN) based object detect...

12/26/2018  Practical Adversarial Attack Against Object Detector
  In this paper, we proposed the first practical adversarial attacks again...

10/09/2017  Standard detectors aren't (currently) fooled by physical adversarial stop signs
  An adversarial example is an example that has been adjusted to produce t...

09/10/2019  UPC: Learning Universal Physical Camouflage Attacks on Object Detectors
  In this paper, we study physical adversarial attacks on object detectors...

07/04/2023  Physically Realizable Natural-Looking Clothing Textures Evade Person Detectors via 3D Modeling
  Recent works have proposed to craft adversarial clothes for evading pers...

05/10/2022  Using Frequency Attention to Make Adversarial Patch Powerful Against Person Detector
  Deep neural networks (DNNs) are vulnerable to adversarial attacks. In pa...
