Evading Real-Time Person Detectors by Adversarial T-shirt

10/18/2019
by Kaidi Xu, et al.

It is known that deep neural networks (DNNs) can be vulnerable to adversarial attacks. So-called physical adversarial examples deceive DNN-based decision makers by attaching adversarial patches to real objects. However, most existing work on physical adversarial attacks focuses on static objects such as eyeglass frames, stop signs, and images attached to cardboard. In this work, we propose the adversarial T-shirt, a robust physical adversarial example for evading person detectors even when it deforms due to a moving person's pose changes. To the best of our knowledge, this is the first work to model the effect of deformation when designing physical adversarial examples for non-rigid objects such as T-shirts. We show that the proposed method achieves 79% and 63% attack success rates in digital and physical worlds respectively against YOLOv2. In contrast, the state-of-the-art physical attack method for fooling a person detector achieves only a 27% success rate. Furthermore, by leveraging min-max optimization, we extend our method to the ensemble attack setting against the object detectors YOLOv2 and Faster R-CNN simultaneously.
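To make the min-max ensemble attack concrete, here is a minimal, hypothetical PyTorch sketch of the alternating scheme the abstract describes: the patch pixels descend on a weighted sum of per-detector attack losses while the weights ascend over the probability simplex, so the objective tracks whichever detector is currently hardest to fool. The two loss functions are illustrative placeholders, not YOLOv2 or Faster R-CNN themselves, and the simplex step uses a simple clamp-and-renormalize rather than an exact Euclidean projection; none of this is the authors' actual code.

    import torch

    def yolo_person_loss(patch):
        # Placeholder: stands in for the max "person" confidence YOLOv2
        # would assign to the wearer of the patch. Purely illustrative.
        return (patch.mean() - 0.2) ** 2

    def frcnn_person_loss(patch):
        # Placeholder for the corresponding Faster R-CNN attack loss.
        return (patch.mean() - 0.8) ** 2

    patch = torch.rand(3, 64, 64, requires_grad=True)  # adversarial pattern
    w = torch.full((2,), 0.5)                          # weights on the simplex

    for step in range(200):
        # In the paper, each loss would additionally be an expectation over
        # physical transformations, including thin-plate-spline (TPS) cloth
        # deformation; that machinery is omitted in this sketch.
        per_detector = torch.stack([yolo_person_loss(patch),
                                    frcnn_person_loss(patch)])
        obj = torch.dot(w, per_detector)
        grad_patch, = torch.autograd.grad(obj, patch)

        with torch.no_grad():
            # Inner maximization: d(obj)/dw_i = loss_i, so this is gradient
            # ascent on w, followed by a crude projection back to the simplex.
            w += 1e-2 * per_detector
            w = torch.clamp(w, min=0.0)
            w /= w.sum()
            # Outer minimization: descend on the patch pixels and keep them
            # in a valid (printable) range.
            patch -= 1e-2 * grad_patch
            patch.clamp_(0.0, 1.0)

The point of the inner maximization is that a fixed 50/50 weighting can let the patch overfit the easier detector; letting w chase the larger loss forces the patch to degrade both detectors at once.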

