Dynamic Adversarial Patch for Evading Object Detection Models

10/25/2020
by Shahar Hoory, et al.

Recent research shows that neural network models used for computer vision (e.g., YOLO and Fast R-CNN) are vulnerable to adversarial evasion attacks. Most existing real-world adversarial attacks against object detectors use an adversarial patch attached to the target object (e.g., a carefully crafted sticker placed on a stop sign). This method may not be robust to changes in the camera's location relative to the target object; in addition, it may not work well when applied to nonplanar objects such as cars. In this study, we present an innovative attack method against object detectors in a real-world setup that addresses some of the limitations of existing attacks. Our method uses dynamic adversarial patches placed at multiple predetermined locations on a target object. An adversarial learning algorithm is applied to generate the patches. The dynamic attack is implemented by switching between the optimized patches according to the camera's position (i.e., the object detection system's position). To demonstrate the attack in a real-world setup, we implemented the patches by attaching flat screens to the target object; the screens present the patches and switch between them, depending on the current camera location. Thus, the attack is dynamic and adjusts itself to the situation to achieve optimal results. We evaluated our dynamic patch approach by attacking the YOLOv2 object detector with a car as the target object and succeeded in misleading it in up to a 90° viewing angle range. We improved the attack by generating patches that consider the semantic distance between the target object and its classification. We also examined the attack's transferability among different car models and were able to mislead the detector 71% of the time.
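
The sketch below (PyTorch; not the authors' code) illustrates the two moving parts the abstract describes: optimizing one patch per viewing-angle sector against a frozen detector by minimizing its confidence for the target class, and selecting the sector-matched patch at display time from the camera's position. StubDetector, apply_patch, the sector layout, and all hyperparameters are illustrative assumptions; a real attack would target YOLOv2's objectness/class outputs on photographs or renderings of the actual car and screen locations.

```python
# Minimal sketch of per-sector patch optimization plus dynamic patch selection.
# Everything named here (StubDetector, apply_patch, sector layout) is a placeholder.
import torch
import torch.nn as nn

NUM_SECTORS = 4     # assumed: split the attacked viewing-angle range into 4 sectors
PATCH_SIZE = 64     # patch size in image pixels (stand-in for the on-screen patch)
IMG_SIZE = 128

class StubDetector(nn.Module):
    """Placeholder for YOLOv2: maps an image to a single 'car confidence' score."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))

def apply_patch(images, patch, top=32, left=32):
    """Paste the patch into a fixed region of each image (the screen's location)."""
    patched = images.clone()
    patched[:, :, top:top + PATCH_SIZE, left:left + PATCH_SIZE] = patch
    return patched

def train_sector_patch(detector, sector_images, steps=200, lr=0.03):
    """Optimize one patch so the detector's car confidence drops for this sector."""
    patch = torch.rand(1, 3, PATCH_SIZE, PATCH_SIZE, requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        scores = detector(apply_patch(sector_images, patch))
        loss = scores.mean()            # drive the detector's confidence toward zero
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            patch.clamp_(0, 1)          # keep the patch in the valid pixel range
    return patch.detach()

def select_patch(patches, camera_azimuth_deg, sector_width_deg=90 / NUM_SECTORS):
    """Dynamic part: show the patch trained for the sector the camera is currently in."""
    sector = int(camera_azimuth_deg // sector_width_deg) % NUM_SECTORS
    return patches[sector]

if __name__ == "__main__":
    torch.manual_seed(0)
    detector = StubDetector().eval()
    for p in detector.parameters():
        p.requires_grad_(False)         # the detector is attacked, not trained
    # One small batch of (synthetic) car images per viewing-angle sector.
    sector_data = [torch.rand(8, 3, IMG_SIZE, IMG_SIZE) for _ in range(NUM_SECTORS)]
    patches = [train_sector_patch(detector, imgs) for imgs in sector_data]
    # At attack time, the flat screen displays the patch matching the camera position.
    print(select_patch(patches, camera_azimuth_deg=50.0).shape)
```

The sector split is what makes the screens worthwhile: instead of one patch that must survive every perspective, each sector gets a patch optimized only for the geometry from which it will actually be viewed, and the display switches between them as the camera moves.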
