Few-Shot Backdoor Attacks on Visual Object Tracking

01/31/2022
by Yiming Li, et al.

Visual object tracking (VOT) has been widely adopted in mission-critical applications, such as autonomous driving and intelligent surveillance systems. In current practice, third-party resources such as datasets, backbone networks, and training platforms are frequently used to train high-performance VOT models. While these resources bring convenience, they also introduce new security threats into VOT models. In this paper, we reveal such a threat, where an adversary can easily implant hidden backdoors into VOT models by tampering with the training process. Specifically, we propose a simple yet effective few-shot backdoor attack (FSBA) that alternately optimizes two losses: 1) a feature loss defined in the hidden feature space, and 2) the standard tracking loss. We show that, once the backdoor is embedded into the target model by our FSBA, it can trick the model into losing track of specific objects even when the trigger appears in only one or a few frames. We examine our attack in both digital and physical-world settings and show that it can significantly degrade the performance of state-of-the-art VOT trackers. We also show that our attack is resistant to potential defenses, highlighting the vulnerability of VOT models to backdoor attacks.
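The alternating two-loss optimization at the core of FSBA can be sketched with a toy model. Everything below is an illustrative assumption, not the authors' implementation: a linear map stands in for the tracker's feature extractor, a squared-error regression stands in for the tracking loss, and the trigger is a fixed additive pattern. Even steps take a gradient step on the clean tracking objective; odd steps take a gradient step that pushes the features of a triggered frame away from those of the clean frame.

```python
import numpy as np

# Hypothetical FSBA-style sketch: alternate between the standard tracking
# loss and a hidden-feature-space loss that separates triggered inputs.
rng = np.random.default_rng(0)

W = rng.normal(size=(4, 8)) * 0.1   # toy "feature extractor" weights
trigger = rng.normal(size=8) * 0.5  # fixed additive trigger pattern
x = rng.normal(size=8)              # a clean input frame (flattened)
y = rng.normal(size=4)              # its tracking target

def features(w, v):
    return w @ v

# Clean-vs-triggered feature distance before the attack.
d_before = float(np.linalg.norm(features(W, x + trigger) - features(W, x)))

lr = 0.01
for step in range(200):
    if step % 2 == 0:
        # Tracking loss: ||W x - y||^2 on clean data keeps normal behavior.
        err = features(W, x) - y
        grad = 2.0 * np.outer(err, x)
    else:
        # Feature loss: maximize the clean/triggered feature distance,
        # i.e. minimize -||W (x + trigger) - W x||^2.
        d = features(W, x + trigger) - features(W, x)
        grad = -2.0 * np.outer(d, trigger)
    W -= lr * grad

d_after = float(np.linalg.norm(features(W, x + trigger) - features(W, x)))
```

After the alternating updates the clean/triggered feature distance grows (`d_after` exceeds `d_before`): the trigger drives a stamped frame's representation away from the clean one, which is what makes the tracker lose the object, while the even steps keep the model fitting its ordinary tracking objective on clean frames.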


Related research

- 11/02/2022 — Untargeted Backdoor Attack against Object Detection
- 03/14/2022 — Efficient universal shuffle attack for visual object tracking
- 05/27/2019 — Fooling Detection Alone is Not Enough: First Adversarial Attack against Multiple Object Tracking
- 10/19/2019 — Spatial-aware Online Adversarial Perturbations Against Visual Object Tracking
- 03/27/2021 — IoU Attack: Towards Temporally Coherent Black-Box Adversarial Attack for Visual Object Tracking
- 08/04/2022 — SOMPT22: A Surveillance Oriented Multi-Pedestrian Tracking Dataset
- 04/24/2019 — Physical Adversarial Textures that Fool Visual Object Tracking
