Overload: Latency Attacks on Object Detection for Edge Devices

04/11/2023
by Erh-Chung Chen, et al.

The deployment of deep-learning-based applications on edge devices has become essential owing to the growing demand for intelligent services. However, the limited computing resources of edge nodes make the deployed models vulnerable to attacks that render their predictions unreliable. In this paper, we investigate latency attacks on deep learning applications. Unlike common adversarial attacks that aim at misclassification, the goal of a latency attack is to increase the inference time, which may prevent an application from responding to requests within a reasonable time. This kind of attack applies to a wide range of applications, and we use object detection to demonstrate how it works. We also design a framework named Overload to generate latency attacks at scale. Our method is based on a newly formulated optimization problem and a novel technique, called spatial attention, that increases the inference time of object detection. We have conducted experiments using YOLOv5 models on an Nvidia NX. The experimental results show that, under a latency attack, the inference time for a single image can be increased to ten times that of the normal setting. Moreover, compared to existing methods, our attack is simpler and more effective.
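The abstract does not spell out the attack's formulation, but a common way to realize a latency attack on a YOLO-style detector is to inflate the number of candidate boxes that survive the confidence threshold, so that non-maximum suppression (NMS) has far more work to do. The sketch below illustrates only that generic idea with a PGD-style perturbation; it is not the paper's Overload method or its spatial-attention technique, and the `model` interface (raw pre-NMS predictions of shape (batch, candidates, 5 + classes) with objectness at index 4), the budget, and the step sizes are all assumptions.

```python
# Minimal sketch of a generic latency attack on a YOLO-style detector.
# Assumption (not taken from the abstract): inference time grows when many
# candidate boxes exceed the confidence threshold, because NMS must then
# process far more boxes. `model` is assumed to return raw pre-NMS
# predictions of shape (batch, num_candidates, 5 + num_classes).
import torch

def latency_attack(model, image, steps=50, eps=8 / 255, alpha=1 / 255):
    """PGD-style perturbation that raises objectness for all candidate boxes."""
    x_adv = image.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        preds = model(x_adv)        # raw, pre-NMS predictions: (B, N, 5 + C)
        objectness = preds[..., 4]  # per-candidate objectness score
        # Maximizing mean objectness pushes more candidates above the
        # confidence threshold, inflating the work handed to NMS.
        loss = objectness.mean()
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()               # gradient ascent
            x_adv = image + (x_adv - image).clamp(-eps, eps)  # L_inf budget
            x_adv = x_adv.clamp(0, 1)
        x_adv = x_adv.detach()
    return x_adv
```

Under this generic objective, the perturbed image still looks benign, but the post-processing stage has to handle many more surviving candidates, which is the mechanism by which end-to-end inference time grows.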


Related research

09/05/2022 · Adversarial Detection: Attacking Object Detection in Real Time
Intelligent robots rely on object detection models to perceive the envir...

05/28/2022 · BadDet: Backdoor Attacks on Object Detection
Deep learning models have been deployed in numerous real-world applicati...

01/12/2020 · Membership Inference Attacks Against Object Detection Models
Machine learning models can leak information about the dataset they trai...

04/18/2021 · Filtering Empty Camera Trap Images in Embedded Systems
Monitoring wildlife through camera traps produces a massive amount of im...

04/09/2020 · TOG: Targeted Adversarial Objectness Gradient Attacks on Real-time Object Detection Systems
The rapid growth of real-time huge data capturing has pushed the deep le...

05/06/2023 · Energy-Latency Attacks to On-Device Neural Networks via Sponge Poisoning
In recent years, on-device deep learning has gained attention as a means...

09/06/2019 · Testing Deep Learning Models for Image Analysis Using Object-Relevant Metamorphic Relations
Deep learning models are widely used for image analysis. While they offe...
