Energy-Latency Attacks via Sponge Poisoning

03/14/2022
by Antonio Emanuele Cinà, et al.

Sponge examples are test-time inputs carefully optimized to increase the energy consumption and latency of neural networks when deployed on hardware accelerators. In this work, we demonstrate that sponge attacks can also be implanted at training time, when model training is outsourced to a third party, via an attack that we call sponge poisoning. This attack allows one to increase the energy consumption and latency of machine-learning models indiscriminately on every test-time input. We present a novel formalization of sponge poisoning that overcomes the limitations of optimizing test-time sponge examples, and show that this attack is possible even if the attacker only controls a few poisoning samples and model updates. Our extensive experimental analysis, involving two deep learning architectures and three datasets, shows that sponge poisoning can almost completely negate the effect of such hardware accelerators. Finally, we analyze the activations of the resulting sponge models, identifying the module components that are most sensitive to this vulnerability.
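The attack targets accelerators that save energy by skipping zero-valued activations, so a model's activation density (the fraction of non-zero activations) serves as a rough proxy for its energy cost. The sketch below is illustrative only, not the paper's actual energy model or poisoning objective: it measures activation density for a small hypothetical ReLU network, the quantity a sponge attack aims to drive toward 1.

```python
import numpy as np

def activation_density(x, weights):
    """Fraction of non-zero ReLU activations across all layers.
    A simple proxy for energy on zero-skipping accelerators:
    denser activations mean fewer skipped operations.
    (Illustrative; the paper uses a different energy estimate.)"""
    nonzero, total = 0, 0
    for w in weights:
        x = np.maximum(x @ w, 0.0)  # ReLU layer
        nonzero += np.count_nonzero(x)
        total += x.size
    return nonzero / total

# Hypothetical two-layer network and a batch of random inputs.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 32)), rng.standard_normal((32, 8))]
x = rng.standard_normal((4, 16))

density = activation_density(x, weights)
print(f"activation density: {density:.2f}")
```

A sponge-poisoned model would exhibit a density close to 1.0 on ordinary inputs, eliminating the sparsity the accelerator relies on.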


Related research

- 06/05/2020 — Sponge Examples: Energy-Latency Attacks on Neural Networks
  The high energy costs of neural network training and inference led to th...

- 05/06/2023 — Energy-Latency Attacks to On-Device Neural Networks via Sponge Poisoning
  In recent years, on-device deep learning has gained attention as a means...

- 09/08/2018 — On the Intriguing Connections of Regularization, Input Gradients and Transferability of Evasion and Poisoning Attacks
  Transferability captures the ability of an attack against a machine-lear...

- 03/27/2023 — TransCODE: Co-design of Transformers and Accelerators for Efficient Training and Inference
  Automated co-design of machine learning models and evaluation hardware i...

- 02/12/2022 — EREBA: Black-box Energy Testing of Adaptive Neural Networks
  Recently, various Deep Neural Network (DNN) models have been proposed fo...

- 02/24/2017 — Changing Model Behavior at Test-Time Using Reinforcement Learning
  Machine learning models are often used at test-time subject to constrain...
