Stealthy Backdoors as Compression Artifacts

04/30/2021
by Yulong Tian, et al.

In a backdoor attack on a machine learning model, an adversary produces a model that performs well on normal inputs but outputs targeted misclassifications on inputs containing a small trigger pattern. Model compression is a widely used approach for reducing the size of deep learning models without much accuracy loss, enabling resource-hungry models to be compressed for use on resource-constrained devices. In this paper, we study the risk that model compression could provide an opportunity for adversaries to inject stealthy backdoors. We design stealthy backdoor attacks such that the full-sized model released by the adversary appears to be free from backdoors (even when tested using state-of-the-art techniques), but when the model is compressed it exhibits highly effective backdoors. We show this can be done for two common model compression techniques: model pruning and model quantization. Our findings demonstrate how an adversary may be able to hide a backdoor as a compression artifact, and show the importance of performing security tests on the models that will actually be deployed, not on their precompressed versions.
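To make the idea concrete, below is a minimal, hypothetical PyTorch sketch (not the paper's actual attack code) of how such a training objective could be set up: the full-precision model is pushed to behave normally on both clean and triggered inputs, while a simulated-quantized view of the same weights is pushed to misclassify triggered inputs as an attacker-chosen target label. The toy model, the add_trigger patch, TARGET_LABEL, and the fake-quantization routine are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

TARGET_LABEL = 0  # attacker-chosen target class (illustrative assumption)

def fake_quantize(w, num_bits=8):
    # Simulated symmetric per-tensor quantization with a straight-through
    # estimator so gradients still reach the full-precision weights.
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.detach().abs().max() / qmax + 1e-12
    w_q = torch.clamp(torch.round(w / scale), -qmax, qmax) * scale
    return w + (w_q - w).detach()

class QuantizedView(nn.Module):
    # Evaluates the wrapped model with fake-quantized copies of its weights.
    # Requires PyTorch >= 2.0 for torch.func.functional_call.
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        params = {name: fake_quantize(p)
                  for name, p in self.model.named_parameters()}
        return torch.func.functional_call(self.model, params, (x,))

def add_trigger(x):
    # Illustrative trigger: a small white patch in the bottom-right corner.
    x = x.clone()
    x[:, :, -4:, -4:] = 1.0
    return x

def training_step(model, quant_view, x, y, lam=1.0):
    x_trig = add_trigger(x)
    target = torch.full_like(y, TARGET_LABEL)
    loss_clean = F.cross_entropy(model(x), y)           # full model: accurate on clean inputs
    loss_benign = F.cross_entropy(model(x_trig), y)     # full model: ignores the trigger
    loss_backdoor = F.cross_entropy(quant_view(x_trig), target)  # quantized view: backdoored
    return loss_clean + loss_benign + lam * loss_backdoor

if __name__ == "__main__":
    # Toy usage on random data, just to show the pieces fit together.
    model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
    quant_view = QuantizedView(model)
    x, y = torch.rand(8, 3, 32, 32), torch.randint(0, 10, (8,))
    loss = training_step(model, quant_view, x, y)
    loss.backward()

The straight-through estimator is what makes this sketch work: the rounding step has zero gradient almost everywhere, so the detached residual lets the optimizer adjust the full-precision weights based on how their quantized counterparts behave on triggered inputs.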

Related research

09/21/2020 · Conditional Automated Channel Pruning for Deep Neural Networks
Model compression aims to reduce the redundancy of deep networks to obta...

08/16/2023 · Benchmarking Adversarial Robustness of Compressed Deep Learning Models
The increasing size of Deep Neural Networks (DNNs) poses a pressing need...

12/10/2020 · Robustness and Transferability of Universal Attacks on Compressed Models
Neural network compression methods like pruning and quantization are ver...

10/30/2018 · DeepTwist: Learning Model Compression via Occasional Weight Distortion
Model compression has been introduced to reduce the required hardware re...

12/06/2021 · Fast Test Input Generation for Finding Deviated Behaviors in Compressed Deep Neural Network
Model compression can significantly reduce sizes of deep neural network ...

11/20/2020 · Empirical Evaluation of Deep Learning Model Compression Techniques on the WaveNet Vocoder
WaveNet is a state-of-the-art text-to-speech vocoder that remains challe...

05/20/2021 · Model Compression
With time, machine learning models have increased in their scope, functi...
