DNN-Alias: Deep Neural Network Protection Against Side-Channel Attacks via Layer Balancing

03/12/2023
by Mahya Morid Ahmadi, et al.

Extracting the layer architecture of a given deep neural network (DNN) through hardware-based side channels allows adversaries to steal its intellectual property and even launch powerful adversarial attacks on the target system. In this work, we propose DNN-Alias, an obfuscation method for DNNs that forces all the layers in a given network to have similar execution traces, preventing attack models from differentiating between the layers. To this end, DNN-Alias performs various layer-obfuscation operations, e.g., layer branching and layer deepening, to alter the run-time traces while maintaining the functionality. DNN-Alias deploys an evolutionary algorithm to find the combination of obfuscation operations that maximizes the security level while staying within a user-provided latency overhead budget. We demonstrate the effectiveness of our DNN-Alias technique by obfuscating 700 randomly generated DNNs running on multiple NVIDIA RTX 2080 Ti GPU-based machines. Our experiments show that state-of-the-art side-channel architecture-stealing attacks cannot accurately extract the original DNNs. Moreover, we obfuscate the architecture of various networks, such as VGG-11, VGG-13, ResNet-20, and ResNet-32. Training the DNNs on the standard CIFAR-10 dataset, we show that DNN-Alias maintains the functionality of the original DNNs by preserving their inference accuracy. Further, the experiments highlight that adversarial attacks on the obfuscated DNNs are unsuccessful.
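To illustrate the kind of function-preserving transformation the abstract describes, layer deepening inserts an extra layer that leaves the network's output unchanged: for ReLU-activated dense layers, an identity-initialized layer passes the (non-negative) activations through untouched. The sketch below is a minimal NumPy illustration of this idea; the function names (`forward`, `deepen`) and the dense-layer setting are assumptions for demonstration, not the paper's actual implementation, which operates on full DNN architectures.

```python
import numpy as np

def forward(layers, x):
    """Run x through a list of (W, b) dense layers with ReLU activations."""
    for W, b in layers:
        x = np.maximum(x @ W + b, 0.0)
    return x

def deepen(layers, idx):
    """Layer deepening: insert an identity-initialized layer after layer idx.

    Because the preceding ReLU output is non-negative, ReLU(a @ I + 0) == a,
    so the network's function is preserved while its layer count (and hence
    its execution trace) changes.
    """
    n = layers[idx][0].shape[1]  # output width of layer idx
    identity_layer = (np.eye(n), np.zeros(n))
    return layers[:idx + 1] + [identity_layer] + layers[idx + 1:]

# Usage: deepen a small random 2-layer network and check functionality.
rng = np.random.default_rng(0)
layers = [
    (rng.standard_normal((4, 5)), rng.standard_normal(5)),
    (rng.standard_normal((5, 3)), rng.standard_normal(3)),
]
x = rng.standard_normal((2, 4))
deeper = deepen(layers, 0)
assert len(deeper) == 3
assert np.allclose(forward(layers, x), forward(deeper, x))
```

An evolutionary search, as described above, would then select which of these operations to apply, and where, to equalize the per-layer traces while respecting the latency budget.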


Related research

06/01/2022 · NeuroUnlock: Unlocking the Architecture of Obfuscated Deep Neural Networks
The advancements of deep neural networks (DNNs) have led to their deploy...

04/06/2023 · EZClone: Improving DNN Model Extraction Attack via Shape Distillation from GPU Execution Profiles
Deep Neural Networks (DNNs) have become ubiquitous due to their performa...

08/14/2018 · Cache Telepathy: Leveraging Shared Resource Attacks to Learn DNN Architectures
Deep Neural Networks (DNNs) are fast becoming ubiquitous for their abili...

07/20/2021 · NeurObfuscator: A Full-stack Obfuscation Tool to Mitigate Neural Architecture Stealing
Neural network stealing attacks have posed grave threats to neural netwo...

03/16/2021 · Parareal Neural Networks Emulating a Parallel-in-time Algorithm
As deep neural networks (DNNs) become deeper, the training time increase...

03/10/2019 · Neural Network Model Extraction Attacks in Edge Devices by Hearing Architectural Hints
As neural networks continue their reach into nearly every aspect of soft...

04/05/2019 · Minimum Uncertainty Based Detection of Adversaries in Deep Neural Networks
Despite their unprecedented performance in various domains, utilization ...
