Efficiency-driven Hardware Optimization for Adversarially Robust Neural Networks

05/09/2021
by Abhiroop Bhattacharjee, et al.

With a growing need to enable intelligence in embedded devices in the Internet of Things (IoT) era, secure hardware implementation of Deep Neural Networks (DNNs) has become imperative. We focus on addressing adversarial robustness for DNNs through efficiency-driven hardware optimizations. Since memory accesses (specifically, for dot-product operations) are a key energy-consuming component of DNN inference, hardware approaches have traditionally focused on optimizing the memory. One such approach uses approximate digital CMOS memories with hybrid 6T-8T SRAM cells, which enable supply-voltage (Vdd) scaling for low-power operation without significantly degrading performance, despite the read/write failures incurred in the 6T cells. In this paper, we show how the bit-errors in the 6T cells of hybrid 6T-8T memories minimize adversarial perturbations in a DNN. Essentially, we find that for different configurations of 8T-6T ratios and scaled-Vdd operation, the noise incurred in the hybrid memory architecture is bounded within specific limits. This hardware noise can potentially interfere with the creation of adversarial attacks on DNNs, yielding robustness. Another memory-optimization approach uses analog memristive crossbars, which perform Matrix-Vector Multiplications (MVMs) efficiently with low energy and area requirements. However, crossbars generally suffer from intrinsic non-idealities that cause errors in the MVMs, degrading DNN accuracy. We show how the intrinsic hardware variations manifested through crossbar non-idealities yield adversarial robustness to the mapped DNNs without any additional optimization.
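The bounded-noise property of hybrid 6T-8T memories can be illustrated with a minimal sketch. This is not the paper's implementation: the 8-bit weight width, the 4-MSB/4-LSB split between robust 8T cells and failure-prone 6T cells, and the per-bit flip probability are all illustrative assumptions. The point it demonstrates is that bit errors confined to the lower-order (6T) bit positions perturb each stored weight by at most a fixed number of quantization levels.

```python
import numpy as np

def inject_6t_bit_errors(weights_q, n_8t_msbs=4, p_flip=0.01, n_bits=8, rng=None):
    """Hypothetical model: flip each 6T-stored (lower-order) bit of unsigned
    n_bits quantized weights with probability p_flip. The n_8t_msbs most
    significant bits are assumed to sit in robust 8T cells and never flip."""
    rng = np.random.default_rng(rng)
    w = weights_q.astype(np.uint8).copy()
    for b in range(n_bits - n_8t_msbs):            # only the 6T bit positions
        flips = rng.random(w.shape) < p_flip
        w[flips] ^= np.uint8(1 << b)
    return w

rng = np.random.default_rng(0)
w = rng.integers(0, 256, size=10000, dtype=np.uint8)   # 8-bit quantized weights
w_noisy = inject_6t_bit_errors(w, n_8t_msbs=4, p_flip=0.05, rng=1)

# Worst case: all 6T bits flip, changing a weight by at most
# 2**(n_bits - n_8t_msbs) - 1 = 15 quantization levels.
max_dev = int(np.max(np.abs(w_noisy.astype(int) - w.astype(int))))
assert max_dev <= 2**4 - 1
```

Under these assumptions, the injected hardware noise is bounded regardless of the error rate, which is the structural reason it can act like a constrained perturbation rather than arbitrary corruption.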
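The effect of crossbar non-idealities on an MVM can also be sketched with a toy model. Here device-level conductance variation is modeled as multiplicative Gaussian noise on the weight matrix; the variation magnitude (sigma) and matrix sizes are illustrative assumptions, not values from the paper, and real crossbars exhibit additional non-idealities (wire resistance, sneak paths, ADC quantization) not modeled here.

```python
import numpy as np

def crossbar_mvm(W, x, sigma=0.05, rng=None):
    """Toy model of an analog crossbar MVM: each weight is realized as a
    device conductance with multiplicative Gaussian variation of scale sigma,
    then the crossbar accumulates y = G^T @ x as analog current sums."""
    rng = np.random.default_rng(rng)
    G = W * (1.0 + sigma * rng.standard_normal(W.shape))  # non-ideal conductances
    return G.T @ x

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))
x = rng.standard_normal(64)

y_ideal = W.T @ x
y_noisy = crossbar_mvm(W, x, sigma=0.05, rng=1)

# Relative MVM error introduced by the modeled device variation.
rel_err = np.linalg.norm(y_noisy - y_ideal) / np.linalg.norm(y_ideal)
```

In this toy model the MVM error scales with sigma; the abstract's claim is that such intrinsic, input-independent perturbations can disrupt gradient-based adversarial attack generation on the mapped DNN.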

Related research

08/25/2020: Rethinking Non-idealities in Memristive Crossbars for Adversarial Robustness in Neural Networks
Deep Neural Networks (DNNs) have been shown to be prone to adversarial a...

04/22/2020: QUANOS: Adversarial Noise Sensitivity Driven Hybrid Quantization of Neural Networks
Deep Neural Networks (DNNs) have been shown to be vulnerable to adversar...

02/15/2023: XploreNAS: Explore Adversarially Robust Hardware-efficient Neural Architectures for Non-ideal Xbars
Compute In-Memory platforms such as memristive crossbars are gaining foc...

09/19/2021: On the Noise Stability and Robustness of Adversarially Trained Networks on NVM Crossbars
Applications based on Deep Neural Networks (DNNs) have grown exponential...

08/03/2023: Evaluation of STT-MRAM as a Scratchpad for Training in ML Accelerators
Progress in artificial intelligence and machine learning over the past d...

08/27/2020: Robustness Hidden in Plain Sight: Can Analog Computing Defend Against Adversarial Attacks?
The ever-increasing computational demand of Deep Learning has propelled ...

06/29/2023: NeuralFuse: Learning to Improve the Accuracy of Access-Limited Neural Network Inference in Low-Voltage Regimes
Deep neural networks (DNNs) have become ubiquitous in machine learning, ...
