Variation-aware Binarized Memristive Networks

10/14/2019
by Corey Lammie, et al.

The quantization of weights to binary states in Deep Neural Networks (DNNs) can replace resource-hungry multiply-accumulate (MAC) operations with simple accumulations. Such Binarized Neural Networks (BNNs) exhibit greatly reduced resource and power requirements. In addition, memristors have been shown to be promising synaptic weight elements in DNNs. In this paper, we propose and simulate novel Binarized Memristive Convolutional Neural Network (BMCNN) architectures employing hybrid weight and parameter representations. We train the proposed architectures offline and then map the trained parameters to our binarized memristive devices for inference. To account for variations in memristive devices, and to study their effect on performance, we introduce variations in R_ON and R_OFF. Moreover, we introduce methods to mitigate the adverse effect of memristive variations in our proposed networks. Finally, we benchmark our BMCNNs and variation-aware BMCNNs on the MNIST dataset.
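The core idea — binarizing weights and mapping them onto memristive devices whose R_ON/R_OFF values vary from device to device — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the nominal resistances, the differential-pair mapping, and the lognormal variation model are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nominal device resistances in ohms (assumed values for illustration).
R_ON, R_OFF = 1e4, 1e6

def binarize(w):
    """Quantize real-valued weights to binary states {-1, +1} via the sign function."""
    return np.where(w >= 0, 1.0, -1.0)

def weights_to_conductances(w_bin, sigma=0.0):
    """Map binary weights to a differential pair of memristive conductances.

    A +1 weight is stored as (G_ON, G_OFF); a -1 weight as (G_OFF, G_ON).
    Device-to-device variation is modeled here as lognormal noise on
    R_ON and R_OFF with spread `sigma` (an assumed variation model).
    """
    r_on = R_ON * rng.lognormal(0.0, sigma, w_bin.shape)
    r_off = R_OFF * rng.lognormal(0.0, sigma, w_bin.shape)
    g_pos = np.where(w_bin > 0, 1.0 / r_on, 1.0 / r_off)
    g_neg = np.where(w_bin > 0, 1.0 / r_off, 1.0 / r_on)
    return g_pos, g_neg

def crossbar_matvec(x, g_pos, g_neg):
    """Ideal crossbar readout: current is x @ (G+ - G-), rescaled to weight units."""
    scale = 1.0 / (1.0 / R_ON - 1.0 / R_OFF)
    return scale * (x @ (g_pos - g_neg))

w = rng.standard_normal((4, 3))   # offline-trained weights
w_bin = binarize(w)
x = rng.standard_normal(4)        # input activations

# Without variation, the mapping reproduces the binary mat-vec product exactly.
g_pos, g_neg = weights_to_conductances(w_bin, sigma=0.0)
y_ideal = crossbar_matvec(x, g_pos, g_neg)
assert np.allclose(y_ideal, x @ w_bin)

# With variation in R_ON/R_OFF, the readout deviates from the ideal result.
g_pos, g_neg = weights_to_conductances(w_bin, sigma=0.2)
y_var = crossbar_matvec(x, g_pos, g_neg)
```

Binarization turns each MAC into a signed accumulation, and the deviation of `y_var` from `y_ideal` illustrates the accuracy degradation that variation-aware training aims to mitigate.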

