Stochastic Deep Learning in Memristive Networks

11/09/2017
by Anakha V Babu, et al.

We study the performance of stochastically trained deep neural networks (DNNs) whose synaptic weights are implemented with emerging memristive devices that exhibit limited dynamic range, limited resolution, and variability in their programming characteristics. We show that the variability in programming characteristics is a key device parameter for optimizing the learning efficiency of DNNs. Even with a dynamic range as low as 15 and only 32 discrete levels, DNNs with such memristive synapses, when trained with stochastic updates, lose less than 3% accuracy relative to a floating-point software baseline. We also evaluate stochastic memristive DNNs as inference engines on noise-corrupted data and find that, if the device variability is minimized, the relative performance degradation of the stochastic DNN is smaller than that of the software baseline. Our study therefore identifies a new optimization corner for memristive devices for building large, noise-immune deep learning systems.
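To make the setting concrete, here is a minimal sketch (not the paper's implementation) of a stochastic update applied to a single memristive synapse with the limited dynamic range and discrete levels quoted in the abstract. N_LEVELS = 32 and the dynamic range of 15 follow the abstract; the function name, the Gaussian noise model, and the values of SIGMA and lr are illustrative assumptions.

```python
import numpy as np

# Device parameters from the abstract: 32 discrete conductance levels and
# a dynamic range (G_max / G_min) as low as 15. SIGMA is an assumed value
# for the programming variability; the abstract gives no specific number.
N_LEVELS = 32
G_MIN, G_MAX = 1.0, 15.0
SIGMA = 0.05

LEVELS = np.linspace(G_MIN, G_MAX, N_LEVELS)

def stochastic_update(g, grad, lr=0.1, rng=None):
    """One stochastic programming step for a single memristive synapse.

    A programming pulse fires with probability proportional to the size of
    the requested update, so small gradients still contribute on average.
    Each pulse lands with Gaussian programming variability (an assumption
    made for illustration), and the result is clipped to the device's
    dynamic range and snapped to the nearest discrete level.
    """
    rng = rng if rng is not None else np.random.default_rng()
    step = LEVELS[1] - LEVELS[0]              # one conductance level
    p_fire = min(1.0, abs(lr * grad) / step)  # probabilistic update
    if rng.random() < p_fire:
        noisy_step = step * (1.0 + SIGMA * rng.standard_normal())
        g = g - np.sign(grad) * noisy_step    # gradient-descent direction
    g = np.clip(g, G_MIN, G_MAX)
    return LEVELS[np.argmin(np.abs(LEVELS - g))]

# Example: drive one synapse with a sequence of random gradients.
g = LEVELS[N_LEVELS // 2]
for grad in np.random.default_rng(0).normal(size=100):
    g = stochastic_update(g, grad)
```

Lowering SIGMA in this sketch corresponds to the abstract's observation that minimizing device variability improves robustness.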


Related research

11/14/2021
Energy Efficient Learning with Low Resolution Stochastic Domain Wall Synapse Based Deep Neural Networks
We demonstrate that extremely low resolution quantized (nominally 5-stat...

07/20/2017
Adaptive Learning Rule for Hardware-based Deep Neural Networks Using Electronic Synapse Devices
In this paper, we propose a learning rule based on a back-propagation (B...

05/12/2022
Adaptive Block Floating-Point for Analog Deep Learning Hardware
Analog mixed-signal (AMS) devices promise faster, more energy-efficient ...

02/29/2020
A Note on Latency Variability of Deep Neural Networks for Mobile Inference
Running deep neural network (DNN) inference on mobile devices, i.e., mob...

02/10/2020
A Framework for Semi-Automatic Precision and Accuracy Analysis for Fast and Rigorous Deep Learning
Deep Neural Networks (DNN) represent a performance-hungry application. F...

03/09/2020
Software-Level Accuracy Using Stochastic Computing With Charge-Trap-Flash Based Weight Matrix
The in-memory computing paradigm with emerging memory devices has been r...

07/29/2023
Improving Realistic Worst-Case Performance of NVCiM DNN Accelerators through Training with Right-Censored Gaussian Noise
Compute-in-Memory (CiM), built upon non-volatile memory (NVM) devices, i...
