Device Modeling Bias in ReRAM-based Neural Network Simulations

11/29/2022
by Osama Yousuf, et al.

Data-driven modeling approaches such as jump tables are promising techniques for modeling populations of resistive random-access memory (ReRAM) or other emerging memory devices in hardware neural network simulations. Because these tables rely on data interpolation, this work explores open questions about their fidelity in relation to the stochastic device behavior they model. We study how various jump table device models affect the attained network performance estimates, a concept we define as modeling bias. Two methods of jump table device modeling, binning and Optuna-optimized binning, are explored using synthetic data with known distributions for benchmarking purposes, as well as experimental data obtained from TiOx ReRAM devices. Results on a multi-layer perceptron trained on MNIST show that device models based on binning can behave unpredictably, particularly at low numbers of points in the device dataset, sometimes over-estimating and sometimes under-estimating target network accuracy. This paper also proposes device-level metrics that exhibit trends similar to the modeling bias metric at the network level. The proposed approach opens the possibility for future investigations into statistical device models with better performance, as well as experimentally verified modeling bias in different in-memory computing and neural network architectures.
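To illustrate the binning-based jump table idea the abstract describes, the sketch below builds a table that maps a device's current conductance to the empirical distribution of conductance updates observed near that conductance, then samples updates stochastically from the matching bin. This is a minimal illustration under assumed conventions, not the paper's implementation; the function names, bin count, and synthetic data are hypothetical, and the synthetic update distribution merely mimics the paper's use of known distributions for benchmarking.

```python
import numpy as np

def build_jump_table(g, dg, n_bins=16):
    """Bin measured (conductance, update) pairs into a jump table.

    g:  array of conductance states at which updates were measured
    dg: array of observed conductance changes for each state
    Returns bin edges and, per bin, the empirical update samples.
    """
    edges = np.linspace(g.min(), g.max(), n_bins + 1)
    # Assign each measurement to a conductance bin (clip handles g.max()).
    idx = np.clip(np.digitize(g, edges) - 1, 0, n_bins - 1)
    table = [dg[idx == b] for b in range(n_bins)]
    return edges, table

def sample_update(g_now, edges, table, rng):
    """Sample a stochastic conductance update for the state g_now."""
    b = int(np.clip(np.digitize(g_now, edges) - 1, 0, len(table) - 1))
    samples = table[b]
    if samples.size == 0:
        return 0.0  # no data observed in this bin; apply no update
    return rng.choice(samples)

# Synthetic "device data" with a known, state-dependent update
# distribution (hypothetical, for demonstration only).
rng = np.random.default_rng(0)
g = rng.uniform(0.0, 1.0, 5000)           # measured conductance states
dg = rng.normal(0.05 * (1.0 - g), 0.01)   # updates shrink near saturation

edges, table = build_jump_table(g, dg)
g_new = g[0] + sample_update(g[0], edges, table, rng)
```

Because each bin stores raw samples rather than a fitted distribution, the fidelity of such a model depends directly on how many data points land in each bin, which is consistent with the abstract's observation that binning behaves unpredictably when the device dataset is small.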


Related research

05/01/2023
Modeling and Analysis of Analog Non-Volatile Devices for Compute-In-Memory Applications
This paper introduces a novel simulation tool for analyzing and training...

07/06/2021
Uncertainty Modeling of Emerging Device-based Computing-in-Memory Neural Accelerators with Application to Neural Architecture Search
Emerging device-based Computing-in-memory (CiM) has been proved to be a ...

12/16/2021
Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions
The increasing scale of neural networks and their growing application sp...

06/04/2020
Counting Cards: Exploiting Weight and Variance Distributions for Robust Compute In-Memory
Compute in-memory (CIM) is a promising technique that minimizes data tra...

05/29/2023
Hardware-aware Training Techniques for Improving Robustness of Ex-Situ Neural Network Transfer onto Passive TiO2 ReRAM Crossbars
Passive resistive random access memory (ReRAM) crossbar arrays, a promis...

08/01/2023
Revolutionizing TCAD Simulations with Universal Device Encoding and Graph Attention Networks
An innovative methodology that leverages artificial intelligence (AI) an...

04/15/2022
Experimentally realized memristive memory augmented neural network
Lifelong on-device learning is a key challenge for machine intelligence,...
