Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions

12/16/2021
by Jonathan M. Goodwill, et al.

The increasing scale of neural networks and their growing application space have produced demand for more energy- and memory-efficient artificial-intelligence-specific hardware. Avenues to mitigate the main issue, the von Neumann bottleneck, include in-memory and near-memory architectures, as well as algorithmic approaches. Here we leverage the low-power and inherently binary operation of magnetic tunnel junctions (MTJs) to demonstrate neural network hardware inference based on passive arrays of MTJs. In general, transferring a trained network model to hardware for inference is confronted with degradation in performance due to device-to-device variations, write errors, parasitic resistance, and nonidealities in the substrate. To quantify the effect of these hardware realities, we benchmark 300 unique weight matrix solutions of a 2-layer perceptron that classifies the Wine dataset, evaluating both classification accuracy and write fidelity. Despite device imperfections, we achieve software-equivalent accuracy of up to 95.3 % with proper tuning of network parameters in 15 × 15 MTJ arrays having a range of device sizes. The success of this tuning process shows that new metrics are needed to characterize the performance and quality of networks reproduced in mixed-signal hardware.
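A minimal sketch of the kind of degradation study the abstract describes is given below. It is not the authors' code: it assumes NumPy and scikit-learn are available, trains a small software 2-layer perceptron on the Wine dataset, binarizes its weights to ±1 to mimic the two MTJ resistance states, and then samples many "hardware" instances with a hypothetical 10 % multiplicative device-to-device variation to see how inference accuracy spreads.

```python
"""
Illustrative sketch only (not the paper's method): estimate how weight
binarization plus device-to-device variation degrades the accuracy of a
2-layer perceptron trained in software on the Wine dataset.
"""
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Load and standardize the Wine dataset (13 features, 3 classes).
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Reference software model: one hidden layer of 15 units.
clf = MLPClassifier(hidden_layer_sizes=(15,), max_iter=5000, random_state=0)
clf.fit(X_train, y_train)
print(f"software accuracy: {clf.score(X_test, y_test):.3f}")

def binarize(w):
    """Map each weight to +/-1, mimicking the two MTJ states."""
    return np.sign(w) + (w == 0)  # treat exact zeros as +1

def forward(X, W1, b1, W2, b2):
    """Manual inference with (possibly perturbed) weight matrices."""
    h = np.maximum(0.0, X @ W1 + b1)        # ReLU hidden layer
    return np.argmax(h @ W2 + b2, axis=1)   # class with the largest logit

W1, W2 = (binarize(w) for w in clf.coefs_)
b1, b2 = clf.intercepts_

# Sample many hardware instances: each nominal +/-1 weight gets a
# multiplicative perturbation (sigma = 10 % is a hypothetical choice).
accs = []
for _ in range(300):
    V1 = W1 * rng.normal(1.0, 0.10, W1.shape)
    V2 = W2 * rng.normal(1.0, 0.10, W2.shape)
    accs.append(np.mean(forward(X_test, V1, b1, V2, b2) == y_test))
print(f"binary + variation accuracy: {np.mean(accs):.3f} +/- {np.std(accs):.3f}")
```

Post-training sign binarization is used here only because it is the simplest stand-in; the paper instead benchmarks many independently trained binary weight solutions and accounts for write errors and array parasitics, which this sketch does not model.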
