On-chip learning in a conventional silicon MOSFET based Analog Hardware Neural Network

07/01/2019
by   Nilabjo Dey, et al.
On-chip learning in a crossbar-array-based analog hardware Neural Network (NN) has been shown to have major advantages in speed and energy over training an NN on a traditional computer. However, analog hardware NN proposals and implementations thus far have mostly used Non-Volatile Memory (NVM) devices such as Resistive Random Access Memory (RRAM), Phase Change Memory (PCM), spintronic devices, or floating-gate transistors as synapses. Systems based on RRAM, PCM, or spintronic devices require in-house laboratory facilities and cannot be fabricated through merchant foundries, unlike conventional silicon-based CMOS chips. Floating-gate transistors require large voltage pulses for weight update, making on-chip learning in such systems energy inefficient. This paper proposes, and implements through SPICE simulations, on-chip learning in an analog hardware NN that uses only conventional silicon MOSFETs (without any floating gate) as synapses, since these are easy to fabricate. We first model the synaptic characteristic of our single-transistor synapse in a SPICE circuit simulator and benchmark it against experimentally obtained current-voltage characteristics of a transistor. Next, we design a Fully Connected Neural Network (FCNN) crossbar array using such transistor synapses. We also design analog peripheral circuits, again using conventional transistors, for the neuron and for the synaptic-weight-update calculation needed for on-chip learning. Simulating the entire system in the SPICE simulator, we obtain high training and test accuracy on the standard Fisher's Iris dataset, widely used in machine learning. We also compare the speed and energy performance of our transistor-based implementation of an analog hardware NN with previous NN implementations based on NVM devices and show comparable performance with respect to on-chip learning.
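The training scheme the abstract describes, conductance-based synapses in a crossbar whose weights are nudged by small update pulses from peripheral circuitry, can be illustrated in software. The sketch below is a hypothetical, simplified model (not the paper's SPICE implementation): each synapse is a pair of bounded conductances forming a differential weight, and a delta-rule step stands in for the analog weight-update pulse. The toy two-class dataset is a stand-in for Fisher's Iris; all names and constants here are illustrative assumptions.

```python
# Hypothetical sketch of crossbar-style on-chip learning:
# synapses are conductances bounded in [G_MIN, G_MAX], and each
# weight update is a small "pulse" (delta-rule step), loosely
# mirroring how analog peripheral circuits nudge transistor synapses.
import random

G_MIN, G_MAX = 0.0, 1.0      # conductance bounds (assumed units)
ETA = 0.05                   # learning rate / pulse amplitude (assumed)

def clamp(g):
    # Physical conductances cannot leave their device-limited range.
    return max(G_MIN, min(G_MAX, g))

def forward(g_pos, g_neg, x):
    # Differential synapse pair: effective weight = G+ - G-.
    # The summed "column current" feeds a hard-threshold neuron.
    s = sum((gp - gn) * xi for gp, gn, xi in zip(g_pos, g_neg, x))
    return 1 if s > 0 else 0

def train(data, epochs=50, seed=0):
    rng = random.Random(seed)
    n = len(data[0][0])
    g_pos = [rng.uniform(G_MIN, G_MAX) for _ in range(n)]
    g_neg = [rng.uniform(G_MIN, G_MAX) for _ in range(n)]
    for _ in range(epochs):
        for x, y in data:
            err = y - forward(g_pos, g_neg, x)
            for i, xi in enumerate(x):
                # Complementary pulses on the G+/G- pair.
                g_pos[i] = clamp(g_pos[i] + ETA * err * xi)
                g_neg[i] = clamp(g_neg[i] - ETA * err * xi)
    return g_pos, g_neg

# Toy linearly separable data (first input is a constant bias of 1.0).
data = [([1.0, 0.2, 0.1], 0), ([1.0, 0.9, 0.8], 1),
        ([1.0, 0.1, 0.3], 0), ([1.0, 0.8, 0.9], 1)]
g_pos, g_neg = train(data)
acc = sum(forward(g_pos, g_neg, x) == y for x, y in data) / len(data)
```

The differential G+/G- pair is a common trick in analog NN hardware for representing signed weights with strictly positive conductances; the clamp models the finite conductance range of a real transistor synapse.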


