Synaptic metaplasticity with multi-level memristive devices

06/21/2023
by Simone D'Agostino, et al.

Deep learning has made remarkable progress on a variety of tasks, surpassing human performance in some cases. However, neural networks suffer from catastrophic forgetting: a network trained on one task forgets its solution when learning a new one. To address this issue, recent works have proposed solutions based on Binarized Neural Networks (BNNs) incorporating metaplasticity. In this work, we extend this solution to quantized neural networks (QNNs) and present a memristor-based hardware implementation of metaplasticity during both inference and training. We propose a hardware architecture that integrates quantized weights, stored in memristor devices programmed in an analog multi-level fashion, with a digital processing unit for high-precision metaplastic storage. We validated our approach using a combined software framework and a memristor-based crossbar array for in-memory computing, fabricated in a 130 nm CMOS technology. Our experimental results show that a two-layer perceptron achieves 97% accuracy when trained consecutively on MNIST and Fashion-MNIST, matching the software baseline. This result demonstrates the immunity of the proposed solution to catastrophic forgetting and its resilience to analog device imperfections. Moreover, our architecture is compatible with the limited endurance of memristors and achieves a 15x reduction in memory writes.
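To make the mechanism concrete, below is a minimal NumPy sketch of the metaplastic update rule introduced for BNNs (Laborieux et al., 2021), which this work extends to quantized weights, paired with a uniform multi-level quantizer standing in for the analog memristor states. The function names, the uniform level spacing, and the metaplasticity strength parameter m are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def metaplastic_update(w_hidden, grad, lr=0.01, m=1.0):
    """One metaplastic SGD step on the high-precision hidden weights.

    Updates that push a hidden weight toward zero (toward a sign/level
    flip) are attenuated by f_meta = 1 - tanh^2(m * w_hidden), so weights
    consolidated on a previous task resist being overwritten.
    """
    update = -lr * grad
    # An update "weakens" a weight when its sign opposes the hidden weight.
    weakening = update * w_hidden < 0
    f_meta = 1.0 - np.tanh(m * w_hidden) ** 2
    return w_hidden + np.where(weakening, f_meta * update, update)

def quantize(w_hidden, n_levels=8, w_max=1.0):
    """Map hidden weights to the nearest of n_levels equally spaced values,
    standing in for the analog multi-level states of a memristor device."""
    levels = np.linspace(-w_max, w_max, n_levels)
    idx = np.abs(w_hidden[..., None] - levels).argmin(axis=-1)
    return levels[idx]

# A strongly consolidated weight (large |w_hidden|) barely moves under a
# weakening gradient, while a weakly consolidated one changes freely.
w = np.array([2.0, 0.1])
g = np.array([1.0, 1.0])                          # both updates push downward
w_new = metaplastic_update(w, g, lr=0.1, m=2.0)   # ~[2.000, 0.004]
print(w_new, quantize(w_new))
```

In the architecture described by the abstract, the quantized values would reside in the analog memristor states while the hidden metaplastic weights remain in digital high-precision storage; keeping consolidation bookkeeping off the devices is consistent with the reported reduction in memory writes.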

Related research

12/29/2019 · MTJ-Based Hardware Synapse Design for Quantized Deep Neural Networks
Quantized neural networks (QNNs) are being actively researched as a solu...

07/31/2019 · Overcoming Catastrophic Forgetting by Neuron-level Plasticity Control
To address the issue of catastrophic forgetting in neural networks, we p...

12/02/2016 · Overcoming catastrophic forgetting in neural networks
The ability to learn tasks in a sequential fashion is crucial to the dev...

04/02/2020 · Device-aware inference operations in SONOS nonvolatile memory arrays
Non-volatile memory arrays can deploy pre-trained neural network models ...

09/19/2017 · An Analog Neural Network Computing Engine using CMOS-Compatible Charge-Trap-Transistor (CTT)
An analog neural network computing engine based on CMOS-compatible charg...

09/19/2017 · A Memristive Neural Network Computing Engine using CMOS-Compatible Charge-Trap-Transistor (CTT)
A memristive neural network computing engine based on CMOS-compatible ch...

11/30/2019 · A binary-activation, multi-level weight RNN and training algorithm for processing-in-memory inference with eNVM
We present a new algorithm for training neural networks with binary acti...
