Training a Probabilistic Graphical Model with Resistive Switching Electronic Synapses

09/27/2016
by S. Burc Eryilmaz, et al.

Current large-scale implementations of deep learning and data mining require thousands of processors, massive amounts of off-chip memory, and consume gigajoules of energy. Emerging memory technologies such as nanoscale two-terminal resistive switching memory devices offer a compact, scalable, and low-power alternative that permits on-chip co-located processing and memory in a fine-grained, distributed, parallel architecture. Here we report the first use of resistive switching memory devices for implementing and training a Restricted Boltzmann Machine (RBM), a generative probabilistic graphical model that is a key component of unsupervised learning in deep networks. We experimentally demonstrate a 45-synapse RBM realized with 90 resistive switching phase change memory (PCM) elements, trained with a bio-inspired variant of the Contrastive Divergence (CD) algorithm that implements Hebbian and anti-Hebbian weight updates. After training over 30 epochs, the resistive PCM devices show a two-fold to ten-fold reduction in error rate on a missing-pixel pattern completion task, compared to the untrained case. Measured programming energy consumption is 6.1 nJ per epoch with the resistive switching PCM devices, a factor of 150 lower than in conventional processor-memory systems. We analyze and discuss the dependence of learning performance on cycle-to-cycle variations as well as on the number of gradual levels in the PCM analog memory devices.
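The abstract describes the training scheme only at a high level. Below is a minimal NumPy sketch of how CD-1 training with Hebbian/anti-Hebbian programming of differential PCM conductance pairs might look. The dimensions (9 visible x 5 hidden units, consistent with the 45-synapse/90-device figures), the number of gradual levels, the cycle-to-cycle variability model, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 9 visible x 5 hidden units = 45 synapses, each stored
# as a differential pair of PCM conductances (90 devices total). Biases are
# omitted for brevity.
N_VIS, N_HID = 9, 5
N_LEVELS = 50            # assumed number of gradual conductance levels
G_STEP = 1.0 / N_LEVELS  # conductance increment per partial-SET pulse

g_pos = rng.uniform(0.0, 0.1, (N_VIS, N_HID))  # G+ devices
g_neg = rng.uniform(0.0, 0.1, (N_VIS, N_HID))  # G- devices

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def weights():
    # Effective synaptic weight is the differential conductance G+ - G-.
    return g_pos - g_neg

def sample(p):
    # Bernoulli sampling of binary unit states.
    return (rng.random(p.shape) < p).astype(float)

def cd1_update(v0, cycle_var=0.1):
    """One CD-1 step realized as Hebbian / anti-Hebbian PCM programming."""
    w = weights()
    h0 = sample(sigmoid(v0 @ w))    # positive (data-driven) phase
    v1 = sample(sigmoid(h0 @ w.T))  # one Gibbs step
    h1 = sigmoid(v1 @ w)            # negative (model-driven) phase
    grad = np.outer(v0, h0) - np.outer(v1, h1)
    # Sign-based update: pulse G+ where the gradient is positive (Hebbian)
    # and G- where it is negative (anti-Hebbian). Pulse amplitude carries
    # Gaussian cycle-to-cycle variation, one of the nonidealities studied.
    pulse = G_STEP * (1.0 + cycle_var * rng.standard_normal(grad.shape))
    g_pos[:] = np.clip(g_pos + pulse * (grad > 0), 0.0, 1.0)
    g_neg[:] = np.clip(g_neg + pulse * (grad < 0), 0.0, 1.0)

def complete(v_partial, known_mask, steps=10):
    """Fill in missing pixels by Gibbs sampling with known pixels clamped."""
    w = weights()
    v = v_partial.copy()
    for _ in range(steps):
        h = sample(sigmoid(v @ w))
        v = sample(sigmoid(h @ w.T))
        v[known_mask] = v_partial[known_mask]  # re-clamp observed pixels
    return v

# Toy usage: train for 30 epochs on a single 3x3 checkerboard pattern, then
# complete it from a version with the center pixel masked out.
pattern = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1], dtype=float)
for _ in range(30):
    cd1_update(pattern)
mask = np.ones(N_VIS, dtype=bool)
mask[4] = False  # center pixel unknown
print(complete(pattern * mask, mask))
```

The differential-pair arrangement lets both Hebbian and anti-Hebbian updates be realized as incremental conductance increases, which suits PCM devices whose gradual programming is unidirectional.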
