Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

11/05/2013
by Emre Neftci, et al.

Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation, and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not map directly onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate-and-Fire (I&F) neurons, one that is constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike-Timing-Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation, and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
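For context, the discrete-update baseline that the event-driven method replaces can be sketched as standard one-step Contrastive Divergence (CD-1) on a small RBM. This is a minimal NumPy illustration, not the paper's spiking implementation: the layer sizes, learning rate, and function names are illustrative choices, and the correlation difference computed in `dW` is the quantity that the paper's STDP rule estimates online from spike timing instead of in discrete batch steps.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny RBM: 6 visible and 4 hidden units (sizes chosen for illustration).
nv, nh = 6, 4
W = rng.normal(0.0, 0.1, size=(nv, nh))
b_h = np.zeros(nh)
b_v = np.zeros(nv)

def sample_hidden(v):
    """Sample binary hidden states given visible states."""
    p = sigmoid(v @ W + b_h)
    return (rng.random(nh) < p).astype(float), p

def sample_visible(h):
    """Sample binary visible states given hidden states."""
    p = sigmoid(h @ W.T + b_v)
    return (rng.random(nv) < p).astype(float), p

def cd1_update(v0, lr=0.05):
    """One CD-1 weight update: data phase minus reconstruction phase."""
    h0, p_h0 = sample_hidden(v0)      # "positive" (data-driven) phase
    v1, _ = sample_visible(h0)        # one Gibbs step: reconstruct visibles
    _, p_h1 = sample_hidden(v1)       # "negative" (model-driven) phase
    # Difference of pairwise correlations; in the event-driven scheme
    # this contrast emerges from STDP during recurrent network activity.
    dW = lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    return dW, v1

v0 = (rng.random(nv) < 0.5).astype(float)  # random binary input pattern
dW, v1 = cd1_update(v0)
```

In the event-driven variant described in the abstract, there are no separate positive and negative phases computed in discrete steps: the recurrent spiking dynamics of the I&F network provide the samples, and STDP at each synapse accumulates the equivalent of `dW` asynchronously as spikes occur.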
