A wake-sleep algorithm for recurrent, spiking neural networks

03/18/2017
by Johannes Thiele et al.

We investigate a recently proposed model for cortical computation which performs relational inference. It consists of several interconnected, structurally equivalent populations of leaky integrate-and-fire (LIF) neurons, which are trained in a self-organized fashion with spike-timing-dependent plasticity (STDP). Despite its robust learning dynamics, the model is susceptible to a problem typical of recurrent networks that use a correlation-based (Hebbian) learning rule: if trained with high learning rates, the recurrent connections can cause strong feedback loops in the network dynamics, which lead to the emergence of attractor states. This strongly reduces the number of representable patterns and degrades the inference ability of the network. As a solution, we introduce a conceptually very simple "wake-sleep" algorithm: during the wake phase, training is executed normally, while during the sleep phase, the network "dreams" samples from its generative model, induced by random input. This process activates the attractor states in the network, which can then be unlearned effectively by an anti-Hebbian mechanism. The algorithm allows us to increase learning rates by up to a factor of ten while avoiding the clustering of network responses into attractor states, which lets the network learn several times faster. Even for low learning rates, where clustering is not an issue, it improves convergence speed and reduces the final inference error.
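
The abstract only describes the mechanism at a high level. The following is a minimal NumPy sketch of the wake-sleep loop, not the authors' implementation: a single simplified discrete-time LIF population stands in for the paper's several interconnected populations, a generic pairwise correlation rule stands in for the full STDP rule, and sign-flipped (anti-Hebbian) updates are applied to activity evoked by random input during sleep. All constants, input patterns, and names (run_phase, wake_phase, sleep_phase) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N = 100            # LIF neurons in the population
T = 200            # simulation steps per phase
DT = 1.0           # time step (ms)
TAU = 20.0         # membrane time constant (ms)
V_TH = 1.0         # spiking threshold
ETA_WAKE = 0.01    # Hebbian learning rate (wake)
ETA_SLEEP = 0.005  # anti-Hebbian learning rate (sleep)

def run_phase(W, drive, eta, anti_hebbian):
    """Simulate the LIF population for T steps and apply a pairwise
    correlation-based weight update; with anti_hebbian=True the same
    correlations are unlearned instead of reinforced."""
    v = np.zeros(N)       # membrane potentials
    spikes = np.zeros(N)  # spikes of the previous step
    trace = np.zeros(N)   # low-pass filtered presynaptic spike trace
    W = W.copy()
    for t in range(T):
        v += DT / TAU * (-v) + drive[t] + W @ spikes  # leaky integration
        spikes = (v >= V_TH).astype(float)
        v[spikes > 0] = 0.0                           # reset after spiking
        trace = np.exp(-DT / TAU) * trace + spikes
        sign = -1.0 if anti_hebbian else 1.0          # flip rule during sleep
        W += sign * eta * np.outer(spikes, trace)     # post x pre correlation
        np.fill_diagonal(W, 0.0)                      # no self-connections
        W = np.clip(W, 0.0, 0.5)                      # keep weights bounded
    return W

def wake_phase(W, pattern):
    # structured input: training is executed normally (Hebbian rule)
    drive = np.tile(pattern, (T, 1)) + 0.05 * rng.random((T, N))
    return run_phase(W, drive, ETA_WAKE, anti_hebbian=False)

def sleep_phase(W):
    # unstructured random input makes the network "dream" samples from
    # its generative model; the visited attractors are then unlearned
    drive = 0.2 * rng.random((T, N))
    return run_phase(W, drive, ETA_SLEEP, anti_hebbian=True)

W = 0.1 * rng.random((N, N))                 # random recurrent weights
np.fill_diagonal(W, 0.0)
patterns = (rng.random((5, N)) < 0.1) * 0.3  # five sparse input patterns
for epoch in range(20):
    for p in patterns:
        W = wake_phase(W, p)                 # wake: learn the data
    W = sleep_phase(W)                       # sleep: unlearn attractors

The key design point the sketch tries to capture is that sleep reuses the same correlation-based update as wake, only with its sign flipped, so attractor states that the network falls into under random drive are weakened rather than reinforced.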

Related research

08/01/2019  Biologically inspired sleep algorithm for artificial neural networks
            Sleep plays an important role in incremental learning and consolidation ...

05/26/2022  Learning in Feedback-driven Recurrent Spiking Neural Networks using full-FORCE Training
            Feedback-driven recurrent spiking neural networks (RSNNs) are powerful c...

07/08/2023  Deep Unsupervised Learning Using Spike-Timing-Dependent Plasticity
            Spike-Timing-Dependent Plasticity (STDP) is an unsupervised learning mec...

06/17/2017  Fatiguing STDP: Learning from Spike-Timing Codes in the Presence of Rate Codes
            Spiking neural networks (SNNs) could play a key role in unsupervised mac...

09/22/2015  Learning Wake-Sleep Recurrent Attention Models
            Despite their success, convolutional neural networks are computationally...

10/24/2018  Sleep-like slow oscillations induce hierarchical memory association and synaptic homeostasis in thalamo-cortical simulations
            The occurrence of sleep is widespread over the large majority of animal ...
