A learning framework for winner-take-all networks with stochastic synapses

08/14/2017
by Hesham Mostafa, et al.

Many recent generative models make use of neural networks to transform the probability distribution of a simple low-dimensional noise process into the complex distribution of the data. This raises the question of whether biological networks operate along similar principles to implement a probabilistic model of the environment through transformations of intrinsic noise processes. The intrinsic neural and synaptic noise processes in biological networks, however, are quite different from the noise processes used in current abstract generative networks. This, together with the discrete nature of spikes and local circuit interactions among the neurons, poses several difficulties when using recent generative modeling frameworks to train biologically motivated models. In this paper, we show that a biologically motivated model based on multi-layer winner-take-all (WTA) circuits and stochastic synapses admits an approximate analytical description. This allows us to use the proposed networks in a variational learning setting in which stochastic backpropagation optimizes a lower bound on the data log likelihood, thereby learning a generative model of the data. We illustrate the generality of the proposed networks and learning technique by using them in a structured output prediction task, and in a semi-supervised learning task. Our results extend the domain of application of modern stochastic network architectures to networks where synaptic transmission failure is the principal noise mechanism.

Related research

05/28/2019
Using local plasticity rules to train recurrent neural networks
To learn useful dynamics on long time scales, neurons must use plasticit...

08/13/2020
A statistical theory of semi-supervised learning
We currently lack a solid statistical understanding of semi-supervised l...

10/26/2017
On the role of synaptic stochasticity in training low-precision neural networks
Stochasticity and limited precision of synaptic weights in neural networ...

07/01/2017
Synthesizing Deep Neural Network Architectures using Biological Synaptic Strength Distributions
In this work, we perform an exploratory study on synthesizing deep neura...

10/04/2022
Adaptive Synaptic Failure Enables Sampling from Posterior Predictive Distributions in the Brain
Bayesian interpretations of neural processing require that biological me...

03/07/2004
Memorization in a neural network with adjustable transfer function and conditional gating
The main problem about replacing LTP as a memory mechanism has been to f...

06/28/2015
Neural Simpletrons - Minimalistic Directed Generative Networks for Learning with Few Labels
Classifiers for the semi-supervised setting often combine strong supervi...
