How data, synapses and neurons interact with each other: a variational principle marrying gradient ascent and message passing

11/11/2019
by Haiping Huang, et al.

Unsupervised learning, which requires only raw data, is not only a fundamental function of the cerebral cortex but also a foundation for the next generation of artificial neural networks. However, a unified theoretical framework that treats sensory inputs, synapses, and neural activity together is still lacking. The computational obstacle originates from the discrete nature of synapses and the complex interactions among these three essential elements of learning. Here, we propose a variational mean-field theory in which only the distribution of synaptic weights is considered. Unsupervised learning can then be decomposed into two interwoven steps: a maximization step, carried out as gradient ascent on a lower bound of the data log-likelihood, and an expectation step, carried out as a message-passing procedure on an equivalent (dual) neural network whose parameters are specified by the variational parameters of the weight distribution. Our framework therefore explains how data (sensory inputs), synapses, and neural activities interact with one another to extract statistical regularities from sensory inputs. This variational framework is verified on restricted Boltzmann machines with planted synaptic weights and on learning handwritten digits.
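The two interwoven steps described above can be sketched in code. The toy implementation below is only an illustration, not the paper's actual algorithm: it assumes a restricted Boltzmann machine with binary (±1) synapses whose factorized variational distribution has mean weights m = tanh(θ), uses a single mean-field pass as a stand-in for the message-passing expectation step, and a CD-1-style surrogate gradient for the maximization step. All function names and hyperparameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def e_step(v, m):
    # Expectation step (sketch): mean-field "message passing" on the dual
    # network whose couplings are the variational mean weights m; returns
    # hidden-unit magnetizations for each data sample.
    return np.tanh(v @ m)

def m_step(v, h, theta, lr=0.1):
    # Maximization step (sketch): gradient ascent on a CD-1-style surrogate
    # of the log-likelihood lower bound, taken with respect to the natural
    # parameters theta of the +-1 weight distribution (m = tanh(theta)).
    m = np.tanh(theta)
    v_rec = np.tanh(h @ m.T)          # one mean-field reconstruction
    h_rec = np.tanh(v_rec @ m)
    grad = (v.T @ h - v_rec.T @ h_rec) / v.shape[0]
    return theta + lr * grad * (1.0 - m**2)  # chain rule through tanh

# Toy data with planted +-1 synaptic weights, echoing the paper's setup.
N, n_vis, n_hid = 200, 20, 5
w_true = rng.choice([-1.0, 1.0], size=(n_vis, n_hid))
codes = rng.choice([-1.0, 1.0], size=(N, n_hid))
data = np.sign(codes @ w_true.T + 0.5 * rng.standard_normal((N, n_vis)))

theta = 0.01 * rng.standard_normal((n_vis, n_hid))
for _ in range(50):
    h = e_step(data, np.tanh(theta))   # expectation step
    theta = m_step(data, h, theta)     # maximization step

m = np.tanh(theta)  # learned mean synaptic weights, each in (-1, 1)
```

Because m = tanh(θ), the learned mean weights are automatically confined to (-1, 1), which is what makes a continuous gradient ascent possible despite the discrete nature of the underlying synapses.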


Related research:
- Advanced Mean Field Theory of Restricted Boltzmann Machine (02/01/2015)
- Statistical mechanics of unsupervised feature learning in a restricted Boltzmann machine with binary synapses (12/06/2016)
- Statistical mechanics of continual learning: variational principle and mean-field potential (12/06/2022)
- Training Restricted Boltzmann Machines with Binary Synapses using the Bayesian Learning Rule (07/09/2020)
- Statistical physics of unsupervised learning with prior knowledge in neural networks (11/06/2019)
- MPLP: Learning a Message Passing Learning Protocol (07/02/2020)
