'Place-cell' emergence and learning of invariant data with restricted Boltzmann machines: breaking and dynamical restoration of continuous symmetries in the weight space

12/30/2019
by   Moshir Harsh, et al.

Distributions of data or sensory stimuli often enjoy underlying invariances. How and to what extent those symmetries are captured by unsupervised learning methods is a relevant question in machine learning and in computational neuroscience. Here we study, through a combination of numerical and analytical tools, the learning dynamics of Restricted Boltzmann Machines (RBM), a neural-network paradigm for representation learning. As learning proceeds from a random configuration of the network weights, we show the existence of, and characterize, a symmetry-breaking phenomenon in which the latent variables acquire receptive fields focused on limited parts of the invariant manifold supporting the data. The symmetry is restored at large learning times through the diffusion of the receptive fields over the invariant manifold; the RBM thus effectively spans a continuous attractor in the space of network weights. This symmetry-breaking phenomenon takes place only if the amount of training data exceeds a critical value, which depends on the network size and on the intensity of the symmetry-induced correlations in the data; below this 'retarded-learning' threshold, the network weights are essentially noisy and overfit the data.
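The scenario described above can be illustrated with a minimal numerical sketch, not the authors' exact protocol: data generated as binary "bumps" at random positions on a ring (so the data distribution is invariant under cyclic shifts), and a small RBM trained from small random weights with plain CD-1 contrastive divergence. All sizes, learning rates, and the bump generator below are illustrative assumptions; biases are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data with a continuous (discretized) symmetry: each sample
# is a contiguous bump of active visible units centred at a random
# position on a ring, so the distribution is invariant under cyclic shifts.
N, M = 20, 4            # visible units, hidden (latent) units -- toy sizes
bump_width = 5
n_samples = 2000

def make_sample():
    c = rng.integers(N)
    v = np.zeros(N)
    v[(c + np.arange(bump_width)) % N] = 1.0
    return v

data = np.stack([make_sample() for _ in range(n_samples)])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Random initial configuration of the weights (small, symmetric phase).
W = 0.01 * rng.standard_normal((N, M))
lr = 0.05

# Plain CD-1 training loop (a standard sketch of RBM learning).
for epoch in range(20):
    rng.shuffle(data)
    for v0 in data:
        h0 = sigmoid(v0 @ W)                      # hidden activation from data
        h_samp = (rng.random(M) < h0).astype(float)
        v1 = sigmoid(W @ h_samp)                  # one-step reconstruction
        h1 = sigmoid(v1 @ W)
        W += lr * (np.outer(v0, h0) - np.outer(v1, h1))

# After training, each hidden unit's weight vector (its "receptive field")
# typically localizes on a limited arc of the ring: the continuous symmetry
# of the data is broken by the learned weights.
localization = (W.max(axis=0) - W.mean(axis=0)) / (np.abs(W).mean(axis=0) + 1e-9)
print(localization)
```

Tracking the peak position of each column of `W` over much longer training runs would show the slow diffusion of the receptive fields along the ring that restores the symmetry on average.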


Related research

04/30/2019 · Minimal model of permutation symmetry in unsupervised learning
Permutation of any two hidden units yields invariant properties in typic...

07/04/2019 · A Quantum Field Theory of Representation Learning
Continuous symmetries and their breaking play a prominent role in contem...

03/10/2021 · Symmetry Breaking in Symmetric Tensor Decomposition
In this note, we consider the optimization problem associated with compu...

11/06/2019 · Statistical physics of unsupervised learning with prior knowledge in neural networks
Integrating sensory inputs with prior beliefs from past experiences in u...

02/01/2022 · Data-driven emergence of convolutional structure in neural networks
Exploiting data invariances is crucial for efficient learning in both ar...

03/17/2018 · Replica Symmetry Breaking in Bipartite Spin Glasses and Neural Networks
Some interesting recent advances in the theoretical understanding of neu...

07/02/2020 · Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?
Deep neural networks are typically initialized with random weights, with...
