A random energy approach to deep learning

12/17/2021
by Rongrong Xie, et al.

We study a generic ensemble of deep belief networks which is parametrized by the distribution of energy levels of the hidden states of each layer. We show that, within a random energy approach, statistical dependence can propagate from the visible to deep layers only if each layer is tuned close to the critical point during learning. As a consequence, efficiently trained learning machines are characterised by a broad distribution of energy levels. The analysis of Deep Belief Networks and Restricted Boltzmann Machines on different datasets confirms these conclusions.
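The random energy picture invoked here goes back to Derrida's Random Energy Model (REM), where a system has exponentially many states with i.i.d. Gaussian energies and the Boltzmann weights condense onto a few low-lying levels below a critical temperature. As a rough illustration of that condensation (a sketch under the standard REM conventions, not the paper's own analysis; the sizes and temperatures below are illustrative choices), one can sample REM energies and track the participation ratio of the Gibbs weights across the transition:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                # "system size": 2**n energy levels
M = 2 ** n
# Derrida's REM (J = 1): i.i.d. Gaussian energies with variance n/2
E = rng.normal(0.0, np.sqrt(n / 2.0), size=M)

def participation_ratio(beta):
    # Y2 = sum_i p_i**2 with p_i ∝ exp(-beta * E_i).
    # Y2 ≈ 0 in the high-temperature phase (weight spread over many states);
    # Y2 stays O(1) below the transition (weight condenses on few states).
    w = np.exp(-beta * (E - E.min()))   # shift by the minimum for stability
    p = w / w.sum()
    return float((p ** 2).sum())

beta_c = 2.0 * np.sqrt(np.log(2.0))     # REM critical inverse temperature
y_high = participation_ratio(0.5 * beta_c)  # above T_c: near zero
y_low = participation_ratio(2.0 * beta_c)   # below T_c: finite
print(y_high, y_low)
```

In the abstract's language, a layer tuned close to this critical point sits between the two regimes, which is where a broad distribution of energy levels, and hence nontrivial statistical dependence, can appear.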


