Scaling and Resizing Symmetry in Feedforward Networks

06/26/2023
by Carlos Cardona, et al.

Weight initialization in deep neural networks has a strong impact on the convergence speed of the learning map. Recent studies have shown that, for random initializations, a chaos/order phase transition occurs in the space of variances of the random weights and biases. Subsequent experiments showed that large improvements in training speed can be made if a neural network is initialized on values along the critical line of this phase transition. In this contribution, we show evidence that the scaling property exhibited by physical systems at criticality is also present in untrained feedforward networks with random weights initialized on the critical line. We further suggest a data-resizing symmetry, which is directly inherited from the scaling symmetry at criticality.
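For concreteness, below is a minimal sketch of what "initializing on the critical line" means in the standard mean-field picture for tanh feedforward networks (the chaos/order transition of Poole et al. 2016 and Schoenholz et al. 2017 that the abstract alludes to). The function names, Monte Carlo estimator, and parameter values are illustrative assumptions, not the authors' code: weights are drawn with variance sigma_w^2 / n_in and biases with variance sigma_b^2, and (sigma_w, sigma_b) is placed where the correlation-map slope chi equals 1.

```python
# Sketch: edge-of-chaos ("critical line") initialization for a tanh network.
# Assumed setup: mean-field criterion chi(sigma_w, sigma_b) = 1, estimated by
# Monte Carlo over a standard Gaussian. Illustrative only.
import numpy as np

def fixed_point_q(sigma_w, sigma_b, n_iter=200, n_samples=100_000):
    """Iterate the variance map q <- sigma_w^2 E[tanh(sqrt(q) z)^2] + sigma_b^2."""
    z = np.random.randn(n_samples)
    q = 1.0
    for _ in range(n_iter):
        q = sigma_w**2 * np.mean(np.tanh(np.sqrt(q) * z)**2) + sigma_b**2
    return q

def chi(sigma_w, sigma_b, n_samples=100_000):
    """Slope of the correlation map at its fixed point; chi = 1 is the critical line."""
    z = np.random.randn(n_samples)
    q = fixed_point_q(sigma_w, sigma_b)
    # tanh'(x) = 1 - tanh(x)^2
    return sigma_w**2 * np.mean((1.0 - np.tanh(np.sqrt(q) * z)**2)**2)

def critical_sigma_w(sigma_b, lo=0.5, hi=3.0, tol=1e-3):
    """Bisect for the sigma_w where chi crosses 1 at the given sigma_b."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if chi(mid, sigma_b) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def init_layer(n_in, n_out, sigma_w, sigma_b, rng=None):
    """Draw W ~ N(0, sigma_w^2 / n_in) and b ~ N(0, sigma_b^2)."""
    rng = rng or np.random.default_rng()
    W = rng.normal(0.0, sigma_w / np.sqrt(n_in), size=(n_out, n_in))
    b = rng.normal(0.0, sigma_b, size=n_out)
    return W, b

sigma_b = 0.05
sigma_w = critical_sigma_w(sigma_b)   # approx. 1 for small sigma_b (tanh)
W, b = init_layer(784, 512, sigma_w, sigma_b)
```

With sigma_b = 0 the fixed point is q* = 0 and chi reduces to sigma_w^2, recovering the familiar critical value sigma_w = 1 for tanh; nonzero bias variance pushes the critical sigma_w slightly above 1, tracing out the critical line in the (sigma_b, sigma_w) plane.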


research
05/16/2020

An Effective and Efficient Initialization Scheme for Training Multi-layer Feedforward Neural Networks

Network initialization is the first and critical step for training neura...
research
12/30/2020

Perspective: A Phase Diagram for Deep Learning unifying Jamming, Feature Learning and Lazy Training

Deep learning algorithms are responsible for a technological revolution ...
research
07/02/2020

Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?

Deep neural networks are typically initialized with random weights, with...
research
07/05/2023

Absorbing Phase Transitions in Artificial Deep Neural Networks

Theoretical understanding of the behavior of infinitely-wide neural netw...
research
06/06/2016

Feedforward Initialization for Fast Inference of Deep Generative Networks is biologically plausible

We consider deep multi-layered generative models such as Boltzmann machi...
research
04/09/2018

Universal and Succinct Source Coding of Deep Neural Networks

Deep neural networks have shown incredible performance for inference tas...
