Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?

07/02/2020
by Yaniv Blumenfeld, et al.

Deep neural networks are typically initialized with random weights, with variances chosen to facilitate signal propagation and stable gradients. It is also believed that diversity of features is an important property of these initializations. We construct a deep convolutional network with identical features by initializing almost all the weights to 0. The architecture also enables perfect signal propagation and stable gradients, and achieves high accuracy on standard benchmarks. This indicates that random, diverse initializations are not necessary for training neural networks. An essential element in training this network is a mechanism of symmetry breaking; we study this phenomenon and find that standard GPU operations, which are non-deterministic, can serve as a sufficient source of symmetry breaking to enable training.
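A minimal sketch of the idea (assuming PyTorch; the ZeroInitBlock class and layer layout below are illustrative, not necessarily the exact construction used in the paper): a residual block whose convolution weights all start at zero reduces to the identity map at initialization, so signals and gradients propagate perfectly even though every channel carries the same, zero-diversity feature.

import torch
import torch.nn as nn

class ZeroInitBlock(nn.Module):
    """Residual block whose residual branch is initialized to all zeros."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.relu = nn.ReLU()
        # Zero out the residual branch: at initialization the block computes
        # x -> x, giving exact signal propagation with no feature diversity.
        nn.init.zeros_(self.conv1.weight)
        nn.init.zeros_(self.conv2.weight)

    def forward(self, x):
        return x + self.conv2(self.relu(self.conv1(x)))

# At initialization the block is exactly the identity:
block = ZeroInitBlock(16)
x = torch.randn(2, 16, 8, 8)
assert torch.allclose(block(x), x)

In a sketch like this, any symmetry breaking between the identical channels would have to come from elsewhere, e.g. the non-deterministic GPU operations discussed in the abstract.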

Related research

Is Feature Diversity Necessary in Neural Network Initialization? (12/11/2019)
Standard practice in training neural networks involves initializing the ...

Finding Symmetry Breaking Order Parameters with Euclidean Neural Networks (07/04/2020)
Curie's principle states that "when effects show certain asymmetry, this...

Scaling and Resizing Symmetry in Feedforward Networks (06/26/2023)
Weights initialization in deep neural networks have a strong impact on t...

Spontaneous Symmetry Breaking in Neural Networks (10/17/2017)
We propose a framework to understand the unprecedented performance and r...

ZerO Initialization: Initializing Residual Networks with only Zeros and Ones (10/25/2021)
Deep neural networks are usually initialized with random weights, with a...

Intriguing Properties of Randomly Weighted Networks: Generalizing While Learning Next to Nothing (02/02/2018)
Training deep neural networks results in strong learned representations ...
