Is Feature Diversity Necessary in Neural Network Initialization?

12/11/2019
by   Yaniv Blumenfeld, et al.

Standard practice in training neural networks involves initializing the weights in an independent fashion. The results of recent work suggest that feature "diversity" at initialization plays an important role in training the network. However, other initialization schemes with reduced feature diversity have also been shown to be viable. In this work, we conduct a series of experiments aimed at elucidating the importance of feature diversity at initialization. Experimenting on a shallow network, we show that a complete lack of diversity is harmful to training, but its effect can be counteracted by a relatively small addition of noise. Furthermore, we construct a deep convolutional network with identical features at initialization and almost all of the weights initialized at 0 that can be trained to reach accuracy matching its standard-initialized counterpart.
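The abstract's notion of zero feature diversity (every unit starting from the same filter) plus a small symmetry-breaking perturbation can be sketched as follows. This is an illustrative sketch only: `identical_feature_init`, its parameters, and the fan-in scaling are assumptions for demonstration, not the paper's exact construction.

```python
import numpy as np

def identical_feature_init(out_channels, in_channels, k, noise_std=0.0, seed=0):
    """Build conv filters with zero feature diversity: every output channel
    starts from the same base filter. A small noise_std > 0 adds the kind of
    perturbation the abstract suggests can counteract the lack of diversity.
    (Hypothetical sketch; the paper's actual scheme may differ.)"""
    rng = np.random.default_rng(seed)
    # One shared base filter, scaled by fan-in (an assumed He-style scaling).
    fan_in = in_channels * k * k
    base = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(1, in_channels, k, k))
    w = np.repeat(base, out_channels, axis=0)  # all channels identical
    if noise_std > 0:
        w = w + rng.normal(0.0, noise_std, size=w.shape)  # break the symmetry
    return w

# Without noise, all output channels are exactly the same feature:
w0 = identical_feature_init(8, 3, 3, noise_std=0.0)
assert all(np.allclose(w0[i], w0[0]) for i in range(8))

# A small noise addition makes the channels distinct:
w1 = identical_feature_init(8, 3, 3, noise_std=0.01)
assert not np.allclose(w1[1], w1[0])
```

The point of the sketch is that diversity is a one-parameter dial here: `noise_std=0` gives fully identical features, while a small positive value restores the per-channel differences that gradient descent can then amplify.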


Related research

- Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization? (07/02/2020): "Deep neural networks are typically initialized with random weights, with..."
- A Study on Binary Neural Networks Initialization (09/18/2019): "Initialization plays a crucial role in training neural models. Binary Ne..."
- On the Implicit Bias of Initialization Shape: Beyond Infinitesimal Mirror Descent (02/19/2021): "Recent work has highlighted the role of initialization scale in determin..."
- Subquadratic Overparameterization for Shallow Neural Networks (11/02/2021): "Overparameterization refers to the important phenomenon where the width ..."
- Understanding the Initial Condensation of Convolutional Neural Networks (05/17/2023): "Previous research has shown that fully-connected networks with small ini..."
- Is deeper better? It depends on locality of relevant features (05/26/2020): "It has been recognized that a heavily overparameterized artificial neura..."
- Neuron Campaign for Initialization Guided by Information Bottleneck Theory (08/14/2021): "Initialization plays a critical role in the training of deep neural netw..."
