Revisiting Initialization of Neural Networks

04/20/2020
by   Maciej Skorski, et al.

Good initialization of weights is crucial for effective training of deep neural networks. In this paper we discuss an initialization scheme based on a rigorous estimation of the local curvature. The proposed approach is more systematic and general than the state-of-the-art initialization from Glorot et al. and follow-up works.
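For reference, the Glorot et al. baseline mentioned above draws weights with variance 2/(fan_in + fan_out). The NumPy sketch below contrasts that standard rule with a purely hypothetical curvature-scaled variant; the curvature_estimate argument and its scaling rule are illustrative assumptions, not the estimator proposed in the paper.

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    # Glorot/Xavier uniform initialization: weight variance 2 / (fan_in + fan_out).
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def curvature_scaled_init(fan_in, fan_out, curvature_estimate, rng=None):
    # Hypothetical illustration only: shrink the weight scale when the estimated
    # local curvature (e.g. a diagonal Hessian proxy of the loss) is large, so
    # pre-activations start in a well-conditioned region. This is NOT the
    # paper's estimator, just a sketch of the general idea.
    rng = rng or np.random.default_rng()
    std = 1.0 / np.sqrt(fan_in * max(curvature_estimate, 1e-8))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Example: initialize a 256 -> 128 fully connected layer both ways.
W_glorot = glorot_uniform(256, 128)
W_curv = curvature_scaled_init(256, 128, curvature_estimate=2.0)
print(W_glorot.std(), W_curv.std())
```

With curvature_estimate = 1, the hypothetical rule reduces to the familiar 1/sqrt(fan_in) scaling; larger curvature estimates shrink the initial weights further.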
