Correlated Initialization for Correlated Data

03/09/2020
by Johannes Schneider, et al.

Spatial data exhibits the property that nearby points are correlated. The same holds for the representations learnt across the layers of a network, but not for commonly used weight initialization methods. Our theoretical analysis reveals that with uncorrelated initialization (i) signal flow through layers decays much more rapidly and (ii) training of individual parameters is subject to more "zig-zagging". We propose multiple methods for correlated initialization. For CNNs, they yield accuracy gains of several percent in the absence of regularization; even with properly tuned L2 regularization, gains are often possible.
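The abstract's core idea can be sketched in code. The paper proposes multiple correlated-initialization methods whose details are not given in this excerpt, so the following is only a hypothetical illustration of the general recipe: draw i.i.d. Gaussian weights, smooth them over the spatial axes so that nearby kernel entries become correlated, then rescale so the overall variance still matches a standard fan-in scheme (He initialization is assumed here). The function name `correlated_init` and the neighbour-averaging smoother are my own, not from the paper.

```python
import numpy as np

def correlated_init(shape, rng=None):
    """Illustrative spatially correlated weight initialization.

    Draws i.i.d. Gaussian noise, averages each entry with its four
    spatial neighbours (trailing two axes) so nearby entries correlate,
    then rescales to the He standard deviation sqrt(2 / fan_in).
    """
    rng = np.random.default_rng(rng)
    w = rng.standard_normal(shape)
    # Smooth over the two trailing (spatial) axes: centre weight 1/2,
    # each of the four neighbours 1/8 (circular shifts keep it simple).
    s = 0.5 * w
    for axis in (-2, -1):
        s += 0.125 * (np.roll(w, 1, axis=axis) + np.roll(w, -1, axis=axis))
    # Restore the target scale so the variance matches uncorrelated He init.
    fan_in = int(np.prod(shape[1:]))
    s *= np.sqrt(2.0 / fan_in) / s.std()
    return s

# Example: weights for a conv layer with 64 output channels,
# 32 input channels, and a 3x3 kernel.
w = correlated_init((64, 32, 3, 3), rng=0)
```

Smoothing white noise and rescaling preserves the per-weight variance that standard initializers target while introducing the positive correlation between neighbouring kernel entries that the abstract argues for; with this particular smoother, adjacent entries end up with a correlation of roughly 0.45.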


