A Weight Initialization Based on the Linear Product Structure for Neural Networks

09/01/2021
by Qipin Chen, et al.
Weight initialization plays an important role in training neural networks and affects many deep learning applications. Various weight initialization strategies have been developed for different activation functions and network architectures. These algorithms are based on minimizing the variance of the parameters between layers and may still fail when networks are deep, e.g., through dying ReLU. To address this challenge, we study neural networks from a nonlinear-computation point of view and propose a novel weight initialization strategy based on the linear product structure (LPS) of neural networks. The proposed strategy is derived from a polynomial approximation of the activation functions, using theories of numerical algebraic geometry that guarantee finding all local minima. We also provide a theoretical analysis showing that the LPS initialization has a lower probability of dying ReLU compared to existing initialization strategies. Finally, we test the LPS initialization algorithm on both fully connected and convolutional neural networks to demonstrate its feasibility, efficiency, and robustness on public datasets.
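Two ideas from the abstract can be illustrated with standard tools: the polynomial approximation of an activation function that underlies the linear-product-structure view, and the "dying ReLU" failure mode that variance-based schemes (such as He initialization) can exhibit in deep networks. The sketch below, assuming NumPy, is illustrative only; it does not implement the paper's actual LPS algorithm, and `dead_unit_fraction` is a hypothetical probe, not a construction from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Least-squares quadratic approximation of ReLU on [-1, 1]:
#    replacing the activation by a polynomial is the kind of step that
#    lets a network be treated as a system of polynomial equations.
xs = np.linspace(-1.0, 1.0, 201)
coeffs = np.polyfit(xs, np.maximum(xs, 0.0), deg=2)
relu_poly = np.poly1d(coeffs)

# 2) Baseline He (Kaiming) initialization -- a standard
#    variance-minimizing scheme of the type the abstract says can
#    still suffer from dying ReLU in deep networks.
def he_init(fan_in, fan_out, rng):
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def dead_unit_fraction(depth, width, n_samples=256, rng=rng):
    """Fraction of last-layer ReLU units that output zero for every
    standard-normal input sample -- a simple 'dying ReLU' probe."""
    x = rng.normal(size=(n_samples, width))
    for _ in range(depth):
        x = np.maximum(x @ he_init(width, width, rng), 0.0)
    return float(np.mean(np.all(x == 0.0, axis=0)))
```

Running `dead_unit_fraction` with increasing `depth` shows how the chance that a unit is inactive on all inputs grows with network depth, which is the quantity the paper's theoretical analysis bounds for the LPS initialization.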

