AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks

09/18/2021
by   Garrett Bingham, et al.

Neural networks require careful weight initialization to prevent signals from exploding or vanishing. Existing initialization schemes solve this problem in specific cases by assuming that the network has a certain activation function or topology. It is difficult to derive such weight initialization strategies, and modern architectures therefore often use these same initialization schemes even though their assumptions do not hold. This paper introduces AutoInit, a weight initialization algorithm that automatically adapts to different neural network architectures. By analytically tracking the mean and variance of signals as they propagate through the network, AutoInit is able to appropriately scale the weights at each layer to avoid exploding or vanishing signals. Experiments demonstrate that AutoInit improves performance of various convolutional and residual networks across a range of activation function, dropout, weight decay, learning rate, and normalizer settings. Further, in neural architecture search and activation function meta-learning, AutoInit automatically calculates specialized weight initialization strategies for thousands of unique architectures and hundreds of unique activation functions, and improves performance in vision, language, tabular, multi-task, and transfer learning scenarios. AutoInit thus serves as an automatic configuration tool that makes design of new neural network architectures more robust. The AutoInit package provides a wrapper around existing TensorFlow models and is available at https://github.com/cognizant-ai-labs/autoinit.
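To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of how signal statistics can be tracked analytically and used to scale weights. It assumes a plain feedforward stack of dense layers with ReLU activations and zero-mean Gaussian weights; the helper names (`relu_moments`, `init_dense`), the toy layer widths, and the restriction to zero-mean pre-activations are illustrative assumptions, not the package's API.

```python
import math
import numpy as np

def relu_moments(mean, var):
    """Output mean/variance of ReLU for a zero-mean Gaussian input.

    Assumes mean == 0 for brevity; AutoInit's analytic approach handles
    general input statistics, arbitrary activations, and other layer types.
    """
    sigma = math.sqrt(var)
    out_mean = sigma / math.sqrt(2.0 * math.pi)       # E[max(x, 0)]
    out_var = var * (0.5 - 1.0 / (2.0 * math.pi))     # Var[max(x, 0)]
    return out_mean, out_var

def init_dense(fan_in, fan_out, in_mean, in_var, rng):
    """Draw weights scaled so the pre-activation has mean 0 and variance 1.

    For zero-mean weights with variance s**2 and i.i.d. inputs,
    Var(W @ x) = fan_in * s**2 * E[x**2], so solve for s.
    """
    second_moment = in_var + in_mean ** 2
    s = 1.0 / math.sqrt(fan_in * second_moment)
    W = rng.normal(0.0, s, size=(fan_in, fan_out))
    return W, 0.0, 1.0  # weights, analytic out_mean, out_var

# Propagate the analytic (mean, var) estimate through a toy MLP and
# initialize each layer so signals neither explode nor vanish.
rng = np.random.default_rng(0)
widths = [784, 256, 128, 10]
mean, var = 0.0, 1.0                     # assume standardized inputs
weights = []
for fan_in, fan_out in zip(widths[:-1], widths[1:]):
    W, mean, var = init_dense(fan_in, fan_out, mean, var, rng)
    weights.append(W)
    mean, var = relu_moments(mean, var)  # statistics after the activation
    print(f"layer {fan_out}: post-activation mean={mean:.3f}, var={var:.3f}")
```

The released AutoInit package performs this kind of bookkeeping automatically for arbitrary TensorFlow model topologies; the repository linked above documents the exact wrapper API.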

Related research

03/18/2020
A Survey on Activation Functions and their relation with Xavier and He Normal Initialization
In artificial neural network, the activation function and the weight ini...

04/06/2023
Optimizing Neural Networks through Activation Function Discovery and Automatic Weight Initialization
Automated machine learning (AutoML) methods improve upon existing models...

08/04/2021
Growing an architecture for a neural network
We propose a new kind of automatic architecture search algorithm. The al...

10/08/2020
Randomized Overdrive Neural Networks
By processing audio signals in the time-domain with randomly weighted te...

01/13/2023
Efficient Activation Function Optimization through Surrogate Modeling
Carefully designed activation functions can improve the performance of n...

02/16/2021
GradInit: Learning to Initialize Neural Networks for Stable and Efficient Training
Changes in neural architectures have fostered significant breakthroughs ...

06/25/2020
Learning compositional functions via multiplicative weight updates
Compositionality is a basic structural feature of both biological and ar...