Fixup Initialization: Residual Learning Without Normalization

01/27/2019
by Hongyi Zhang, et al.

Normalization layers are a staple in state-of-the-art deep neural network architectures. They are widely believed to stabilize training, enable higher learning rates, accelerate convergence, and improve generalization, though the reason for their effectiveness is still an active research topic. In this work, we challenge these commonly held beliefs by showing that none of the perceived benefits is unique to normalization. Specifically, we propose fixed-update initialization (Fixup), an initialization motivated by solving the exploding and vanishing gradient problem at the beginning of training via properly rescaling a standard initialization. We find training residual networks with Fixup to be as stable as training with normalization -- even for networks with 10,000 layers. Furthermore, with proper regularization, Fixup enables residual networks without normalization to achieve state-of-the-art performance in image classification and machine translation.
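To illustrate the rescaling idea, here is a minimal PyTorch-style sketch of a normalization-free residual block with Fixup-like initialization. It assumes a simplified branch with two 3x3 convolutions (m = 2 layers per branch) and L = num_blocks residual branches; the class name, layer layout, and parameter names are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class FixupBasicBlock(nn.Module):
    """Illustrative residual block without normalization (assumed layout, m = 2)."""

    def __init__(self, channels, num_blocks):
        super().__init__()
        # Scalar biases and a scalar multiplier replace the affine part of BatchNorm.
        self.bias1 = nn.Parameter(torch.zeros(1))
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bias2 = nn.Parameter(torch.zeros(1))
        self.relu = nn.ReLU(inplace=True)
        self.bias3 = nn.Parameter(torch.zeros(1))
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.scale = nn.Parameter(torch.ones(1))
        self.bias4 = nn.Parameter(torch.zeros(1))

        # Standard (He) initialization, rescaled by L^(-1/(2m-2)); with m = 2
        # this is L^(-1/2), where L is the number of residual branches.
        nn.init.kaiming_normal_(self.conv1.weight, mode='fan_in', nonlinearity='relu')
        with torch.no_grad():
            self.conv1.weight.mul_(num_blocks ** -0.5)
        # The last layer of each residual branch starts at zero, so every
        # residual branch is initially the identity mapping.
        nn.init.zeros_(self.conv2.weight)

    def forward(self, x):
        out = self.conv1(x + self.bias1)
        out = self.relu(out + self.bias2)
        out = self.conv2(out + self.bias3)
        return self.relu(out * self.scale + self.bias4 + x)
```

In the same spirit, the final classification layer (and any biases) would also be initialized to zero; the sketch above only covers a single residual branch.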

Related research

10/25/2021 · ZerO Initialization: Initializing Residual Networks with only Zeros and Ones
Deep neural networks are usually initialized with random weights, with a...

04/15/2023 · Non-Proportional Parametrizations for Stable Hypernetwork Learning
Hypernetworks are neural networks that generate the parameters of anothe...

09/09/2017 · Deep Residual Networks and Weight Initialization
Residual Network (ResNet) is the state-of-the-art architecture that real...

10/05/2022 · Dynamical Isometry for Residual Networks
The training success, training speed and generalization ability of neura...

05/15/2022 · Guidelines for the Regularization of Gammas in Batch Normalization for Deep Residual Networks
L2 regularization for weights in neural networks is widely used as a sta...

03/09/2020 · Correlated Initialization for Correlated Data
Spatial data exhibits the property that nearby points are correlated. Th...

12/02/2018 · Analysis on Gradient Propagation in Batch Normalized Residual Networks
We conduct mathematical analysis on the effect of batch normalization (B...
