Root Mean Square Layer Normalization

10/16/2019
by Biao Zhang, et al.

Layer normalization (LayerNorm) has been successfully applied to various deep neural networks to help stabilize training and boost model convergence because of its capability to handle re-centering and re-scaling of both inputs and weight matrices. However, the computational overhead introduced by LayerNorm makes these improvements expensive and significantly slows the underlying network, RNNs in particular. In this paper, we hypothesize that re-centering invariance in LayerNorm is dispensable and propose root mean square layer normalization, or RMSNorm. RMSNorm regularizes the summed inputs to a neuron in one layer according to the root mean square (RMS), giving the model re-scaling invariance and an implicit learning rate adaptation ability. RMSNorm is computationally simpler and thus more efficient than LayerNorm. We also present partial RMSNorm, or pRMSNorm, where the RMS is estimated from p% of the summed inputs without breaking the above properties. Extensive experiments on several tasks using diverse network architectures show that RMSNorm achieves comparable performance against LayerNorm but reduces the running time by 7%∼64% on different models. Source code is available at https://github.com/bzhangGo/rmsnorm.
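For illustration, the sketch below shows the normalization described in the abstract in plain NumPy: each vector of summed inputs is divided by its root mean square and re-scaled by a learnable gain, with no mean subtraction. The function name, signature, epsilon term, and the "first p fraction" choice for pRMSNorm are our assumptions for this example, not the authors' reference implementation (see the linked repository for that).

```python
import numpy as np

def rms_norm(x, gain, eps=1e-8, p=-1.0):
    """Minimal RMSNorm / pRMSNorm sketch, normalizing over the last axis.

    x    : summed inputs to a layer, shape (..., d)
    gain : learnable re-scaling parameter g, shape (d,)
    p    : if 0 < p < 1, estimate the RMS from the first p fraction of the
           d inputs (partial RMSNorm); otherwise use all d inputs.
    """
    d = x.shape[-1]
    if 0.0 < p < 1.0:
        k = max(1, int(d * p))  # number of inputs used for the RMS estimate
        ms = np.mean(np.square(x[..., :k]), axis=-1, keepdims=True)
    else:
        ms = np.mean(np.square(x), axis=-1, keepdims=True)
    rms = np.sqrt(ms + eps)     # root mean square of the summed inputs
    return x / rms * gain       # re-scaling only; no re-centering as in LayerNorm

# usage
x = np.random.randn(2, 8)
g = np.ones(8)
y = rms_norm(x, g)              # RMSNorm
y_p = rms_norm(x, g, p=0.25)    # pRMSNorm with p = 25%
```

Dropping the mean statistic is what makes the method cheaper than LayerNorm while keeping the re-scaling invariance described above.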

