Correct Normalization Matters: Understanding the Effect of Normalization On Deep Neural Network Models For Click-Through Rate Prediction

06/23/2020
by Zhiqiang Wang, et al.

Normalization has become one of the most fundamental components in many deep neural networks for machine learning tasks, and deep neural networks have also been widely used for CTR estimation. Yet among the deep models proposed for this field, few make use of normalization. Although some works such as Deep Cross Network (DCN) and Neural Factorization Machine (NFM) apply Batch Normalization in the MLP part of their structures, no work has thoroughly explored the effect of normalization on DNN ranking systems. In this paper, we conduct a systematic study of widely used normalization schemes by applying various normalization approaches to both the feature embedding and the MLP part of DNN models. Extensive experiments on three real-world datasets demonstrate that the correct normalization significantly enhances a model's performance. We also propose a new and effective normalization approach based on LayerNorm, named variance-only LayerNorm (VO-LN). Based on these observations, we further propose a normalization-enhanced DNN model named NormDNN. As to why normalization works for DNN models in CTR estimation, we find that the variance term of normalization plays the main role, and we provide an explanation in this work.
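For intuition on the proposed variant, the sketch below shows what a variance-only LayerNorm might look like in PyTorch: the input is rescaled by the standard deviation of its last dimension but is not mean-centered, followed by a learnable affine transform. This is an illustrative reconstruction based on the abstract, not the authors' implementation; the epsilon value, affine placement, and the embedding shapes in the usage example are assumptions.

```python
import torch
import torch.nn as nn


class VarianceOnlyLayerNorm(nn.Module):
    """Minimal sketch of variance-only LayerNorm (VO-LN).

    Unlike standard LayerNorm, the input is NOT mean-centered; it is only
    divided by the standard deviation computed over the last dimension,
    then passed through a learnable scale and bias. Hyperparameters are
    assumptions, not taken from the paper's code.
    """

    def __init__(self, normalized_dim: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(normalized_dim))   # learnable scale
        self.beta = nn.Parameter(torch.zeros(normalized_dim))   # learnable bias

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Variance over the last dimension; the mean is not subtracted from x.
        var = x.var(dim=-1, unbiased=False, keepdim=True)
        return self.gamma * x / torch.sqrt(var + self.eps) + self.beta


# Usage example: normalizing a batch of feature embeddings
# shaped (batch, num_fields, embedding_dim).
if __name__ == "__main__":
    voln = VarianceOnlyLayerNorm(normalized_dim=16)
    emb = torch.randn(32, 10, 16)
    out = voln(emb)
    print(out.shape)  # torch.Size([32, 10, 16])
```

The same module could, under the paper's setup, be applied either to the feature embeddings or to the hidden layers of the MLP part; which placement helps most is an empirical question the paper studies.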
