Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization

10/14/2017
by Hyeonwoo Noh, et al.

Overfitting is one of the most critical challenges in deep neural networks, and a variety of regularization methods have been proposed to improve generalization performance. Injecting noise into hidden units during training, e.g., dropout, is known to be a successful regularizer, but it remains unclear why such training techniques work well in practice and how their benefit can be maximized in the presence of two conflicting objectives: fitting the true data distribution and preventing overfitting through regularization. This paper addresses these issues by 1) interpreting conventional training with regularization by noise injection as optimizing a lower bound of the true objective, and 2) proposing a technique to achieve a tighter lower bound using multiple noise samples per training example in each stochastic gradient descent iteration. We demonstrate the effectiveness of this idea in several computer vision applications.
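The core idea in the abstract can be illustrated with a small sketch: under noise injection (here, dropout masks), the conventional objective averages log-likelihoods over noise, while the multi-sample objective takes the log of the averaged likelihoods over K noise samples, which by Jensen's inequality is a tighter lower bound. The following is a minimal NumPy sketch, not the authors' code; the model (a single logistic unit) and all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dropout_mask(shape, p, rng):
    # Inverted dropout: keep each unit with probability 1 - p, rescale to
    # preserve the expected activation.
    keep = (rng.random(shape) > p).astype(float)
    return keep / (1.0 - p)

def multi_sample_bound(x, y, w, p=0.5, K=5, rng=rng):
    """Compare the conventional and multi-sample noise-injection objectives
    for one training example (x, y) under a logistic model with weights w."""
    # Log-likelihood of the label under K independent dropout masks.
    logps = []
    for _ in range(K):
        m = dropout_mask(x.shape, p, rng)
        q = sigmoid((x * m) @ w)           # P(y = 1 | x, mask)
        logps.append(np.log(q if y == 1 else 1.0 - q))
    logps = np.array(logps)

    # Tighter bound: log of the sample mean of likelihoods,
    # computed stably via log-sum-exp.
    a = logps.max()
    tight = a + np.log(np.exp(logps - a).mean())

    # Conventional single-sample objective, averaged over the same masks.
    loose = logps.mean()
    return tight, loose
```

For any fixed set of noise samples, `tight >= loose` holds (log of a mean dominates the mean of logs), so maximizing the multi-sample objective optimizes a tighter lower bound on the noise-free log-likelihood.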


