RotationOut as a Regularization Method for Neural Network

11/18/2019
by Kai Hu, et al.

In this paper, we propose a novel regularization method, RotationOut, for neural networks. Unlike Dropout, which handles each neuron/channel independently, RotationOut treats its input layer as a single vector and introduces regularization by randomly rotating that vector. RotationOut can also be applied to convolutional and recurrent layers with small modifications. We further use a noise-analysis method to interpret how RotationOut and Dropout differ in reducing co-adaptation. Using this method, we also show how to combine RotationOut/Dropout with Batch Normalization. Extensive experiments on vision and language tasks demonstrate the effectiveness of the proposed method. Code is available at <https://github.com/RotationOut/RotationOut>.
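To make the core idea concrete, below is a minimal sketch of applying a random rotation to an activation vector during training, with the layer acting as the identity at inference. The construction used here (a random skew-symmetric generator passed through the matrix exponential, scaled by a hypothetical `angle_std` parameter) is an illustrative assumption, not the paper's exact formulation; see the linked repository for the authors' implementation.

```python
import torch

def rotation_out(x, angle_std=0.1, training=True):
    """Illustrative sketch of rotation-based regularization.

    x: tensor of shape (batch, features). During training, each feature
    vector is multiplied by a random rotation matrix; at inference the
    input passes through unchanged. The skew-symmetric / matrix-exponential
    construction below is an assumption for illustration only.
    """
    if not training:
        return x
    d = x.shape[-1]
    # Random skew-symmetric generator A = S - S^T, so exp(A) is orthogonal
    # with determinant 1, i.e. a rotation. angle_std controls how far the
    # rotation deviates from the identity.
    s = torch.randn(d, d, device=x.device, dtype=x.dtype) * angle_std
    a = s - s.t()
    r = torch.matrix_exp(a)   # rotation matrix
    return x @ r.t()          # rotate each row of x

# Usage: regularize a batch of 128-dimensional activations during training.
h = torch.randn(32, 128)
h_train = rotation_out(h, angle_std=0.1, training=True)   # randomly rotated
h_eval = rotation_out(h, training=False)                  # identity at test time
```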


