RotationOut as a Regularization Method for Neural Network

11/18/2019
by Kai Hu, et al.

In this paper, we propose a novel regularization method, RotationOut, for neural networks. Unlike Dropout, which handles each neuron/channel independently, RotationOut treats its input layer as a whole vector and introduces regularization by randomly rotating that vector. With small modifications, RotationOut can also be applied to convolutional and recurrent layers. We further use a noise analysis method to interpret how RotationOut and Dropout differ in reducing co-adaptation, and use the same analysis to show how RotationOut/Dropout can be combined with Batch Normalization. Extensive experiments on vision and language tasks demonstrate the effectiveness of the proposed method. Code is available at <https://github.com/RotationOut/RotationOut>.
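The core idea described above, perturbing an entire activation vector with a random rotation rather than zeroing its coordinates independently, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation (see the linked repository): the `angle_std` parameter and the construction of the rotation via the matrix exponential of a random skew-symmetric matrix are assumptions chosen for simplicity.

```python
import torch

def rotation_out(x, angle_std=0.1, training=True):
    """Apply a random rotation to each feature vector in a batch.

    Illustrative sketch of the idea in the abstract: the whole activation
    vector is rotated at random during training, instead of dropping
    neurons independently as in Dropout. The sampling scheme below is an
    assumption, not necessarily the paper's exact construction.
    """
    if not training:
        # At inference time the input passes through unchanged,
        # analogous to disabling Dropout in eval mode.
        return x
    d = x.shape[-1]
    # Random skew-symmetric generator A (A^T = -A); its matrix
    # exponential is orthogonal with determinant +1, i.e. a rotation.
    a = torch.randn(d, d, device=x.device) * angle_std
    skew = a - a.t()
    rotation = torch.matrix_exp(skew)
    # Rotate each row vector in the batch.
    return x @ rotation.t()
```

Because the rotation is orthogonal, it preserves the norm of each activation vector while mixing its coordinates, which is the key difference from Dropout's coordinate-wise zeroing; `angle_std` controls the magnitude of the perturbation.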
