On the relationship between Dropout and Equiangular Tight Frames

10/14/2018
by Dor Bank, et al.

Dropout is a popular regularization technique in neural networks, yet the reason for its success is still not fully understood. This paper provides a new interpretation of Dropout from a frame-theory perspective, which leads to a novel regularization technique for neural networks that minimizes the cross-correlation between filters in the network. We demonstrate its applicability to convolutional and fully connected layers in both feed-forward and recurrent networks.
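To illustrate the idea, a cross-correlation regularizer of this kind can penalize the off-diagonal entries of the Gram matrix of row-normalized filters, pushing the filter set toward an equiangular tight frame. The PyTorch sketch below is illustrative only: the helper name cross_correlation_penalty and the sum-of-squares penalty are assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def cross_correlation_penalty(weight: torch.Tensor) -> torch.Tensor:
    # weight has shape (num_filters, fan_in); for a conv layer, flatten
    # each filter to a row first, e.g. conv.weight.flatten(1).
    # Illustrative sketch; the paper's exact loss may differ.
    w = F.normalize(weight, dim=1)        # unit L2 norm per filter row
    gram = w @ w.t()                      # pairwise cosine similarities
    off_diag = gram - torch.eye(gram.size(0), device=gram.device)
    return (off_diag ** 2).sum()          # sum of squared cross-correlations

Usage: add lam * cross_correlation_penalty(layer.weight.flatten(1)) to the task loss for each regularized layer, where lam is a small weighting coefficient chosen by validation.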

Related research

- 02/09/2015: Efficient batchwise dropout training using submatrices. "Dropout is a popular technique for regularizing artificial neural networ..."
- 07/25/2019: DropAttention: A Regularization Method for Fully-Connected Self-Attention Networks. "Variants of dropout methods have been designed for the fully-connected laye..."
- 02/21/2017: Delving Deeper into MOOC Student Dropout Prediction. "In order to obtain reliable accuracy estimates for automatic MOOC dropou..."
- 06/06/2021: Regularization in ResNet with Stochastic Depth. "Regularization plays a major role in modern deep learning. From classic ..."
- 01/04/2022: Sparse Super-Regular Networks. "It has been argued by Thom and Palm that sparsely-connected neural netwo..."
- 11/18/2016: Compacting Neural Network Classifiers via Dropout Training. "We introduce dropout compaction, a novel method for training feed-forwar..."
- 01/22/2018: The Hybrid Bootstrap: A Drop-in Replacement for Dropout. "Regularization is an important component of predictive model building. T..."
