On the relationship between Dropout and Equiangular Tight Frames

10/14/2018
by Dor Bank, et al.

Dropout is a popular regularization technique in neural networks, yet the reason for its success is still not fully understood. This paper offers a new interpretation of Dropout from a frame theory perspective, connecting it to equiangular tight frames. This interpretation leads to a novel regularization technique for neural networks that minimizes the cross-correlation between filters in the network. We demonstrate its applicability to convolutional and fully connected layers in both feed-forward and recurrent networks.
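The paper does not spell out its loss here, but a regularizer that "minimizes the cross-correlation between filters" can be sketched as penalizing the off-diagonal entries of the Gram matrix of the normalized filters; driving these entries toward zero (or toward equal magnitude) pushes the filters toward a low-coherence, near-equiangular configuration. The function below is an illustrative NumPy sketch under that assumption, not the authors' exact formulation:

```python
import numpy as np

def cross_correlation_penalty(W):
    """Penalty encouraging low cross-correlation between filters.

    W: array of shape (num_filters, filter_dim); each row is one
    flattened filter. Returns the sum of squared off-diagonal entries
    of the Gram matrix of the row-normalized filters. The penalty is
    zero only when the filters are mutually orthogonal and stays small
    when they form a low-coherence (near-equiangular) frame.
    """
    # Normalize each filter to unit length so the Gram matrix
    # holds pairwise cosine similarities.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    Wn = W / np.clip(norms, 1e-12, None)
    G = Wn @ Wn.T                       # pairwise cosine similarities
    off_diag = G - np.eye(W.shape[0])   # remove the unit diagonal
    return float(np.sum(off_diag ** 2))
```

In training, such a term would be added to the task loss with a weighting coefficient, so that gradient descent jointly fits the data and decorrelates the filters.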
