Max-Pooling Dropout for Regularization of Convolutional Neural Networks

12/04/2015
by Haibing Wu, et al.

Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers, but its effect in pooling layers is still unclear. This paper demonstrates that max-pooling dropout at training time is equivalent to randomly sampling an activation from each pooling region according to a multinomial distribution. In light of this insight, we advocate employing our proposed probabilistic weighted pooling, instead of the commonly used max-pooling, to serve as model averaging at test time. Empirical evidence validates the superiority of probabilistic weighted pooling. We also compare max-pooling dropout with stochastic pooling, both of which introduce stochasticity based on multinomial distributions at the pooling stage.
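As a rough illustration of the two operations described in the abstract, here is a minimal NumPy sketch for a single pooling region, assuming non-negative (ReLU-style) activations and a retain probability `retain_prob`; the function names are illustrative, not taken from the paper.

```python
import numpy as np

def max_pooling_dropout_train(region, retain_prob=0.5, rng=np.random):
    """Training-time max-pooling dropout over one pooling region.

    Each activation is independently dropped with probability
    (1 - retain_prob); the output is the max of the surviving
    activations (0 if all are dropped, assuming non-negative inputs).
    """
    mask = rng.random(region.shape) < retain_prob
    kept = np.where(mask, region, 0.0)
    return kept.max()

def probabilistic_weighted_pooling(region, retain_prob=0.5):
    """Test-time probabilistic weighted pooling over one pooling region.

    Sort activations ascending: a_1 <= ... <= a_n. Under max-pooling
    dropout, a_i is the output exactly when a_i is retained and every
    larger activation is dropped, i.e. with probability
    p_i = retain_prob * (1 - retain_prob)**(n - i). The pooled output
    is the probability-weighted sum of the sorted activations, which
    approximates averaging over the dropout sub-models.
    """
    a = np.sort(region.ravel())          # a_1 <= ... <= a_n
    n = a.size
    q = 1.0 - retain_prob
    i = np.arange(1, n + 1)
    probs = retain_prob * q ** (n - i)   # P(a_i is the max-pooled output)
    return np.dot(probs, a)
```

Applied to a 2x2 pooling region such as `np.array([[0.2, 0.9], [0.4, 0.0]])`, the first function returns a stochastic output at training time, while the second returns a deterministic weighted average at test time.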


