Fractional Max-Pooling

12/18/2014
by Benjamin Graham

Convolutional networks almost always incorporate some form of spatial pooling, and very often it is alpha-times-alpha max-pooling with alpha = 2. Max-pooling acts on the hidden layers of the network, reducing their spatial size by an integer multiplicative factor alpha. The amazing by-product of discarding 75% of your data is that you build into the network a degree of invariance with respect to translations and elastic distortions. However, if you simply alternate convolutional layers with max-pooling layers, performance is limited by the rapid reduction in spatial size and the disjoint nature of the pooling regions. We have formulated a fractional version of max-pooling in which alpha is allowed to take non-integer values. Our version of max-pooling is stochastic, as there are many different ways of constructing suitable pooling regions. We find that our form of fractional max-pooling reduces overfitting on a variety of datasets: for instance, we improve on the state of the art for CIFAR-100 without even using dropout.
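The stochastic construction described above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: it assumes pooling boundaries are built from a randomly shuffled sequence of increments of 1 and 2 (so that the effective pooling factor n_in / n_out lies between 1 and 2), with disjoint pooling regions. The function names are hypothetical.

```python
import numpy as np

def random_pooling_boundaries(n_in, n_out, rng):
    """Random region boundaries for fractional pooling along one axis.

    Uses increments of 1 and 2 in random order so the n_out regions
    exactly tile the n_in inputs; requires 1 <= n_in/n_out <= 2.
    """
    assert n_out <= n_in <= 2 * n_out, "fractional factor must be in [1, 2]"
    n_twos = n_in - n_out            # number of size-2 steps
    n_ones = n_out - n_twos          # number of size-1 steps
    increments = np.array([1] * n_ones + [2] * n_twos)
    rng.shuffle(increments)          # stochastic choice of region layout
    return np.concatenate(([0], np.cumsum(increments)))  # length n_out + 1

def fractional_max_pool_2d(x, n_out, rng=None):
    """Stochastic fractional max-pooling of a 2-D array to n_out x n_out."""
    rng = rng or np.random.default_rng()
    rows = random_pooling_boundaries(x.shape[0], n_out, rng)
    cols = random_pooling_boundaries(x.shape[1], n_out, rng)
    out = np.empty((n_out, n_out), dtype=x.dtype)
    for i in range(n_out):
        for j in range(n_out):
            out[i, j] = x[rows[i]:rows[i + 1], cols[j]:cols[j + 1]].max()
    return out
```

For example, pooling a 5x5 input down to 4x4 gives an effective alpha of 1.25; a fresh random layout can be drawn for every forward pass, which is one source of the regularizing effect the abstract describes.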

