Emergence of Selective Invariance in Hierarchical Feed Forward Networks

01/30/2017
by Dipan K. Pal, et al.

Many theories have emerged which investigate how invariance is generated in hierarchical networks through simple schemes such as max and mean pooling. The restriction to max/mean pooling in theoretical and empirical studies has diverted attention away from a more general way of generating invariance to nuisance transformations. We conjecture that hierarchically building selective invariance (i.e. carefully choosing the range of the transformation to be invariant to at each layer of a hierarchical network) is important for pattern recognition. We utilize a novel pooling layer called adaptive pooling to find linear pooling weights within networks. These networks with the learnt pooling weights have performances on object categorization tasks that are comparable to max/mean pooling networks. Interestingly, adaptive pooling can converge to mean pooling (when initialized with random pooling weights), find more general linear pooling schemes or even decide not to pool at all. We illustrate the general notion of selective invariance through object categorization experiments on large-scale datasets such as SVHN and ILSVRC 2012.
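The core idea of adaptive pooling is that the pooling operation is a learnable linear map over each pooling window, rather than a fixed max or mean. A minimal NumPy sketch of that idea is below; the function name `adaptive_pool` and its interface are hypothetical illustrations, not the authors' implementation. Note that with uniform weights the layer reduces exactly to mean pooling, and a one-hot weight vector selects a fixed position per window (i.e., it can "decide not to pool").

```python
import numpy as np

def adaptive_pool(x, weights, k):
    """Pool a 2D input over non-overlapping k x k windows using a linear
    weight vector of length k*k (the learnable pooling weights).

    Hypothetical sketch: uniform weights recover mean pooling; a one-hot
    vector passes through one fixed element of each window unchanged.
    """
    h, w = x.shape
    assert h % k == 0 and w % k == 0, "input must tile evenly into windows"
    # Rearrange into (h//k, w//k, k*k): one flattened window per output cell.
    patches = (x.reshape(h // k, k, w // k, k)
                .transpose(0, 2, 1, 3)
                .reshape(h // k, w // k, k * k))
    # Each output value is a weighted sum of its window's elements.
    return patches @ weights

x = np.arange(16, dtype=float).reshape(4, 4)
uniform = np.full(4, 0.25)           # uniform weights -> mean pooling
print(adaptive_pool(x, uniform, 2))  # [[ 2.5  4.5] [10.5 12.5]]
```

In a network, `weights` would be a trainable parameter updated by backpropagation, which is how the learnt scheme can drift toward mean pooling, a general linear combination, or a near-identity map during training.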


