Stride and Translation Invariance in CNNs

03/18/2021
by Coenraad Mouton, et al.

Convolutional Neural Networks have become the standard for image classification tasks; however, these architectures are not invariant to translations of the input image. This lack of invariance is attributed to the use of stride, which ignores the sampling theorem, and to fully connected layers, which lack spatial reasoning. We show that stride can greatly benefit translation invariance provided it is combined with sufficient similarity between neighbouring pixels, a characteristic we refer to as local homogeneity. We also observe that this characteristic is dataset-specific and dictates the relationship between pooling kernel size and stride required for translation invariance. Furthermore, we find that a trade-off exists between generalization and translation invariance with respect to pooling kernel size: larger kernels lead to better invariance but poorer generalization. Finally, we explore the efficacy of other proposed solutions, namely global average pooling, anti-aliasing, and data augmentation, both empirically and through the lens of local homogeneity.
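As a rough illustration of the kind of comparison discussed above, the following minimal PyTorch sketch (not the authors' experimental setup) measures how much a toy feature extractor's output changes under a one-pixel shift, contrasting a stride-2 convolution with a stride-1 convolution followed by 2x2 max pooling. The shift_right and invariance_gap helpers are hypothetical names introduced here, and random inputs stand in for a real dataset, so the effect of local homogeneity on the strided model is not reflected.

```python
# Minimal sketch (PyTorch): compare sensitivity to a one-pixel shift for a
# strided convolution vs. convolution followed by pooling. Toy models and
# metric only; the paper's actual architectures and measures are not used.
import torch
import torch.nn as nn

def shift_right(x, pixels=1):
    """Circularly shift a batch of images (N, C, H, W) along the width axis."""
    return torch.roll(x, shifts=pixels, dims=3)

def invariance_gap(model, x, pixels=1):
    """Mean L2 distance between features of original and shifted inputs.
    Lower values indicate better translation invariance."""
    with torch.no_grad():
        return (model(x) - model(shift_right(x, pixels))).norm(dim=1).mean().item()

# Two toy feature extractors with the same overall downsampling factor:
# stride-2 convolution alone vs. stride-1 convolution plus 2x2 max pooling.
strided = nn.Sequential(nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(), nn.Flatten())
pooled  = nn.Sequential(nn.Conv2d(1, 8, 3, stride=1, padding=1), nn.ReLU(),
                        nn.MaxPool2d(2), nn.Flatten())

x = torch.randn(16, 1, 28, 28)  # random stand-in inputs; real images would add local homogeneity
print("strided conv gap :", invariance_gap(strided, x))
print("conv + pooling gap:", invariance_gap(pooled, x))
```

On real images, where neighbouring pixels tend to be similar, the abstract's argument is that the strided model's gap also shrinks, which is the dataset-dependent interaction between stride and local homogeneity the paper examines.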
