Tracking translation invariance in CNNs

04/13/2021
by Johannes C. Myburgh, et al.

Although Convolutional Neural Networks (CNNs) are widely used, their translation invariance (ability to deal with translated inputs) is still subject to some controversy. We explore this question using translation-sensitivity maps, which quantify how sensitive a standard CNN is to translated inputs. We propose cosine similarity as the sensitivity metric in place of Euclidean distance, and discuss the importance of restricting the dimensionality of either metric when comparing architectures. Our main focus is the effect of different architectural components of a standard CNN on that network's sensitivity to translation. By varying convolutional kernel sizes and the amount of zero padding, we control the size of the resulting feature maps, allowing us to quantify the extent to which these elements influence translation invariance. We also measure translation invariance at different locations within the CNN to determine how much convolutional and fully connected layers, respectively, contribute to the translation invariance of the network as a whole. Our analysis indicates that both convolutional kernel size and feature map size have a systematic influence on translation invariance. We also find that convolutional layers contribute less to translation invariance than expected, unless specifically forced to do so.
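The idea of a translation-sensitivity map can be sketched as follows: translate the input by every offset in a small window, compare the network's feature representation of each translated input to that of the original using cosine similarity, and record the result per offset. This is a minimal NumPy sketch, not the authors' implementation: `toy_features` is a stand-in for the CNN activations the paper actually uses, and `np.roll` (circular shifting) is a simplifying assumption for how inputs are translated.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two feature tensors, flattened to vectors.
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def translation_sensitivity_map(image, extract_features, max_shift=3):
    """Compare features of the original image against features of every
    (dy, dx) translation in [-max_shift, max_shift], returning a
    (2*max_shift+1) x (2*max_shift+1) similarity map."""
    base = extract_features(image)
    size = 2 * max_shift + 1
    sens = np.zeros((size, size))
    for i, dy in enumerate(range(-max_shift, max_shift + 1)):
        for j, dx in enumerate(range(-max_shift, max_shift + 1)):
            # Circular shift as a simple stand-in for true translation.
            shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
            sens[i, j] = cosine_similarity(base, extract_features(shifted))
    return sens

def toy_features(img, pool=4):
    # Hypothetical feature extractor: coarse average pooling.
    # The paper instead reads activations from inside a trained CNN.
    h, w = img.shape[0] // pool * pool, img.shape[1] // pool * pool
    return img[:h, :w].reshape(h // pool, pool, w // pool, pool).mean(axis=(1, 3))
```

The map's center entry (zero shift) is 1.0 by construction; the faster the surrounding values fall off, the more translation-sensitive the feature extractor is, which is what makes the map useful for comparing architectures.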

Related research

03/18/2021: Stride and Translation Invariance in CNNs
Convolutional Neural Networks have become the standard for image classif...

10/12/2021: Convolutional Neural Networks Are Not Invariant to Translation, but They Can Learn to Be
When seeing a new object, humans can immediately recognize it across dif...

12/04/2020: An Empirical Method to Quantify the Peripheral Performance Degradation in Deep Networks
When applying a convolutional kernel to an image, if the output is to re...

07/15/2023: Improving Translation Invariance in Convolutional Neural Networks with Peripheral Prediction Padding
Zero padding is often used in convolutional neural networks to prevent t...

08/17/2019: A Sensitivity Analysis of Attention-Gated Convolutional Neural Networks for Sentence Classification
Recently, Attention-Gated Convolutional Neural Networks (AGCNNs) perform...

06/10/2021: DNN-Based Topology Optimisation: Spatial Invariance and Neural Tangent Kernel
We study the SIMP method with a density field generated by a fully-conne...

07/30/2023: Deep Convolutional Neural Networks with Zero-Padding: Feature Extraction and Learning
This paper studies the performance of deep convolutional neural networks...
