Making Convolutional Networks Shift-Invariant Again

04/25/2019
by Richard Zhang, et al.

Modern convolutional networks are not shift-invariant: small input shifts or translations can cause drastic changes in the output. Commonly used downsampling methods, such as max-pooling, strided convolution, and average-pooling, ignore the sampling theorem. The well-known signal processing fix is anti-aliasing by low-pass filtering before downsampling. However, simply inserting this module into deep networks degrades performance; as a result, it is seldom used today. We show that when integrated correctly, it is compatible with existing architectural components, such as max-pooling. The technique is general and can be incorporated across layer types and applications, such as image classification and conditional image generation. In addition to increased shift-invariance, we also observe, surprisingly, that anti-aliasing boosts accuracy in ImageNet classification across several commonly used architectures. This indicates that anti-aliasing serves as effective regularization. Our results demonstrate that this classical signal processing technique has been undeservedly overlooked in modern deep networks. Code and anti-aliased versions of popular networks will be made available at https://richzhang.github.io/antialiased-cnns/ .
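The recipe the abstract describes is to evaluate the existing operation densely and then low-pass filter before subsampling. The PyTorch sketch below illustrates that idea; it is not the authors' released implementation. The `BlurPool2d` and `antialiased_maxpool` names, the fixed [1, 2, 1] binomial kernel, and the reflect padding are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlurPool2d(nn.Module):
    """Anti-aliased downsampling: low-pass filter, then subsample."""

    def __init__(self, channels: int, stride: int = 2):
        super().__init__()
        self.channels = channels
        self.stride = stride
        # Fixed [1, 2, 1] binomial filter, made 2D by an outer product
        # and normalized to sum to 1 (an assumed, common choice).
        a = torch.tensor([1.0, 2.0, 1.0])
        kernel = a[:, None] * a[None, :]
        kernel = kernel / kernel.sum()
        # One copy of the filter per channel (depthwise convolution).
        self.register_buffer("kernel", kernel[None, None].repeat(channels, 1, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reflect-pad so the blur preserves spatial size before striding.
        x = F.pad(x, (1, 1, 1, 1), mode="reflect")
        return F.conv2d(x, self.kernel, stride=self.stride, groups=self.channels)

def antialiased_maxpool(channels: int) -> nn.Sequential:
    """Anti-aliased stand-in for MaxPool2d(kernel_size=2, stride=2)."""
    return nn.Sequential(
        nn.MaxPool2d(kernel_size=2, stride=1),  # evaluate the max densely
        BlurPool2d(channels, stride=2),         # low-pass filter, then downsample
    )

# Usage sketch: for x of shape (N, 64, H, W),
# y = antialiased_maxpool(64)(x) halves the spatial resolution.
```

The key design point is the decomposition: max-pooling is split into a dense (stride-1) max followed by blurred subsampling, so the nonlinearity is kept while the aliasing step is fixed. The same blur-then-subsample layer can, in the same spirit, absorb the stride of strided convolutions and average-pooling.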


Related research

Frequency Pooling: Shift-Equivalent and Anti-Aliasing Downsampling (09/24/2021)
Convolution utilizes a shift-equivalent prior of images, thus leading to...

Improving Sound Event Classification by Increasing Shift Invariance in Convolutional Neural Networks (07/01/2021)
Recent studies have put into question the commonly assumed shift invaria...

Truly shift-invariant convolutional neural networks (11/28/2020)
Thanks to the use of convolution and pooling layers, convolutional neura...

ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction (05/21/2021)
This work attempts to provide a plausible theoretical framework that aim...

Alias-Free Convnets: Fractional Shift Invariance via Polynomial Activations (03/14/2023)
Although CNNs are believed to be invariant to translations, recent works...

Deep Networks from the Principle of Rate Reduction (10/27/2020)
This work attempts to interpret modern deep (convolutional) networks fro...

Why do deep convolutional networks generalize so poorly to small image transformations? (05/30/2018)
Deep convolutional network architectures are often assumed to guarantee...
