Warped Convolutions: Efficient Invariance to Spatial Transformations

09/14/2016
by João F. Henriques, et al.

Convolutional Neural Networks (CNNs) are extremely efficient, since they exploit the inherent translation-invariance of natural images. However, translation is just one of a myriad of useful spatial transformations. Can the same efficiency be attained when considering other spatial invariances? Such generalized convolutions have been considered in the past, but at a high computational cost. We present a construction that is simple and exact, yet has the same computational complexity as standard convolutions. It consists of a constant image warp followed by a simple convolution, both of which are standard blocks in deep learning toolboxes. With a carefully crafted warp, the resulting architecture can be made invariant to any one of a wide range of spatial transformations. We show encouraging results in realistic scenarios, including the estimation of vehicle poses in the Google Earth dataset (rotation and scale) and of face poses in the Annotated Facial Landmarks in the Wild dataset (3D rotations under perspective).
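The construction is simple enough to sketch in code. Below is a minimal sketch in PyTorch (an arbitrary choice of framework, not the authors' implementation), assuming a log-polar-style warp: rotations and scalings about the image center become translations of the warped image, so the ordinary convolution that follows is equivariant to them. The grid resolution, the minimum radius, and the names log_polar_grid and WarpedConv2d are illustrative assumptions, and boundary handling (e.g. circular padding along the angle axis) is omitted for brevity.

import math
import torch
import torch.nn.functional as F

def log_polar_grid(height, width, device=None):
    # Fixed sampling grid: angle along the vertical axis, log-spaced radius
    # along the horizontal axis. A rotation/scaling of the input image then
    # appears as a vertical/horizontal shift of the warped image.
    theta = torch.linspace(-math.pi, math.pi, height, device=device)
    log_r = torch.linspace(-3.0, 0.0, width, device=device).exp()  # radii in (0, 1]; -3.0 is an assumed minimum
    r, t = torch.meshgrid(log_r, theta, indexing="xy")  # each of shape (height, width)
    # grid_sample expects normalized (x, y) coordinates in [-1, 1], shape (N, H, W, 2)
    grid = torch.stack((r * torch.cos(t), r * torch.sin(t)), dim=-1)
    return grid.unsqueeze(0)

class WarpedConv2d(torch.nn.Module):
    # A constant image warp followed by a standard 2D convolution.
    def __init__(self, in_ch, out_ch, kernel_size, warp_size=(64, 64)):
        super().__init__()
        self.register_buffer("grid", log_polar_grid(*warp_size))
        self.conv = torch.nn.Conv2d(in_ch, out_ch, kernel_size,
                                    padding=kernel_size // 2)

    def forward(self, x):
        # The warp is input-independent, so the whole layer costs one bilinear
        # resampling plus one ordinary convolution.
        grid = self.grid.expand(x.shape[0], -1, -1, -1)
        warped = F.grid_sample(x, grid, align_corners=False)
        return self.conv(warped)

x = torch.randn(2, 3, 128, 128)            # images roughly centred on the object
y = WarpedConv2d(3, 8, kernel_size=3)(x)   # -> shape (2, 8, 64, 64)

Because the grid is a constant buffer, both blocks are standard deep-learning operations, which is what gives the layer the same asymptotic cost as a plain convolution.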

