HexaConv

03/06/2018
by Emiel Hoogeboom, et al.

The effectiveness of Convolutional Neural Networks stems in large part from their ability to exploit the translation invariance that is inherent in many learning problems. Recently, it was shown that CNNs can exploit other invariances, such as rotation invariance, by using group convolutions instead of planar convolutions. However, for reasons of performance and ease of implementation, it has been necessary to limit the group convolution to transformations that can be applied to the filters without interpolation. Thus, for images with square pixels, only integer translations, rotations by multiples of 90 degrees, and reflections are admissible. Whereas the square tiling provides a 4-fold rotational symmetry, a hexagonal tiling of the plane has a 6-fold rotational symmetry. In this paper we show how one can efficiently implement planar convolution and group convolution over hexagonal lattices, by re-using existing highly optimized convolution routines. We find that, due to the reduced anisotropy of hexagonal filters, planar HexaConv provides better accuracy than planar convolution with square filters, given a fixed parameter budget. Furthermore, we find that the increased degree of symmetry of the hexagonal grid increases the effectiveness of group convolutions, by allowing for more parameter sharing. We show that our method significantly outperforms conventional CNNs on the AID aerial scene classification dataset, even outperforming ImageNet pre-trained models.
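The central implementation idea, reusing existing optimized square-convolution routines for hexagonal lattices, can be illustrated with a small sketch. Assuming the hexagonal feature map is stored as an ordinary rectangular tensor in axial coordinates, a hexagonal filter is simply a square filter with two opposite corner triangles masked to zero, and a 60-degree rotation of that filter is a permutation of its taps. The names `hex_mask`, `rotate_hex_60`, and `HexConv2d` below are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def hex_mask(k):
    """Mask selecting the hexagonal neighbourhood inside a k x k square kernel,
    assuming the hexagonal lattice is stored in axial coordinates.
    Taps outside the hexagon (two opposite corner triangles) are zeroed."""
    R = k // 2
    mask = torch.zeros(k, k)
    for r in range(-R, R + 1):          # axial coordinate r (rows)
        for q in range(-R, R + 1):      # axial coordinate q (columns)
            if abs(q + r) <= R:         # inside the hexagon of radius R
                mask[r + R, q + R] = 1.0
    return mask


def rotate_hex_60(weight):
    """Rotate a hex-masked filter by 60 degrees by permuting its taps,
    using the axial-coordinate rotation (q, r) -> (-r, q + r)."""
    k = weight.shape[-1]
    R = k // 2
    rotated = torch.zeros_like(weight)
    for r in range(-R, R + 1):
        for q in range(-R, R + 1):
            if abs(q + r) <= R:
                rotated[..., (q + r) + R, (-r) + R] = weight[..., r + R, q + R]
    return rotated


class HexConv2d(nn.Module):
    """Planar hexagonal convolution as a masked square convolution, so the
    heavily optimised conv2d routine does all the work (hypothetical layer)."""

    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.1)
        self.register_buffer("mask", hex_mask(k))

    def forward(self, x):
        # Zeroing the corner taps makes the effective filter support hexagonal.
        return F.conv2d(x, self.weight * self.mask,
                        padding=self.weight.shape[-1] // 2)


# Usage sketch: x is assumed to hold a hexagonal feature map stored on a
# rectangular grid in axial coordinates.
x = torch.randn(1, 3, 32, 32)
y = HexConv2d(3, 8)(x)              # planar hexagonal convolution
w60 = rotate_hex_60(hex_mask(3))    # one of six rotated copies a p6 group conv would use
```

A p6 group convolution would apply all six rotated copies of each filter produced by repeated `rotate_hex_60` calls, which is where the additional parameter sharing over the 4-fold square case comes from.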
