Exploiting Redundancy: Separable Group Convolutional Networks on Lie Groups

10/25/2021
by David M. Knigge, et al.

Group convolutional neural networks (G-CNNs) have been shown to increase parameter efficiency and model accuracy by incorporating geometric inductive biases. In this work, we investigate the properties of representations learned by regular G-CNNs, and show considerable parameter redundancy in group convolution kernels. This finding motivates further weight-tying by sharing convolution kernels over subgroups. To this end, we introduce convolution kernels that are separable over the subgroup and channel dimensions. In order to obtain equivariance to arbitrary affine Lie groups, we provide a continuous parameterisation of separable convolution kernels. We evaluate our approach across several vision datasets, and show that our weight sharing leads to improved performance and computational efficiency. In many settings, separable G-CNNs outperform their non-separable counterparts while using only a fraction of their training time. In addition, thanks to the increase in computational efficiency, we are able to implement G-CNNs equivariant to the Sim(2) group: the group of dilations, rotations, and translations. Sim(2)-equivariance further improves performance on all tasks considered.
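To make the separability idea concrete, below is a minimal sketch of how a group convolution kernel could be factorised over the subgroup and channel dimensions for a discretised subgroup (e.g. N rotations). This is not the authors' implementation: the class name, tensor shapes, and the simple three-factor product are illustrative assumptions, and the continuous kernel parameterisation the paper uses to reach arbitrary affine Lie groups such as Sim(2) is not shown here.

```python
import torch
import torch.nn as nn


class SeparableLiftingKernelSketch(nn.Module):
    """Hypothetical factorised kernel over a discretised subgroup H with n_h elements."""

    def __init__(self, c_in, c_out, n_h, k_size):
        super().__init__()
        # Spatial factor: one k_size x k_size filter per output channel.
        self.k_spatial = nn.Parameter(0.1 * torch.randn(c_out, k_size, k_size))
        # Subgroup factor: one scalar per (output channel, subgroup element).
        self.k_sub = nn.Parameter(0.1 * torch.randn(c_out, n_h))
        # Channel factor: a pointwise mixing matrix shared over all subgroup elements.
        self.w_channel = nn.Parameter(0.1 * torch.randn(c_out, c_in))

    def full_kernel(self):
        # Recombine the three factors into the dense kernel of shape
        # (c_out, n_h, c_in, k, k) that a non-separable G-CNN would learn directly.
        return torch.einsum('okl,oh,oi->ohikl',
                            self.k_spatial, self.k_sub, self.w_channel)


# Example: recover the dense kernel that a regular group convolution would use.
kernel = SeparableLiftingKernelSketch(c_in=3, c_out=16, n_h=8, k_size=5).full_kernel()
print(kernel.shape)  # torch.Size([16, 8, 3, 5, 5])
```

The point of the factorisation is the parameter count: a dense kernel in this sketch would need c_out · n_h · c_in · k² weights, whereas the factorised version stores only c_out · (k² + n_h + c_in), which illustrates where the reported gains in computational efficiency and training time come from.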

Related research

- Group Equivariant Convolutional Networks (02/24/2016): We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a...
- Exploiting Learned Symmetries in Group Equivariant Convolutions (06/09/2021): Group Equivariant Convolutions (GConvs) enable convolutional neural netw...
- Rescaling CNN through Learnable Repetition of Network Parameters (01/14/2021): Deeper and wider CNNs are known to provide improved performance for deep...
- B-Spline CNNs on Lie Groups (09/26/2019): Group convolutional neural networks (G-CNNs) can be used to improve clas...
- SMPConv: Self-moving Point Representations for Continuous Convolution (04/05/2023): Continuous convolution has recently gained prominence due to its ability...
- Enabling equivariance for arbitrary Lie groups (11/16/2021): Although provably robust to translational perturbations, convolutional n...
- CKConv: Continuous Kernel Convolution For Sequential Data (02/04/2021): Conventional neural architectures for sequential data present important ...
