Group Equivariant Convolutional Networks

02/24/2016
by Taco S. Cohen, et al.

We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections, and rotations. G-CNNs achieve state-of-the-art results on CIFAR10 and rotated MNIST.
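To make the weight-sharing idea concrete, here is a minimal NumPy sketch of a "lifting" group convolution for the group generated by translations and 90-degree rotations (p4): the image is correlated with every rotated copy of a single filter, giving one orientation channel per rotation from the same parameters. The function names (`p4_lifting_conv`, `conv2d_valid`) are illustrative, not the paper's implementation:

```python
import numpy as np

def rotations(filt):
    # The four 90-degree rotations of a 2D filter (the cyclic group C4).
    return [np.rot90(filt, k) for k in range(4)]

def conv2d_valid(x, w):
    # Naive "valid" cross-correlation of image x with filter w.
    H, W = x.shape
    h, w_ = w.shape
    out = np.zeros((H - h + 1, W - w_ + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + w_] * w)
    return out

def p4_lifting_conv(x, filt):
    # Lifting G-convolution: correlate with each rotated filter copy,
    # producing one orientation channel per rotation. Four channels are
    # computed from a single filter's parameters (4x weight sharing).
    return np.stack([conv2d_valid(x, f) for f in rotations(filt)])
```

Equivariance can be checked numerically: rotating the input by 90 degrees rotates each output channel and cyclically shifts the channel index, i.e. `p4_lifting_conv(np.rot90(x), f)[k]` equals `np.rot90(p4_lifting_conv(x, f)[(k - 1) % 4])`.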


Related research

04/12/2018 · 3D G-CNNs for Pulmonary Nodule Detection
Convolutional Neural Networks (CNNs) require a large amount of annotated...

10/25/2021 · Exploiting Redundancy: Separable Group Convolutional Networks on Lie Groups
Group convolutional neural networks (G-CNNs) have been shown to increase...

06/10/2021 · Group Equivariant Subsampling
Subsampling is used in convolutional neural networks (CNNs) in the form ...

05/24/2019 · Training decision trees as replacement for convolution layers
We present an alternative layer to convolution layers in convolutional n...

06/04/2021 · DISCO: accurate Discrete Scale Convolutions
Scale is often seen as a given, disturbing factor in many vision tasks. ...

10/19/2021 · Learning Equivariances and Partial Equivariances from Data
Group equivariant Convolutional Neural Networks (G-CNNs) constrain featu...

05/30/2022 · Universality of group convolutional neural networks based on ridgelet analysis on groups
We investigate the approximation property of group convolutional neural ...
