Intertwiners between Induced Representations (with Applications to the Theory of Equivariant Neural Networks)

03/28/2018
by   Taco S Cohen, et al.

Group equivariant and steerable convolutional neural networks (regular and steerable G-CNNs) have recently emerged as a very effective model class for learning from signals such as 2D and 3D images, video, and other data where symmetries are present. In geometrical terms, regular G-CNNs represent data in terms of scalar fields ("feature channels"), whereas steerable G-CNNs can also use vector or tensor fields ("capsules") to represent data. In algebraic terms, the feature spaces in regular G-CNNs transform according to a regular representation of the group G, whereas the feature spaces in steerable G-CNNs transform according to the more general induced representations of G. In order to make the network equivariant, each layer in a G-CNN is required to intertwine the induced representations associated with its input and output space. In this paper we present a general mathematical framework for G-CNNs on homogeneous spaces like Euclidean space or the sphere. We show, using elementary methods, that the layers of an equivariant network are convolutional if and only if the input and output feature spaces transform according to an induced representation. This result, which follows from G.W. Mackey's abstract theory of induced representations, establishes G-CNNs as a universal class of equivariant network architectures, and generalizes the important recent work of Kondor & Trivedi on the intertwiners between regular representations.
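To make the regular-representation case concrete, here is a minimal numpy sketch (our illustration, not the paper's construction) of a lifting layer for the rotation group C4 acting on a periodic grid: the input is a scalar field, and the output, one planar channel per rotation, transforms according to the regular representation, with the layer intertwining the two. Rotation is taken about the origin of the periodic grid so that the equivariance identity holds exactly; all names (`circ_conv`, `rot`, `lift`) are ours.

```python
import numpy as np

def circ_conv(x, k):
    # Circular (periodic) convolution of two equal-shape 2D arrays via the FFT.
    return np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(k)))

def rot(x):
    # 90-degree rotation about the ORIGIN of the periodic grid Z_n x Z_n
    # (not np.rot90, which rotates about the array center), so that the
    # group action composes exactly with circular convolution.
    n = x.shape[0]
    i, j = np.indices(x.shape)
    return x[j, (-i) % n]

def lift(x, k):
    # Lifting layer: convolve a scalar field with all four C4-rotations of
    # the filter. The output has one planar channel per rotation r in C4
    # and transforms by the regular representation.
    f, out = k, []
    for _ in range(4):
        out.append(circ_conv(x, f))
        f = rot(f)
    return np.stack(out)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))   # input scalar field ("feature channel")
k = rng.standard_normal((8, 8))   # filter, defined on the full grid

out = lift(x, k)
out_rot = lift(rot(x), k)
# Rotating the input rotates every output plane AND cyclically permutes
# the four rotation channels: the regular representation of C4.
for r in range(4):
    assert np.allclose(out_rot[r], rot(out[(r - 1) % 4]))
```

The same layer also commutes with periodic translations of the input (each output channel simply shifts), which is the sense in which equivariant layers on this homogeneous space are necessarily convolutional.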


Related research:

- A General Theory of Equivariant CNNs on Homogeneous Spaces (11/05/2018)
- Steerable CNNs (12/27/2016)
- General E(2)-Equivariant Steerable CNNs (11/19/2019)
- Theoretical Aspects of Group Equivariant Neural Networks (04/10/2020)
- Geometric Deep Learning and Equivariant Neural Networks (05/28/2021)
- Non-Euclidean Universal Approximation (06/03/2020)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces (06/16/2022)
