LieTransformer: Equivariant self-attention for Lie Groups

12/20/2020
by Michael Hutchinson, et al.

Group equivariant neural networks are used as building blocks of group invariant neural networks, which have been shown to improve generalisation performance and data efficiency through principled parameter sharing. Work in this area has mostly focused on group equivariant convolutions, building on the result that group equivariant linear maps are necessarily convolutions. In this work, we extend the scope of the literature to a non-linear neural network module, namely self-attention, which is emerging as a prominent building block of deep learning models. We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups. We demonstrate the generality of our approach with experimental results that are competitive with baseline methods on a wide range of tasks: shape counting on point clouds, molecular property regression, and modelling particle trajectories under Hamiltonian dynamics.
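
To make the core idea concrete, below is a minimal, hypothetical PyTorch sketch of how self-attention can be made insensitive to a group action: if the attention logits are computed only from the features and from group-invariant pairwise quantities, here Euclidean distances between points, the output does not change when the coordinates are rotated or translated. The class and names (InvariantSelfAttention, dist_to_bias) are illustrative assumptions, not the paper's LieSelfAttention layer.

```python
import torch
import torch.nn as nn

class InvariantSelfAttention(nn.Module):
    """Toy self-attention over a point cloud. Attention logits depend on the
    features and on pairwise distances only, so the output is unchanged when
    the coordinates are rotated or translated (and the layer is permutation
    equivariant). Illustrative only; not the paper's LieSelfAttention."""

    def __init__(self, dim, heads=4):
        super().__init__()
        assert dim % heads == 0
        self.heads = heads
        self.scale = (dim // heads) ** -0.5
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        self.dist_to_bias = nn.Linear(1, heads)  # distance -> per-head logit bias
        self.to_out = nn.Linear(dim, dim)

    def forward(self, feats, coords):
        # feats: (B, N, dim) scalar features, coords: (B, N, 3) positions
        B, N, _ = feats.shape
        q, k, v = self.to_qkv(feats).chunk(3, dim=-1)
        split = lambda t: t.view(B, N, self.heads, -1).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)                    # (B, H, N, d)
        logits = torch.einsum('bhid,bhjd->bhij', q, k) * self.scale
        # pairwise distances are invariant under rotation and translation
        dist = torch.cdist(coords, coords).unsqueeze(-1)          # (B, N, N, 1)
        bias = self.dist_to_bias(dist).permute(0, 3, 1, 2)        # (B, H, N, N)
        attn = (logits + bias).softmax(dim=-1)
        out = torch.einsum('bhij,bhjd->bhid', attn, v)            # (B, H, N, d)
        return self.to_out(out.transpose(1, 2).reshape(B, N, -1))

# Quick check: a random orthogonal transform of the coordinates leaves the
# output unchanged up to floating-point error.
layer = InvariantSelfAttention(dim=16)
feats, coords = torch.randn(1, 5, 16), torch.randn(1, 5, 3)
R = torch.linalg.qr(torch.randn(3, 3)).Q  # random orthogonal matrix
print(torch.allclose(layer(feats, coords), layer(feats, coords @ R.T), atol=1e-5))
```

This sketch only covers the invariant special case for rigid motions of 3D points; as the abstract states, the paper's LieSelfAttention is more general, yielding layers that are equivariant (not merely invariant) to arbitrary Lie groups and their discrete subgroups.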

Related Research

Group Equivariant Stand-Alone Self-Attention For Vision (10/02/2020)
We provide a general self-attention formulation to impose group equivari...

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data (02/25/2020)
The translation equivariance of convolutional layers enables convolution...

A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups (04/19/2021)
Symmetries and equivariance are fundamental to the generalization of neu...

The Lipschitz Constant of Self-Attention (06/08/2020)
Lipschitz constants of neural networks have been explored in various con...

Enabling equivariance for arbitrary Lie groups (11/16/2021)
Although provably robust to translational perturbations, convolutional n...

Bi-invariant Two-Sample Tests in Lie Groups for Shape Analysis (08/27/2020)
We propose generalizations of the Hotelling's T^2 statistic and the Bhat...

B-Spline CNNs on Lie Groups (09/26/2019)
Group convolutional neural networks (G-CNNs) can be used to improve clas...

Code Repositories

lie-transformer-pytorch

Implementation of the Lie Transformer (equivariant self-attention) in PyTorch

lie-transformer

LieTransformer: Equivariant Self-Attention for Lie Groups

PointVS

SE(3)-equivariant point cloud networks for virtual screening

TepidInvariance

SE(3)-invariant neural networks for hotspot prediction