A General Framework For Proving The Equivariant Strong Lottery Ticket Hypothesis

06/09/2022
by   Damien Ferbach, et al.

The Strong Lottery Ticket Hypothesis (SLTH) stipulates the existence of a subnetwork within a sufficiently overparameterized (dense) neural network that, when initialized randomly and without any training, achieves the accuracy of a fully trained target network. Recent work demonstrates that the SLTH extends to translation-equivariant networks, i.e. CNNs, with the same level of overparameterization as is needed for SLTs in dense networks. However, modern neural networks can incorporate more than just translation symmetry, and designing architectures equivariant to other symmetries, such as rotations and permutations, has been a powerful design principle. In this paper, we generalize the SLTH to functions that preserve the action of a group G, i.e. G-equivariant networks, and prove, with high probability, that one can prune a randomly initialized overparameterized G-equivariant network to a G-equivariant subnetwork that approximates another fully trained G-equivariant network of fixed width and depth. We further prove that our prescribed overparameterization scheme is optimal as a function of the error tolerance. We develop our theory for a broad class of groups, including important ones such as subgroups of the Euclidean group E(n) and subgroups of the symmetric group G ≤ 𝒮_n, allowing us to find SLTs for MLPs, CNNs, E(2)-steerable CNNs, and permutation-equivariant networks as specific instantiations of our unified framework, which strictly extends prior work. Empirically, we verify our theory by pruning overparameterized E(2)-steerable CNNs and message-passing GNNs to match the performance of trained target networks within a given error tolerance.
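To give a concrete feel for the kind of argument behind such overparameterization results, SLTH proofs in this line of work typically reduce approximating a single trained weight to a random SubsetSum problem: among a modest number of i.i.d. random weights, some subset sums to within a tiny error of the target with high probability, and "pruning" amounts to keeping exactly that subset. The following is a minimal, hedged toy sketch of that idea for one scalar weight; it is not the paper's construction, and the helper name `best_subset_sum` is purely illustrative.

```python
import itertools
import random

def best_subset_sum(samples, target):
    """Brute-force the subset of `samples` whose sum is closest to `target`.

    Stand-in for the pruning step: 'kept' weights are the chosen subset,
    all other weights are pruned (set to zero).
    """
    best, best_err = (), abs(target)  # empty subset = prune everything
    for r in range(1, len(samples) + 1):
        for combo in itertools.combinations(samples, r):
            err = abs(sum(combo) - target)
            if err < best_err:
                best, best_err = combo, err
    return best, best_err

random.seed(0)
n = 16                                            # O(log(1/eps)) random weights suffice w.h.p.
samples = [random.uniform(-1.0, 1.0) for _ in range(n)]
target = 0.42                                     # a single trained target weight

subset, err = best_subset_sum(samples, target)
print(f"approximation error with {n} random weights: {err:.6f}")
```

With n random weights there are 2^n candidate subsets, so the achievable error shrinks roughly exponentially in n; this is the intuition behind logarithmic-in-1/ε overparameterization bounds. The equivariant setting in the paper is more delicate, since pruning must also preserve the G-equivariance of the subnetwork.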


