Approximation-Generalization Trade-offs under (Approximate) Group Equivariance

05/27/2023
by   Mircea Petrache, et al.

The explicit incorporation of task-specific inductive biases through symmetry has emerged as a general design precept in the development of high-performance machine learning models. For example, group-equivariant neural networks have demonstrated impressive performance across various domains and applications, such as protein and drug design. A prevalent intuition about such models is that integrating the relevant symmetry yields enhanced generalization. Moreover, it is posited that, when the data and/or the model exhibit only approximate or partial symmetry, the best-performing model is one whose symmetry aligns with the data symmetry. In this paper, we conduct a formal, unified investigation of these intuitions. To begin, we present general quantitative bounds demonstrating that models capturing task-specific symmetries achieve improved generalization. Notably, our results do not require the transformations to be finite or even to form a group, and they accommodate partial or approximate equivariance. Using this quantification, we examine the broader question of model mis-specification, i.e., when the model symmetries do not align with the data symmetries. For a given symmetry group, we establish a quantitative comparison between the approximate/partial equivariance of the model and that of the data distribution, precisely connecting the model equivariance error and the data equivariance error. Our result delineates the conditions under which the model equivariance error is optimal, yielding the best-performing model for the given task and data.
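The notion of "model equivariance error" discussed above can be made concrete with a small sketch. The code below (a hypothetical illustration, not the paper's construction) measures the equivariance error of a map under the cyclic group C_n acting on R^n by coordinate shifts, and shows that group-averaging ("symmetrizing") a generic map drives that error to zero. The names `equivariance_error` and `symmetrize` are illustrative assumptions.

```python
import numpy as np

def shift(x, g):
    """Action of group element g in C_n: cyclic shift by g positions."""
    return np.roll(x, g)

def equivariance_error(f, x, n):
    """Average ||f(g.x) - g.f(x)|| over all g in C_n, for one input x."""
    errs = [np.linalg.norm(f(shift(x, g)) - shift(f(x), g)) for g in range(n)]
    return float(np.mean(errs))

def symmetrize(f, n):
    """Group-average f over C_n, producing an exactly equivariant map."""
    def f_sym(x):
        return np.mean([shift(f(shift(x, -g)), g) for g in range(n)], axis=0)
    return f_sym

n = 6
rng = np.random.default_rng(0)
W = rng.normal(size=(n, n))
f = lambda x: np.tanh(W @ x)   # a generic, non-equivariant map

x = rng.normal(size=n)
print(equivariance_error(f, x, n))                 # strictly positive for generic W
print(equivariance_error(symmetrize(f, n), x, n))  # ~0 up to floating-point error
```

Symmetrization is the classical way to obtain an exactly equivariant model; the paper's analysis concerns the regime where neither the data nor the model is exactly symmetric, so this error is nonzero on both sides and the two must be compared quantitatively.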


Related research:

- 11/16/2022: The Surprising Effectiveness of Equivariant Models in Domains with Latent Symmetry
- 02/01/2023: Generative Adversarial Symmetry Discovery
- 10/24/2022: A PAC-Bayesian Generalization Bound for Equivariant Networks
- 05/04/2022: Group-Invariant Quantum Machine Learning
- 07/11/2022: Exploiting Different Symmetries for Trajectory Tracking Control with Application to Quadrotors
- 06/01/2023: Regularizing Towards Soft Equivariance Under Mixed Symmetries
- 04/18/2023: A Study of Neural Collapse Phenomenon: Grassmannian Frame, Symmetry, Generalization
