Invariant polynomials and machine learning

04/26/2021
by Ward Haddadin, et al.

We present an application of invariant polynomials in machine learning. Using the methods developed in previous work, we obtain two types of generators of the Lorentz- and permutation-invariant polynomials in particle momenta: minimal algebra generators and Hironaka decompositions. We discuss and prove some approximation theorems that allow these invariant generators to be used in machine learning algorithms in general and in neural networks specifically. By implementing these generators in neural networks applied to regression tasks, we test the improvements in performance under a wide range of hyperparameter choices and find a reduction of the loss on training data and a significant reduction of the loss on validation data. For a different approach to quantifying the performance of these neural networks, we treat the problem from a Bayesian inference perspective and employ nested sampling techniques to perform model comparison. Beyond a certain network size, we find that networks utilising Hironaka decompositions perform the best.
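To illustrate the Lorentz-invariant part of the construction, the sketch below computes the pairwise Minkowski dot products of a set of four-momenta, which are the basic Lorentz-invariant polynomials in particle momenta and could serve as inputs to a regression network. This is only an illustrative example, not the paper's code: the function name `lorentz_invariants` and the feature layout are assumptions, and the full generator sets discussed in the abstract (minimal algebra generators and Hironaka decompositions) also account for permutation invariance, which this sketch does not.

```python
import numpy as np

def lorentz_invariants(momenta):
    """Pairwise Minkowski dot products p_i . p_j of n four-momenta,
    using metric signature (+, -, -, -).

    momenta: array of shape (n, 4), each row (E, px, py, pz).
    Returns the upper-triangular entries (i <= j) as a flat feature
    vector, invariant under any common Lorentz transformation applied
    to all momenta. (Illustrative helper, not the paper's code.)
    """
    metric = np.diag([1.0, -1.0, -1.0, -1.0])
    gram = momenta @ metric @ momenta.T   # gram[i, j] = p_i . p_j
    i, j = np.triu_indices(len(momenta))
    return gram[i, j]

# Example: two massless momenta back-to-back along the z-axis.
p = np.array([[1.0, 0.0, 0.0,  1.0],
              [1.0, 0.0, 0.0, -1.0]])
feats = lorentz_invariants(p)  # [p1.p1, p1.p2, p2.p2] = [0, 2, 0]
```

Feeding such invariants to a network, instead of raw momentum components, guarantees the learned function respects Lorentz symmetry by construction rather than having to learn it from data.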


research
11/30/2019

Counting invariant subspaces and decompositions of additive polynomials

The functional (de)composition of polynomials is a topic in pure and com...
research
02/27/2023

Invariant Layers for Graphs with Nodes of Different Types

Neural networks that satisfy invariance with respect to input permutatio...
research
10/15/2019

Improved Generalization Bound of Permutation Invariant Deep Neural Networks

We theoretically prove that a permutation invariant property of deep neu...
research
09/29/2022

Start Small: Training Game Level Generators from Nothing by Learning at Multiple Sizes

A procedural level generator is a tool that generates levels from noise....
research
07/21/2011

Spectral approximations in machine learning

In many areas of machine learning, it becomes necessary to find the eige...
research
11/23/2015

What Happened to My Dog in That Network: Unraveling Top-down Generators in Convolutional Neural Networks

Top-down information plays a central role in human perception, but plays...
research
10/09/2022

LieGG: Studying Learned Lie Group Generators

Symmetries built into a neural network have appeared to be very benefici...
