Exponential Separations in Symmetric Neural Networks

06/02/2022
by Aaron Zweig, et al.

In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network architecture (Santoro et al., 2017) as a natural generalization of the DeepSets architecture (Zaheer et al., 2017), and study the representational gap between them. Restricting to analytic activation functions, we construct a symmetric function acting on sets of size N with elements in dimension D that can be efficiently approximated by the former architecture, but provably requires width exponential in N and D for the latter.
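For intuition, here is a minimal sketch (not the authors' code) of the two architectures being compared, written in PyTorch. DeepSets applies a network phi to each set element and sums, while a Relational Network applies phi to every ordered pair of elements before summing; both are permutation-invariant by construction. The layer widths, the single hidden layer, and the choice of tanh (one example of an analytic activation, matching the paper's restriction) are illustrative assumptions, not the paper's construction.

```python
import torch
import torch.nn as nn

class DeepSets(nn.Module):
    """f(X) = rho( sum_i phi(x_i) ), as in Zaheer et al. (2017)."""
    def __init__(self, d, width):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(d, width), nn.Tanh())  # per-element map
        self.rho = nn.Linear(width, 1)                            # readout after pooling

    def forward(self, X):                 # X: (batch, N, d)
        return self.rho(self.phi(X).sum(dim=1))   # sum over elements -> invariant

class RelationalNetwork(nn.Module):
    """f(X) = rho( sum_{i,j} phi(x_i, x_j) ), as in Santoro et al. (2017)."""
    def __init__(self, d, width):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(2 * d, width), nn.Tanh())  # pairwise map
        self.rho = nn.Linear(width, 1)

    def forward(self, X):                 # X: (batch, N, d)
        N = X.shape[1]
        xi = X.unsqueeze(2).expand(-1, -1, N, -1)    # (batch, N, N, d)
        xj = X.unsqueeze(1).expand(-1, N, -1, -1)    # (batch, N, N, d)
        pairs = torch.cat([xi, xj], dim=-1)          # all ordered pairs (x_i, x_j)
        return self.rho(self.phi(pairs).sum(dim=(1, 2)))  # sum over pairs -> invariant

# Usage sketch: a batch of 8 sets, each with N=5 elements in dimension d=3.
# X = torch.randn(8, 5, 3)
# DeepSets(d=3, width=16)(X).shape          # (8, 1)
# RelationalNetwork(d=3, width=16)(X).shape # (8, 1)
```

Per the abstract, the separation is quantitative: the pairwise aggregation above can efficiently approximate the constructed target function, while the elementwise DeepSets aggregation provably needs width exponential in N and D to do so.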


Related research

04/05/2021 · An Analysis of State-of-the-art Activation Functions For Supervised Deep Neural Network
This paper provides an analysis of state-of-the-art activation functions...

08/05/2022 · Towards Antisymmetric Neural Ansatz Separation
We study separations between two fundamental models (or Ansätze) of anti...

08/16/2020 · A Functional Perspective on Learning Symmetric Functions with Neural Networks
Symmetric functions, which take as input an unordered, fixed-size set, a...

02/24/2021 · Abelian Neural Networks
We study the problem of modeling a binary operation that satisfies some ...

07/30/2020 · On Representing (Anti)Symmetric Functions
Permutation-invariant, -equivariant, and -covariant functions and anti-s...

12/04/2019 · Universal approximation of symmetric and anti-symmetric functions
We consider universal approximations of symmetric and anti-symmetric fun...

02/20/2020 · On Learning Sets of Symmetric Elements
Learning from unordered sets is a fundamental learning setup, which is a...
