Exponential Separations in Symmetric Neural Networks

06/02/2022
by   Aaron Zweig, et al.

In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network (Santoro et al., 2017) architecture as a natural generalization of the DeepSets (Zaheer et al., 2017) architecture, and study the representational gap between them. Restricting to analytic activation functions, we construct a symmetric function acting on sets of size N with elements of dimension D that can be efficiently approximated by the former architecture, but provably requires width exponential in N and D for the latter.
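The two architectures compared in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: DeepSets pools a learned feature of each element, while a Relational Network pools a learned feature of each *pair* of elements; the concrete choices of phi, g, and rho below are illustrative assumptions.

```python
import numpy as np

def deepsets(X, phi, rho):
    """DeepSets: f(X) = rho(sum_i phi(x_i)).
    Invariant to permuting the rows (elements) of X."""
    return rho(sum(phi(x) for x in X))

def relational_network(X, g, rho):
    """Relational Network: f(X) = rho(sum_{i,j} g(x_i, x_j)).
    Pools over all ordered pairs, so it is also permutation-invariant."""
    return rho(sum(g(xi, xj) for xi in X for xj in X))

# Illustrative (hypothetical) choices of the component networks:
phi = lambda x: np.tanh(x)
g = lambda x, y: np.tanh(x * y)
rho = lambda h: float(np.sum(h))

# A set of N=5 elements in dimension D=3, as in the abstract's N/D setup.
X = np.random.default_rng(0).normal(size=(5, 3))

# Both outputs are unchanged when the set is reordered.
assert np.isclose(deepsets(X, phi, rho), deepsets(X[::-1], phi, rho))
assert np.isclose(relational_network(X, g, rho),
                  relational_network(X[::-1], g, rho))
```

The separation result concerns how large the pooled feature dimension (width) must be: pairwise pooling can represent certain symmetric functions with modest width, whereas singleton pooling provably cannot without width exponential in N and D.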

