Depth separation beyond radial functions

02/02/2021
by Luca Venturi, et al.

High-dimensional depth separation results for neural networks show that certain functions can be efficiently approximated by two-hidden-layer networks but not by one-hidden-layer ones in high dimension d. Existing results of this type mainly focus on functions with an underlying radial or one-dimensional structure, which are rarely encountered in practice. The first contribution of this paper is to extend such results to a more general class of functions, namely functions with a piecewise oscillatory structure, by building on the proof strategy of (Eldan and Shamir, 2016). We complement these results by showing that, if the domain radius and the rate of oscillation of the objective function are constant, then approximation by one-hidden-layer networks holds at a poly(d) rate for any fixed error threshold.

A common theme in the proofs of such results is that one-hidden-layer networks fail to approximate high-energy functions whose Fourier representation is spread out over the frequency domain. Conversely, existing results on approximating a function by one-hidden-layer neural networks rely on the function having a sparse Fourier representation. The choice of the domain is another source of gaps between upper and lower approximation bounds. Focusing on a fixed approximation domain, namely the sphere 𝕊^{d-1} in dimension d, we characterize, in terms of their Fourier expansion, both the functions that are efficiently approximable by one-hidden-layer networks and the functions that provably are not.
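The phenomenon behind these results can be sketched numerically. The snippet below is an illustrative experiment, not the paper's construction: it fits a one-hidden-layer ReLU network of fixed width to a radial oscillatory target sin(k·‖x‖) on the unit ball, using random hidden weights and a least-squares fit of the output layer as a crude stand-in for training. All parameter values (d, n, width, the frequencies k) are arbitrary choices for illustration; the qualitative expectation is that, at fixed width, the fit degrades as the oscillation rate k grows.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, width = 20, 4000, 512  # dimension, samples, hidden units (illustrative)

# Sample points in the unit ball: uniform direction, uniform radius.
U = rng.standard_normal((n, d))
U /= np.linalg.norm(U, axis=1, keepdims=True)
r = rng.uniform(0.0, 1.0, n)
X = U * r[:, None]

def fit_one_hidden_layer(X, y, width, rng):
    """Least-squares fit over random ReLU features: a rough proxy for a
    trained one-hidden-layer network of the given width."""
    d = X.shape[1]
    W = rng.standard_normal((d, width)) / np.sqrt(d)   # random hidden weights
    b = rng.uniform(-1.0, 1.0, width)                  # random biases
    H = np.maximum(X @ W + b, 0.0)                     # hidden activations
    coef, *_ = np.linalg.lstsq(H, y, rcond=None)       # fit output layer
    return np.mean((H @ coef - y) ** 2)

results = {}
for k in (1.0, 20.0):                 # slow vs. fast oscillation rate
    y = np.sin(k * r)                 # radial oscillatory target
    results[k] = fit_one_hidden_layer(X, y, width, rng) / np.mean(y**2)
    print(f"k={k:>4}: relative MSE = {results[k]:.3f}")
```

Because the output layer is fit by least squares, the relative MSE is at most 1 (the error of the zero predictor); the gap between the slow and fast regimes loosely mirrors the Fourier-spread intuition above.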


Related research:

- Depth Separations in Neural Networks: What is Actually Being Separated? (04/15/2019)
- Approximation error of single hidden layer neural networks with fixed weights (02/07/2022)
- Size and Depth Separation in Approximating Natural Functions with Neural Networks (01/30/2021)
- Depth Separation for Neural Networks (02/27/2017)
- Deep Neural Networks for Rotation-Invariance Approximation and Learning (04/03/2019)
- Towards Antisymmetric Neural Ansatz Separation (08/05/2022)
- Optimization-Based Separations for Neural Networks (12/04/2021)
