Neural networks and rational functions

06/11/2017
by Matus Telgarsky

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree O(polylog(1/ϵ)) which is ϵ-close, and similarly for any rational function there exists a ReLU network of size O(polylog(1/ϵ)) which is ϵ-close. By contrast, polynomials need degree Ω(poly(1/ϵ)) to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
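The asymmetry with polynomials comes down to a classical one-dimensional fact. Since relu(x) = (x + |x|)/2, approximating a single ReLU on [-1, 1] reduces to approximating |x|: Newman's rational construction of degree n achieves uniform error roughly 3·exp(-√n), so degree O(log²(1/ϵ)) suffices, whereas the best degree-n polynomial is stuck at error Θ(1/n), forcing degree Ω(1/ϵ). The sketch below, assuming NumPy, illustrates this gap numerically; the helper name newman_relu, the degree choices, and the Chebyshev least-squares fit (a rough stand-in for the best polynomial approximation) are illustrative, not constructions taken from the paper.

import numpy as np

def newman_relu(x, n):
    # Newman's rational approximation of |t| on [-1, 1]:
    # with zeta = exp(-1/sqrt(n)) and p(t) = prod_{k=0}^{n-1} (t + zeta^k),
    # r(t) = t * (p(t) - p(-t)) / (p(t) + p(-t)) has uniform error
    # about 3 * exp(-sqrt(n)); then relu(x) = (x + |x|) / 2.
    zeta = np.exp(-1.0 / np.sqrt(n))
    nodes = zeta ** np.arange(n)
    p = lambda t: np.prod(np.add.outer(t, nodes), axis=-1)
    r_abs = x * (p(x) - p(-x)) / (p(x) + p(-x))
    return 0.5 * (x + r_abs)

xs = np.linspace(-1.0, 1.0, 10001)
relu = np.maximum(xs, 0.0)

for n in (9, 25, 49, 81):
    rational_err = np.max(np.abs(newman_relu(xs, n) - relu))
    # Degree-n Chebyshev least-squares fit: a crude stand-in for the
    # best degree-n polynomial approximation (whose error is Theta(1/n)).
    c = np.polynomial.chebyshev.chebfit(xs, relu, n)
    poly_err = np.max(np.abs(np.polynomial.chebyshev.chebval(xs, c) - relu))
    print(f"n={n:2d}  rational {rational_err:.1e}  polynomial {poly_err:.1e}")

As n grows, the rational error should fall off roughly like exp(-√n) while the polynomial error shrinks only like 1/n, matching the Ω(poly(1/ϵ)) lower bound quoted above; composing such one-dimensional approximations across a network is where the exponential dependence on the number of layers enters.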
