A three layer neural network can represent any discontinuous multivariate function

12/05/2020
by Vugar Ismailov, et al.

In 1987, Hecht-Nielsen showed that any continuous multivariate function can be implemented by a certain type of three-layer neural network. This result has been widely discussed in the neural network literature. In this paper we prove that not only continuous functions but all discontinuous functions as well can be implemented by such neural networks.
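The architecture in question can be illustrated with a minimal sketch: a three-layer feedforward network (input layer, one hidden layer, output layer). The random weights, the `tanh` activation, and the `2n+1` hidden-unit count below are illustrative assumptions in the spirit of Kolmogorov-type networks, not the specific construction from the paper.

```python
import numpy as np

def three_layer_net(x, W1, b1, W2, b2, activation=np.tanh):
    """Evaluate a three-layer feedforward network on an input vector x:
    a linear map into the hidden layer, a nonlinearity, then a linear
    map to the output layer."""
    hidden = activation(W1 @ x + b1)  # hidden-layer activations
    return W2 @ hidden + b2           # linear output layer

# Illustrative dimensions: n inputs, 2n+1 hidden units (as in
# Kolmogorov-type existence results), one output.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 2 * 3 + 1, 1
W1, b1 = rng.standard_normal((n_hidden, n_in)), rng.standard_normal(n_hidden)
W2, b2 = rng.standard_normal((n_out, n_hidden)), rng.standard_normal(n_out)

y = three_layer_net(rng.standard_normal(n_in), W1, b1, W2, b2)
print(y.shape)
```

The existence theorems discussed above concern what such a network can represent with suitably chosen weights and activations; this sketch only fixes the shape of the computation, not the weights.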


