On the Approximation Properties of Neural Networks

04/04/2019
by Jonathan W. Siegel, et al.

We prove two new results concerning the approximation properties of neural networks. Our first result gives conditions under which the outputs of the neurons in a two-layer neural network are linearly independent functions. Our second result concerns the rate of approximation of a two-layer neural network as the number of neurons increases. We improve upon existing results in the literature by significantly relaxing the required assumptions on the activation function and by obtaining a better rate of approximation. We also provide a simplified proof that the class of functions represented by a two-layer neural network is dense in the space of continuous functions on any compact set if the activation function is not a polynomial.
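To make the setting concrete, the following is a minimal numerical sketch, not the paper's construction: it forms a two-layer network f_n(x) = sum_i a_i * sigma(w_i * x + b_i) with random inner weights, fits the outer coefficients by least squares, and reports how the L2 error behaves as the number of neurons n grows. The activation (ReLU, one example of a non-polynomial activation), the sampling ranges, and all names here are illustrative assumptions.

    # Hypothetical sketch, not the paper's method: random-feature fit of a
    # two-layer network f_n(x) = sum_i a_i * sigma(w_i * x + b_i).
    import numpy as np

    rng = np.random.default_rng(0)

    def target(x):
        return np.sin(2 * np.pi * x)  # a smooth target on [0, 1]

    def fit_two_layer(n, x, y):
        # Random inner parameters (w_i, b_i); sigma = ReLU.
        w = rng.uniform(-10.0, 10.0, size=n)
        b = rng.uniform(-10.0, 10.0, size=n)
        phi = np.maximum(np.outer(x, w) + b, 0.0)    # (m, n) feature matrix
        a, *_ = np.linalg.lstsq(phi, y, rcond=None)  # outer coefficients a_i
        return phi @ a

    x = np.linspace(0.0, 1.0, 2000)
    y = target(x)
    for n in (4, 16, 64, 256):
        err = np.sqrt(np.mean((fit_two_layer(n, x, y) - y) ** 2))
        print(f"n = {n:4d}, L2 error = {err:.4f}")

Freezing the inner weights keeps the fit linear in the outer coefficients; the errors this produces only upper-bound the best achievable by optimizing all parameters, which is the quantity an approximation-rate result concerns.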
