On the Approximation Properties of Neural Networks
We prove two new results concerning the approximation properties of neural networks. Our first result gives conditions under which the outputs of the neurons in a two-layer neural network are linearly independent functions. Our second result concerns the rate of approximation of a two-layer neural network as the number of neurons increases. We improve upon existing results in the literature by significantly relaxing the required assumptions on the activation function and by providing a better rate of approximation. We also provide a simplified proof that the class of functions represented by a two-layer neural network is dense in the space of continuous functions on any compact set, provided the activation function is not a polynomial.
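For concreteness, the following is a minimal LaTeX sketch of the objects the abstract refers to; the notation ($f_n$, $a_i$, $w_i$, $b_i$, $\sigma$) is ours and not taken from the paper. A two-layer (one-hidden-layer) network with $n$ neurons has the standard form, and density is meant in the uniform norm on compact sets:

```latex
% Standard form of a two-layer (one-hidden-layer) network with n neurons,
% activation \sigma, inner weights w_i \in \mathbb{R}^d, biases b_i \in \mathbb{R},
% and outer coefficients a_i \in \mathbb{R}.
\[
  f_n(x) \;=\; \sum_{i=1}^{n} a_i \,\sigma\!\bigl(\langle w_i, x \rangle + b_i\bigr),
  \qquad x \in \mathbb{R}^d .
\]
% Density in the sense used here: for every compact set K \subset \mathbb{R}^d,
% every continuous target g : K \to \mathbb{R}, and every \varepsilon > 0,
% there exist n and parameters (a_i, w_i, b_i) such that
\[
  \sup_{x \in K} \bigl| f_n(x) - g(x) \bigr| \;<\; \varepsilon ,
\]
% provided the activation \sigma is not a polynomial.
```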