On overcoming the Curse of Dimensionality in Neural Networks

09/02/2018
by Karen Yeressian, et al.

Let H be a reproducing kernel Hilbert space. For i = 1, ..., N, let x_i ∈ R^d and y_i ∈ R^m comprise our dataset. Let f* ∈ H be the unique global minimiser of the functional

J(f) = (1/2)‖f‖_H² + (1/N) ∑_{i=1}^{N} (1/2)‖f(x_i) − y_i‖².

In this paper we show that for each n ∈ N there exists a two-layer network whose first layer consists of nm basis functions Φ_{x_{i_k}, j} for i_1, ..., i_n ∈ {1, ..., N} and j = 1, ..., m, and whose second layer takes a weighted sum of the first, such that the functions f_n realised by these networks satisfy

‖f_n − f*‖_H ≤ O(1/√n) for all n ∈ N.

Thus the error rate is independent of the input dimension d, the output dimension m and the data size N.
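As a point of reference for the minimiser f* above: by the representer theorem, the minimiser of J over an RKHS with kernel k lies in the span of the kernel sections at the data points, f*(x) = ∑_i c_i k(x, x_i), and (for scalar outputs, m = 1) setting the gradient of J to zero gives the linear system (K + N·I) c = y, where K_ij = k(x_i, x_j). The sketch below illustrates this with a Gaussian kernel; the kernel choice and the synthetic data are assumptions for illustration, not part of the paper's construction.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gram matrix k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_rkhs_minimiser(X, y):
    """Coefficients c of f* = sum_i c_i k(., x_i) minimising
    J(f) = 1/2 ||f||_H^2 + 1/(2N) sum_i (f(x_i) - y_i)^2,
    obtained by solving (K + N I) c = y."""
    N = len(X)
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + N * np.eye(N), y)

def evaluate(X_train, c, X_new):
    """Evaluate f*(x) = sum_i c_i k(x, x_i) at new points."""
    return gaussian_kernel(X_new, X_train) @ c

# Synthetic data: N = 20 points in R^d with d = 3, scalar targets (m = 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = np.sin(X[:, 0])
c = fit_rkhs_minimiser(X, y)
preds = evaluate(X, c, X)
```

The two-layer network of the paper then approximates this f* with nm sampled basis functions at an O(1/√n) rate in the H-norm; that construction depends on details of Φ not given in the abstract, so it is not sketched here.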

