Random Vector Functional Link Networks for Function Approximation on Manifolds

07/30/2020
by Deanna Needell, et al.

Learning in feed-forward neural networks is notoriously slow and has presented a bottleneck in deep learning applications for several decades. For instance, gradient-based learning algorithms, which are used extensively to train neural networks, tend to converge slowly when all of the network parameters must be iteratively tuned. To counter this, both researchers and practitioners have tried introducing randomness to reduce the amount of training required. Building on the original construction of Igelnik and Pao, single-layer neural networks with random input-to-hidden weights and biases have seen success in practice, but the necessary theoretical justification is lacking. In this paper, we begin to fill this theoretical gap. We provide a (corrected) rigorous proof that the Igelnik and Pao construction is a universal approximator for continuous functions on compact domains, with approximation error decaying asymptotically like O(1/√n) in the number n of network nodes. We then extend this result to the non-asymptotic setting, proving that one can achieve any desired approximation error with high probability provided n is sufficiently large. We further adapt this randomized neural network architecture to approximate functions on smooth, compact submanifolds of Euclidean space, providing theoretical guarantees in both asymptotic and non-asymptotic form. Finally, we illustrate our results on manifolds with numerical experiments.


