Efficient uniform approximation using Random Vector Functional Link networks

06/30/2023
by Palina Salanevich, et al.

A Random Vector Functional Link (RVFL) network is a depth-2 neural network with random inner weights and biases. Because only the outer weights of such an architecture need to be learned, the learning process reduces to a linear optimization task, allowing one to sidestep the pitfalls of nonconvex optimization. In this paper, we prove that an RVFL with ReLU activation functions can approximate Lipschitz continuous functions, provided its hidden layer is exponentially wide in the input dimension. Although it has been established before that such approximation can be achieved in the L_2 sense, we prove it for the L_∞ approximation error and Gaussian inner weights. To the best of our knowledge, our result is the first of its kind. We give a nonasymptotic lower bound on the number of hidden-layer nodes, which depends on, among other things, the Lipschitz constant of the target function, the desired accuracy, and the input dimension. Our proof method is rooted in probability theory and harmonic analysis.
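For readers unfamiliar with the architecture, below is a minimal illustrative sketch in Python/NumPy (not taken from the paper): the inner weights and biases are drawn i.i.d. Gaussian and held fixed, and only the outer weights are fit, here by regularized least squares. The function names (fit_rvfl, predict_rvfl), the weight scale, the ridge parameter, and the toy target are illustrative choices; the paper's uniform-approximation guarantee constructs the outer weights differently, and classical RVFL formulations may also include direct input-to-output links, which are omitted here for brevity.

```python
import numpy as np

def rvfl_features(X, W, b):
    """Hidden-layer activations: ReLU applied to fixed random affine maps."""
    return np.maximum(X @ W + b, 0.0)

def fit_rvfl(X, y, n_hidden, rng, reg=1e-6):
    """Draw random Gaussian inner weights/biases and fit only the outer weights.

    Because the hidden layer is fixed, learning reduces to the linear (ridge)
    least-squares problem  min_beta ||H beta - y||^2 + reg * ||beta||^2.
    The unit Gaussian scale and the ridge term are illustrative choices.
    """
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))   # random inner weights, never trained
    b = rng.standard_normal(n_hidden)        # random biases, never trained
    H = rvfl_features(X, W, b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    return rvfl_features(X, W, b) @ beta

if __name__ == "__main__":
    # Toy example: a Lipschitz target on [-1, 1]^2.
    rng = np.random.default_rng(0)
    f = lambda X: np.abs(X[:, 0]) + np.sin(np.pi * X[:, 1])
    X_train = rng.uniform(-1.0, 1.0, size=(2000, 2))
    X_test = rng.uniform(-1.0, 1.0, size=(500, 2))
    W, b, beta = fit_rvfl(X_train, f(X_train), n_hidden=500, rng=rng)
    sup_err = np.max(np.abs(predict_rvfl(X_test, W, b, beta) - f(X_test)))
    print(f"empirical sup-norm error on the test sample: {sup_err:.3f}")
```

Note that the sup-norm error reported here is only an empirical proxy over a finite test sample; the paper's result concerns the true L_∞ error and requires the hidden-layer width to grow exponentially with the input dimension.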


Related research:

- On the Approximation Power of Two-Layer Networks of Random ReLUs (02/03/2021)
- Random Vector Functional Link Networks for Function Approximation on Manifolds (07/30/2020)
- Optimal approximation of continuous functions by very deep ReLU networks (02/10/2018)
- Deep Semi-Random Features for Nonlinear Function Approximation (02/28/2017)
- Sharp asymptotics on the compression of two-layer neural networks (05/17/2022)
- On Sharpness of Error Bounds for Multivariate Neural Network Approximation (04/05/2020)
- Are Direct Links Necessary in RVFL NNs for Regression? (03/29/2020)
