On the universal approximation property of radial basis function neural networks

04/05/2023
by Aysu Ismayilova, et al.

In this paper, we consider a new class of RBF (radial basis function) neural networks in which smoothing factors are replaced with shifts. We prove, under certain conditions on the activation function, that these networks can approximate any continuous multivariate function on any compact subset of d-dimensional Euclidean space. For RBF networks with finitely many fixed centroids, we describe conditions that guarantee approximation with arbitrary precision.
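The abstract does not give the exact network form, but a natural reading of "smoothing factors replaced with shifts" is a single-hidden-layer model g(x) = Σᵢ wᵢ φ(‖x − cᵢ‖ − θᵢ), where the shift θᵢ enters the radial argument additively instead of a width parameter dividing it. The sketch below, under that assumption (the function names and the Gaussian choice of φ are illustrative, not from the paper), fits such a network with fixed centroids to a continuous target by least squares on the output weights:

```python
import numpy as np

def shifted_rbf(X, centroids, shifts, weights, phi=lambda t: np.exp(-t**2)):
    """Evaluate g(x) = sum_i w_i * phi(||x - c_i|| - theta_i).

    X: (n_samples, d), centroids: (n_units, d),
    shifts, weights: (n_units,).
    """
    # Pairwise Euclidean distances, shape (n_samples, n_units)
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return phi(dists - shifts) @ weights

rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)   # compact subset of R^1
y = np.sin(3 * X[:, 0])                          # continuous target function

# Finitely many fixed centroids, as in the paper's second setting
centroids = np.linspace(-1.0, 1.0, 15).reshape(-1, 1)
shifts = rng.uniform(-0.2, 0.2, size=15)         # illustrative shift values

# Design matrix Phi[j, i] = phi(||x_j - c_i|| - theta_i); solve for weights
dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
Phi = np.exp(-(dists - shifts) ** 2)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)

approx = shifted_rbf(X, centroids, shifts, weights)
max_err = np.max(np.abs(approx - y))
```

With only the output weights trained, the approximation error on this smooth target is already small; the paper's result concerns what such networks can achieve in principle as the number of units grows (or, with fixed centroids, under conditions on the activation function).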


