Measure theoretic results for approximation by neural networks with limited weights

04/04/2023
by Vugar Ismailov, et al.

In this paper, we study the approximation properties of single hidden layer neural networks whose weights vary over finitely many directions and whose thresholds range over an open interval. We obtain a necessary and sufficient measure-theoretic condition for the density of such networks in the space of continuous functions. In addition, we prove a density result for neural networks with a specifically constructed activation function and a fixed number of neurons.
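The class of networks studied here can be made concrete with a small sketch. The model below is a hypothetical illustration of a single hidden layer network in which each weight vector is one of finitely many fixed directions and each threshold lies in an open interval; the direction set, coefficients, and `tanh` activation are assumptions for illustration, not choices made in the paper.

```python
import numpy as np

def shallow_net(x, directions, coeffs, thetas, activation=np.tanh):
    """Evaluate a single hidden layer network whose weight vectors
    are restricted to finitely many fixed directions.

    x          : input vector, shape (d,)
    directions : array (m, d); row k is the fixed direction w_k
    coeffs     : array (m,) of outer coefficients c_k
    thetas     : array (m,) of thresholds, each from an open interval
    """
    # Each hidden unit computes activation(w_k . x - theta_k);
    # the output is the linear combination sum_k c_k * activation(...)
    pre = directions @ x - thetas
    return float(coeffs @ activation(pre))

# Example: two fixed directions in R^2, thresholds taken from (-1, 1)
dirs = np.array([[1.0, 0.0], [0.0, 1.0]])
c = np.array([0.5, -0.3])
th = np.array([0.2, -0.4])
y = shallow_net(np.array([1.0, 2.0]), dirs, c, th)
```

The density question the paper addresses is whether combinations of this form, with directions confined to such a finite set, can approximate any continuous function on a compact set to arbitrary accuracy.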

