On Sharpness of Error Bounds for Multivariate Neural Network Approximation

04/05/2020
by Steffen Goebbels, et al.

Sharpness of error bounds for best non-linear multivariate approximation by sums of logistic activation functions and piecewise polynomials is investigated. The error bounds are given in terms of moduli of smoothness. They describe approximation properties of single hidden layer feedforward neural networks with multiple input nodes. Sharpness with respect to Lipschitz classes is established by constructing counterexamples with a non-linear, quantitative extension of the uniform boundedness principle.
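The networks studied in the abstract are single hidden layer feedforward networks with logistic activation, i.e. approximants of the form sum_k c_k * sigma(w_k · x + b_k). As a hedged illustration (not the paper's best-approximation operator), the sketch below draws the inner weights w_k and biases b_k at random and fits only the outer coefficients c_k by least squares, measuring the uniform error on a grid over [0,1]^2; all numerical choices (network width, weight scale, target function) are illustrative assumptions.

```python
import numpy as np

def logistic(t):
    """Logistic activation sigma(t) = 1 / (1 + e^{-t})."""
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
n_hidden = 200  # illustrative network width

# Training grid on [0,1]^2 and a smooth bivariate target function
# (chosen for illustration; the paper treats general smoothness classes).
xs = np.linspace(0.0, 1.0, 30)
X = np.array([(a, b) for a in xs for b in xs])              # shape (900, 2)
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])   # target values

# Random inner weights/biases; only the outer coefficients are fitted,
# a random-feature simplification of best non-linear approximation.
W = rng.normal(scale=4.0, size=(n_hidden, 2))
b = rng.uniform(-4.0, 4.0, size=n_hidden)
H = logistic(X @ W.T + b)                  # hidden-layer outputs, (900, 200)
c, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares outer weights

approx = H @ c
max_err = np.max(np.abs(approx - y))
print(f"uniform error on grid: {max_err:.4f}")
```

Because the inner parameters are random rather than optimized, the error above only upper-bounds what best non-linear approximation by the same number of logistic terms can achieve.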



Related research:

- 06/29/2018 — Bounds on the Approximation Power of Feedforward Neural Networks: The approximation power of general feedforward neural networks with piec...
- 08/10/2020 — Intelligent Matrix Exponentiation: We present a novel machine learning architecture that uses the exponenti...
- 06/15/2021 — Predicting Unreliable Predictions by Shattering a Neural Network: Piecewise linear neural networks can be split into subfunctions, each wi...
- 06/30/2023 — Efficient uniform approximation using Random Vector Functional Link networks: A Random Vector Functional Link (RVFL) network is a depth-2 neural netwo...
- 08/18/2022 — Quantitative Universal Approximation Bounds for Deep Belief Networks: We show that deep belief networks with binary hidden units can approxima...
- 12/16/2021 — Approximation of functions with one-bit neural networks: This paper examines the approximation capabilities of coarsely quantized...
- 08/11/2023 — On the error of best polynomial approximation of composite functions: The purpose of the paper is to provide a characterization of the error o...
