On the approximation of functions by tanh neural networks

04/18/2021
by Tim De Ryck et al.

We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates on the approximation error with respect to the size of the neural networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
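
The paper's results are constructive approximation bounds, not statements about training; still, a small experiment makes the setting concrete. The sketch below (PyTorch; the width, optimizer, iteration count, and target function are ad-hoc choices of ours, not taken from the paper) fits a tanh network with two hidden layers to the analytic function f(x) = sin(pi x) on [-1, 1] and reports sup-norm errors for both the function and its first derivative, a crude empirical stand-in for the high-order Sobolev norms in which the paper states its bounds.

    # Illustrative only: widths, learning rate, and the target f are ad-hoc
    # choices, not taken from the paper.
    import torch

    torch.manual_seed(0)

    f  = lambda x: torch.sin(torch.pi * x)            # analytic target
    df = lambda x: torch.pi * torch.cos(torch.pi * x) # its derivative

    # Two hidden tanh layers, matching the depth considered in the paper.
    model = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 32), torch.nn.Tanh(),
        torch.nn.Linear(32, 1),
    )

    x_train = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(5000):
        opt.zero_grad()
        loss = torch.mean((model(x_train) - f(x_train)) ** 2)
        loss.backward()
        opt.step()

    # Sup-norm errors of the function and its first derivative on a fine
    # grid: an empirical proxy for a W^{1,infty}-type error.
    x = torch.linspace(-1.0, 1.0, 2001).unsqueeze(1).requires_grad_(True)
    y = model(x)
    dy = torch.autograd.grad(y.sum(), x)[0]
    print(f"max |u - f|  : {(y - f(x)).abs().max().item():.2e}")
    print(f"max |u' - f'|: {(dy - df(x)).abs().max().item():.2e}")

Note that the theorems bound the error of explicitly constructed networks, i.e. the best achievable approximation at a given size; a trained network, as in this sketch, only gives an upper bound on how small that error is in practice.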

Related research

02/21/2019 · Error bounds for approximations with deep ReLU neural networks in W^s,p norms
We analyze approximation rates of deep ReLU neural networks for Sobolev-...

05/15/2020 · Learning the gravitational force law and other analytic functions
Large neural network models have been successful in learning functions o...

02/25/2021 · Quantitative approximation results for complex-valued neural networks
We show that complex-valued neural networks with the modReLU activation ...

10/15/2018 · A Priori Estimates of the Generalization Error for Two-layer Neural Networks
New estimates for the generalization error are established for the two-l...

08/18/2023 · Capacity Bounds for Hyperbolic Neural Network Representations of Latent Tree Structures
We study the representation capacity of deep hyperbolic neural networks ...

01/11/2023 · Exploring the Approximation Capabilities of Multiplicative Neural Networks for Smooth Functions
Multiplication layers are a key component in various influential neural ...

06/20/2018 · Reinforcement Learning using Augmented Neural Networks
Neural networks allow Q-learning reinforcement learning agents such as d...
