Approximating smooth functions by deep neural networks with sigmoid activation function

10/08/2020
by Sophie Langer, et al.

We study the power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any d-dimensional, smooth function on a compact set with a rate of order W^(-p/d), where W is the number of nonzero weights in the network and p is the smoothness of the function. Unfortunately, these rates hold only for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, namely DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order M^d achieve an approximation rate of M^(-2p). As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights W_0 in the network and show an approximation rate of W_0^(-p/d). This more general result finally helps us to understand which network topology guarantees a given approximation accuracy.
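For orientation, here is a rough consistency check between the two rate statements (our own sketch, not taken from the paper): assuming fixed depth and fully connected layers of width of order M^d, the overall number of weights W_0 is dominated by the layer-to-layer weight matrices, so W_0 scales like the square of the width, and the width-based rate translates into the weight-based rate.

```latex
% Sketch under the assumption of fixed depth L and fully connected hidden
% layers of width of order M^d. The weight count is dominated by the
% (width x width) matrices between consecutive hidden layers:
%   W_0 \asymp L \cdot (M^d)^2 \asymp M^{2d}.
% Substituting M \asymp W_0^{1/(2d)} into the width-based rate M^{-2p}:
\[
  W_0 \asymp M^{2d}
  \quad\Longrightarrow\quad
  M^{-2p} \asymp \bigl(W_0^{1/(2d)}\bigr)^{-2p} = W_0^{-p/d},
\]
% which matches the weight-based rate W_0^{-p/d} stated in the abstract.
```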
