On the rate of convergence of fully connected very deep neural network regression estimates

08/29/2019
by Michael Kohler, et al.

Recent results in nonparametric regression show that deep learning, i.e., neural network estimates with many hidden layers, is able to circumvent the so-called curse of dimensionality provided suitable restrictions on the structure of the regression function hold. One key feature of the neural networks used in these results is that they are not fully connected. In this paper we show that similar results also hold for fully connected multilayer feedforward neural networks with ReLU activation function, provided the number of neurons per hidden layer is fixed and the number of hidden layers tends to infinity as the sample size tends to infinity. The proof is based on new approximation results for fully connected deep neural networks.
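The network class the abstract describes is easy to state concretely: every consecutive pair of layers is fully connected, the width of each hidden layer is held fixed, and only the depth grows with the sample size. Below is a minimal sketch (not the authors' code) of such an estimator class in Python, assuming PyTorch as a dependency; the width, the depth, and the square-root depth-growth rule are hypothetical illustration choices, not the rates derived in the paper.

```python
# Minimal sketch of the architecture studied in the paper: a fully connected
# feedforward ReLU network R^d -> R with a fixed width per hidden layer and a
# depth that grows with the sample size n. Values below are illustrative only.
import math
import torch
import torch.nn as nn

def fully_connected_relu_net(d: int, width: int, depth: int) -> nn.Sequential:
    """Fully connected ReLU network with `depth` hidden layers of fixed
    `width` neurons each and a linear output layer."""
    layers = [nn.Linear(d, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, 1))  # linear (identity) output layer
    return nn.Sequential(*layers)

# Illustration: width stays fixed, depth grows with the sample size n.
n, d, width = 1000, 5, 10           # hypothetical sample size, input dim, width
depth = math.ceil(math.sqrt(n))     # illustrative growth rule, not the paper's
net = fully_connected_relu_net(d, width, depth)
x = torch.randn(8, d)
print(net(x).shape)                 # torch.Size([8, 1])
```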


Related papers:

10/12/2020

Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function

This article contributes to the current statistical theory of deep neura...
02/04/2021

Universal Approximation Theorems of Fully Connected Binarized Neural Networks

Neural networks (NNs) are known for their high predictive accuracy in co...
06/08/2021

Householder-Absolute Neural Layers For High Variability and Deep Trainability

We propose a new architecture for artificial neural networks called Hous...
04/03/2022

Correlation Functions in Random Fully Connected Neural Networks at Finite Width

This article considers fully connected neural networks with Gaussian ran...
06/06/2021

On the Power of Shallow Learning

A deluge of recent work has explored equivalences between wide neural ne...
07/29/2021

Structure and Performance of Fully Connected Neural Networks: Emerging Complex Network Properties

Understanding the behavior of Artificial Neural Networks is one of the m...
02/11/2020

Neural Rule Ensembles: Encoding Sparse Feature Interactions into Neural Networks

Artificial Neural Networks form the basis of very powerful learning meth...