Error estimate for a universal function approximator of ReLU network with a local connection

09/03/2020
by Jae-Mo Kang, et al.

Neural networks have shown highly successful performance across a wide range of tasks, but further study is needed to improve their performance. We analyze the approximation error of a specific neural network architecture with local connections, which has broader applicability than a fully connected one because locally connected networks can be used to describe diverse architectures such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the hidden layers, and the other the width of the hidden layers.
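The abstract does not include code, but the architecture it studies can be illustrated concretely. Below is a minimal NumPy sketch of a locally connected ReLU network in which depth and width are the two free parameters, matching the two quantities the error estimate depends on. All names (local_relu_network, kernel_size, seed) are illustrative assumptions, and the random weights stand in for trained ones; this is a sketch of the architecture, not the paper's method.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def local_relu_network(x, depth, width, kernel_size=3, seed=0):
    """Stack `depth` locally connected ReLU layers of size `width`.

    Each output unit sees only a `kernel_size` window of the previous
    layer, with untied per-position weights (the "local connection").
    Weights are random placeholders; a trained approximator would fit them.
    """
    rng = np.random.default_rng(seed)
    h = np.asarray(x, dtype=float)
    for _ in range(depth):
        # zero-pad so each of the `width` output units has a full window
        pad = max(0, width + kernel_size - 1 - h.size)
        h = np.pad(h, (0, pad))
        W = rng.standard_normal((width, kernel_size))
        h = np.array([relu(h[i:i + kernel_size] @ W[i]) for i in range(width)])
    return h

# Example: a depth-4, width-8 network evaluated on a length-5 input.
y = local_relu_network(np.linspace(0.0, 1.0, 5), depth=4, width=8)
print(y.shape)  # (8,)
```

Tying weights across positions in each layer would recover a convolutional layer, which is the sense in which locally connected networks subsume CNNs.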


Related research

05/19/2022 · Neural Network Architecture Beyond Width and Depth
This paper proposes a new neural network architecture by introducing an ...

09/01/2021 · Approximation Properties of Deep ReLU CNNs
This paper is devoted to establishing L^2 approximation properties for d...

05/29/2023 · Minimum Width of Leaky-ReLU Neural Networks for Uniform Universal Approximation
The study of universal approximation properties (UAP) for neural network...

06/09/2023 · Hidden symmetries of ReLU networks
The parameter space for any fixed architecture of feedforward ReLU neura...

12/09/2015 · Gamma Belief Networks
To infer multilayer deep representations of high-dimensional discrete an...

10/29/2020 · Do Wide and Deep Networks Learn the Same Things? Uncovering How Neural Network Representations Vary with Width and Depth
A key factor in the success of deep neural networks is the ability to sc...

07/04/2022 · Approximation bounds for convolutional neural networks in operator learning
Recently, deep Convolutional Neural Networks (CNNs) have proven to be su...
