Nonlinear Weighted Directed Acyclic Graph and A Priori Estimates for Neural Networks

03/30/2021
by   Yuqing Li, et al.

In an attempt to better understand the structural benefits and generalization power of deep neural networks, we first present a novel graph-theoretical formulation of neural network models, including fully connected networks, residual networks (ResNet), and densely connected networks (DenseNet). Second, we extend the error analysis of the population risk for two-layer networks <cit.> and ResNet <cit.> to DenseNet, and further show that similar estimates can be obtained for neural networks satisfying certain mild conditions. These estimates are a priori in nature, since they depend solely on information available prior to the training process; in particular, the bounds on the estimation errors are independent of the input dimension.
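To make the graph-theoretical viewpoint concrete, the following is a minimal, hypothetical sketch (not the paper's exact formulation) of a DenseNet-style network viewed as a weighted DAG: each non-input node receives a weighted edge from every earlier node and applies a nonlinearity to the sum of its incoming messages. The function and variable names (`dag_forward`, `weights`) are illustrative assumptions.

```python
import numpy as np

def relu(x):
    # elementwise nonlinearity applied at each DAG node
    return np.maximum(x, 0.0)

def dag_forward(x, weights):
    """Forward pass over a densely connected weighted DAG.

    weights[i-1][j] is the weight matrix on the edge j -> i (j < i);
    node i applies the activation to the sum of all incoming messages,
    so every earlier node feeds every later one (DenseNet-style skips).
    """
    outputs = [x]  # node 0 holds the raw input
    for i in range(1, len(weights) + 1):
        incoming = sum(weights[i - 1][j] @ outputs[j] for j in range(i))
        outputs.append(relu(incoming))
    return outputs[-1]

rng = np.random.default_rng(0)
d = 4  # width of every node, kept equal for simplicity
# Three non-input nodes; node i has i incoming edges.
weights = [[rng.standard_normal((d, d)) for _ in range(i + 1)] for i in range(3)]
y = dag_forward(rng.standard_normal(d), weights)
print(y.shape)
```

A fully connected network is the special case where each node keeps only the edge from its immediate predecessor; ResNet corresponds to adding identity edges that skip one node.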


Related research:

- A Priori Estimates of the Population Risk for Residual Networks (03/06/2019): Optimal a priori estimates are derived for the population risk of a regu...
- A Priori Estimates of the Generalization Error for Two-layer Neural Networks (10/15/2018): New estimates for the generalization error are established for the two-l...
- Towards an Understanding of Residual Networks Using Neural Tangent Hierarchy (NTH) (07/07/2020): Gradient descent yields zero training loss in polynomial time for deep n...
- Exploring the Properties and Evolution of Neural Network Eigenspaces during Training (06/17/2021): In this work we explore the information processing inside neural network...
- Densely Connected Residual Network for Attack Recognition (08/05/2020): High false alarm rate and low detection rate are the major sticking poin...
- What can we learn from gradients? (10/29/2020): Recent work (<cit.>) has shown that it is possible to reconstruct the in...
- A Priori Generalization Error Analysis of Two-Layer Neural Networks for Solving High Dimensional Schrödinger Eigenvalue Problems (05/04/2021): This paper analyzes the generalization error of two-layer neural network...
