Vanilla feedforward neural networks as a discretization of dynamic systems

09/22/2022
by Yifei Duan, et al.

Deep learning has found significant applications in data science and the natural sciences. Several studies have linked deep neural networks to dynamic systems, but the network structure in those works is restricted to residual networks: it is well known that a residual network can be regarded as a numerical discretization of a dynamic system. In this paper, we return to the classical network structure and prove that vanilla feedforward networks can also serve as a numerical discretization of dynamic systems, where the width of the network equals the dimension of the input and output. Our proof relies on the properties of the leaky-ReLU function and on the splitting method, a numerical technique for solving differential equations. Our results could provide a new perspective for understanding the approximation properties of feedforward neural networks.
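As a rough illustration of the claim the abstract builds on (not the paper's actual construction), a residual block x_{k+1} = x_k + h·f(x_k) can be read as one forward-Euler step of the ODE dx/dt = f(x), while a vanilla feedforward layer in the paper's setting keeps the width equal to the input/output dimension. The minimal sketch below, with hypothetical function names and a leaky-ReLU nonlinearity as mentioned in the abstract, contrasts the two maps; it is only a toy illustration under these assumptions.

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    """Leaky ReLU, the activation the abstract relies on (invertible for alpha > 0)."""
    return np.where(x > 0, x, alpha * x)

def residual_block(x, W, b, h=0.1):
    """One residual block: x + h * f(x), readable as a forward-Euler step
    of the ODE dx/dt = f(x) with step size h."""
    return x + h * leaky_relu(W @ x + b)

def feedforward_layer(x, W, b):
    """One vanilla feedforward layer whose width equals the input dimension
    (W is square), the regime studied in the paper."""
    return leaky_relu(W @ x + b)

# Toy usage: both maps preserve the state dimension d,
# matching the paper's assumption that width == input/output dimension.
d = 3
rng = np.random.default_rng(0)
x = rng.standard_normal(d)
W = rng.standard_normal((d, d))
b = rng.standard_normal(d)

print(residual_block(x, W, b))     # one Euler step of a neural-ODE-style flow
print(feedforward_layer(x, W, b))  # one plain feedforward layer
```

The paper's contribution, per the abstract, is to show that compositions of the second kind of map can themselves be interpreted as a discretization of a dynamic system via a splitting scheme, rather than only the residual form.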
