Continuous limits of residual neural networks in case of large input data

12/28/2021
by M. Herty et al.

Residual deep neural networks (ResNets) are mathematically described as interacting particle systems. In the case of infinitely many layers, the ResNet leads to a coupled system of ordinary differential equations known as a neural differential equation. For large-scale input data we derive a mean-field limit and show well-posedness of the resulting description. Further, we analyze the existence of solutions to the training process from both a controllability and an optimal control point of view. Numerical investigations based on the solution of a formal optimality system illustrate the theoretical findings.
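To make the ResNet-to-neural-ODE correspondence mentioned in the abstract concrete, the following is a minimal Python sketch: each residual block performs the update x_{k+1} = x_k + h f(x_k, theta_k), which is an explicit Euler step of the continuous dynamics dx/dt = f(x, theta(t)). The choice of f as a tanh layer, the step size h = T/L, and the random weights are illustrative assumptions, not the specific model of the paper.

```python
import numpy as np

def layer_dynamics(x, W, b):
    """Right-hand side f(x, theta) of the continuous-time dynamics (illustrative choice)."""
    return np.tanh(W @ x + b)

def resnet_forward(x0, weights, biases, h):
    """Residual network: each block computes x_{k+1} = x_k + h * f(x_k, theta_k),
    i.e. one explicit Euler step of the neural ODE dx/dt = f(x, theta(t))."""
    x = x0
    for W, b in zip(weights, biases):
        x = x + h * layer_dynamics(x, W, b)
    return x

# As the number of layers L grows (h = T / L -> 0), the discrete layer states
# approach the solution of the ODE on [0, T]; this is the continuum limit
# of infinitely many layers referred to in the abstract.
rng = np.random.default_rng(0)
d, L, T = 4, 100, 1.0
weights = [0.1 * rng.standard_normal((d, d)) for _ in range(L)]
biases = [np.zeros(d) for _ in range(L)]
x_T = resnet_forward(rng.standard_normal(d), weights, biases, h=T / L)
```

The mean-field limit discussed in the paper then concerns the distribution of such trajectories when the number of input data points becomes large, and training corresponds to an optimal control problem for the resulting dynamics.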
