
Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition

01/23/2020
by Lars Grüne, et al.
University of Bayreuth

We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
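The polynomial bound rests on exploiting the structure that the small-gain condition provides. As a minimal sketch of one way such an architecture can look, assume the Lyapunov function decomposes into a sum of terms, each depending only on a low-dimensional subsystem state, V(x) ≈ V_1(x_1) + … + V_m(x_m); then one small subnetwork per term suffices, and the parameter count scales with the number of subsystems rather than exponentially in the state dimension. The class name, layer widths, and the fixed splitting below are illustrative assumptions, not the architecture from the paper, and the training loss that would enforce the Lyapunov conditions is omitted.

```python
# Illustrative sketch (not the authors' implementation): under a small-gain
# condition the Lyapunov function is assumed to decompose as
# V(x) ~ sum_i V_i(x_i), so each low-dimensional term gets its own small
# subnetwork and the total neuron count grows only polynomially in the
# state dimension.
import torch
import torch.nn as nn


class CompositionalLyapunovNet(nn.Module):
    """Sum of small subnetworks, one per low-dimensional subsystem state."""

    def __init__(self, subsystem_dims, hidden_width=32):
        super().__init__()
        self.subsystem_dims = subsystem_dims  # e.g. [2, 2, 3] for a 7-d state
        self.subnets = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d, hidden_width),
                    nn.Tanh(),
                    nn.Linear(hidden_width, hidden_width),
                    nn.Tanh(),
                    nn.Linear(hidden_width, 1),
                )
                for d in subsystem_dims
            ]
        )

    def forward(self, x):
        # Split the full state into subsystem states and sum the scalar terms.
        parts = torch.split(x, self.subsystem_dims, dim=-1)
        # Positivity and the decrease condition along trajectories would be
        # imposed by the (omitted) training loss, not by this architecture.
        return sum(net(z) for net, z in zip(self.subnets, parts))


# A 10-dimensional state made of five 2-dimensional subsystems: the parameter
# count grows linearly with the number of subsystems, not exponentially in 10.
net = CompositionalLyapunovNet([2, 2, 2, 2, 2])
x = torch.randn(64, 10)  # batch of states
V = net(x)               # candidate Lyapunov values, shape (64, 1)
```

Because each subnetwork sees only a fixed low-dimensional input, doubling the number of subsystems merely doubles the parameter count, which is the mechanism behind the polynomial (here linear) growth claimed above.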

