Neural Network Representation of Time Integrators

11/30/2022
by Rainald Löhner et al.

Deep neural network (DNN) architectures are constructed that are the exact equivalent of explicit Runge-Kutta schemes for numerical time integration. The network weights and biases are given, i.e., no training is needed. In this way, the only task left for physics-based integrators is the DNN approximation of the right-hand side. This makes it possible to clearly delineate the approximation estimates for right-hand-side errors and time-integration errors. The architecture required for the integration of a simple mass-damper-stiffness case is included as an example.
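As a rough illustration of the idea described in the abstract (a minimal sketch, not the architecture given in the paper), an explicit Runge-Kutta step can be written as a small "network" whose weights are the Butcher coefficients and are therefore fixed rather than trained; only the right-hand side would be replaced by a DNN surrogate. The function names and the parameters m, c, k, and dt below are illustrative assumptions.

```python
# Minimal sketch: classical RK4 as a fixed-weight "network" for a
# mass-damper-stiffness system.  The stage combinations use frozen
# weights (the Butcher tableau); only the RHS would be a trained DNN.
import numpy as np

# Butcher tableau of classical RK4 (fixed, untrained "weights").
A = np.array([[0.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.0, 0.0],
              [0.0, 0.5, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
b = np.array([1/6, 1/3, 1/3, 1/6])

def mass_damper_stiffness_rhs(y, m=1.0, c=0.2, k=4.0):
    """RHS of m*x'' + c*x' + k*x = 0 as a first-order system y = [x, v]."""
    x, v = y
    return np.array([v, -(c * v + k * x) / m])

def rk_step(f, y, dt):
    """One explicit RK step; stage combinations are fixed linear layers."""
    s = len(b)
    k_stages = np.zeros((s, y.size))
    for i in range(s):
        # Linear combination of earlier stages with frozen weights A[i, :i].
        yi = y + dt * (A[i, :i] @ k_stages[:i])
        k_stages[i] = f(yi)           # the RHS evaluation (a DNN in the paper's setting)
    return y + dt * (b @ k_stages)    # output layer with frozen weights b

# Integrate the oscillator; swapping f for a DNN surrogate of the RHS
# would leave the RK "layers" above untouched.
y, dt = np.array([1.0, 0.0]), 0.01
trajectory = [y]
for _ in range(1000):
    y = rk_step(mass_damper_stiffness_rhs, y, dt)
    trajectory.append(y)
```

In this reading, the time-integration error is fully determined by the fixed Runge-Kutta weights, while the remaining error comes from how well the surrogate approximates the right-hand side, which is the separation of error sources the abstract points to.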


