Modular Neural Ordinary Differential Equations

09/15/2021
by Max Zhu, et al.

The laws of physics have been written in the language of differential equations for centuries. Neural Ordinary Differential Equations (NODEs) are a machine learning architecture that allows such differential equations to be learned from a dataset. They have been applied to classical dynamics simulations in the form of Lagrangian Neural Networks (LNNs) and Second Order Neural Differential Equations (SONODEs). However, these models either cannot represent the most general equations of motion or lack interpretability. In this paper, we propose Modular Neural ODEs, in which each force component is learned by a separate module. We show how physical priors can be easily incorporated into these models. Through a number of experiments, we demonstrate that they achieve better performance, are more interpretable, and gain flexibility from their modularity.
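The core idea of the abstract can be illustrated in code. The following is a minimal sketch, not the paper's implementation: a 1-D second-order system whose total force is a sum of separate component modules (here a hypothetical spring term and a drag term, standing in for the small neural networks a Modular Neural ODE would learn), integrated with a fourth-order Runge-Kutta step. The function and parameter names are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): modular force decomposition
# for 1-D dynamics x'' = F(x, v), where F is a sum of independent modules.

def spring_force(x, v, k=4.0):
    # Conservative component: a physical prior is enforced by letting
    # this module depend on position only.
    return -k * x

def drag_force(x, v, c=0.5):
    # Dissipative component: depends on velocity only.
    return -c * v

MODULES = [spring_force, drag_force]  # hypothetical module list

def total_force(x, v):
    # Modularity: components can be added, removed, or inspected
    # independently, which is what makes the model interpretable.
    return sum(f(x, v) for f in MODULES)

def rk4_step(x, v, dt):
    # One classical RK4 step for the coupled system (x' = v, v' = F).
    def deriv(x, v):
        return v, total_force(x, v)
    k1x, k1v = deriv(x, v)
    k2x, k2v = deriv(x + 0.5 * dt * k1x, v + 0.5 * dt * k1v)
    k3x, k3v = deriv(x + 0.5 * dt * k2x, v + 0.5 * dt * k2v)
    k4x, k4v = deriv(x + dt * k3x, v + dt * k3v)
    x_new = x + dt / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x)
    v_new = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x_new, v_new

# Roll out a damped-oscillator trajectory from rest displacement.
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = rk4_step(x, v, dt=0.01)
```

In a learned version, each entry in `MODULES` would be a small network trained end-to-end through the ODE solver; the per-component structure is what lets priors (e.g. "this force is velocity-independent") be imposed module by module.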

Related research

- 11/25/2017, "Efficiently and easily integrating differential equations with JiTCODE, JiTCDDE, and JiTCSDE": We present a family of Python modules for the numerical integration of o...
- 01/12/2023, "Model-free machine learning of conservation laws from data": We present a machine learning based method for learning first integrals ...
- 12/14/2020, "Bayesian Neural Ordinary Differential Equations": Recently, Neural Ordinary Differential Equations has emerged as a powerf...
- 09/06/2020, "OnsagerNet: Learning Stable and Interpretable Dynamics using a Generalized Onsager Principle": We propose a systematic method for learning stable and interpretable dyn...
- 04/02/2019, "Augmented Neural ODEs": We show that Neural Ordinary Differential Equations (ODEs) learn represe...
- 06/12/2020, "On Second Order Behaviour in Augmented Neural ODEs": Neural Ordinary Differential Equations (NODEs) are a new class of models...
