Thermodynamics-informed graph neural networks

by Quercus Hernandez et al.

In this paper we present a deep learning method to predict the temporal evolution of dissipative dynamical systems. We propose using both geometric and thermodynamic inductive biases to improve the accuracy and generalization of the resulting integration scheme. The first bias is achieved with graph neural networks, which induce a non-Euclidean geometric prior with permutation-invariant node and edge update functions. The second bias is enforced by learning the GENERIC structure of the problem, an extension of the Hamiltonian formalism, to model more general, non-conservative dynamics. Several examples are provided in both Eulerian and Lagrangian descriptions, in the context of fluid and solid mechanics respectively, achieving relative mean errors of less than 3%. Two ablation studies are provided, based on recent works in both physics-informed and geometric deep learning.
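The GENERIC formalism mentioned above evolves a state z as dz/dt = L(z) grad E(z) + M(z) grad S(z), where L is skew-symmetric (reversible, Hamiltonian part), M is symmetric positive semi-definite (irreversible, dissipative part), and the degeneracy conditions L grad S = 0 and M grad E = 0 guarantee energy conservation and non-negative entropy production. The following is a minimal NumPy sketch of this structure on a toy damped oscillator with an internal entropy-like variable; the state layout, energy choices, and friction coefficient are our own illustrative assumptions, not the authors' implementation (which learns L and M with neural networks).

```python
import numpy as np

NU = 0.1  # assumed friction coefficient for the toy example


def grad_E(z):
    # E = q^2/2 + p^2/2 + s  (mechanical energy plus internal energy s)
    q, p, s = z
    return np.array([q, p, 1.0])


def grad_S(z):
    # S = s (the internal variable plays the role of entropy here)
    return np.array([0.0, 0.0, 1.0])


def L_matrix(z):
    # Canonical Poisson (skew-symmetric) matrix acting on (q, p);
    # note L @ grad_S = 0 by construction (first degeneracy condition)
    return np.array([[0.0, 1.0, 0.0],
                     [-1.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0]])


def M_matrix(z):
    # Symmetric positive semi-definite friction matrix, built so that
    # M @ grad_E = 0 holds exactly (second degeneracy condition)
    q, p, s = z
    return NU * np.array([[0.0, 0.0, 0.0],
                          [0.0, 1.0, -p],
                          [0.0, -p, p * p]])


def generic_step(z, dt):
    """One explicit-Euler step of dz/dt = L grad_E + M grad_S."""
    return z + dt * (L_matrix(z) @ grad_E(z) + M_matrix(z) @ grad_S(z))


# Damped oscillator: mechanical energy decays, entropy s grows,
# and total energy E is conserved by the continuous-time dynamics.
z = np.array([1.0, 0.0, 0.0])
for _ in range(5000):
    z = generic_step(z, 1e-3)
```

The resulting equations of motion are dq/dt = p, dp/dt = -q - NU*p, ds/dt = NU*p^2, so dissipation in momentum is exactly balanced by entropy production. In the paper's setting, the same structural constraints are imposed on learned operators rather than hand-coded matrices.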



