Symplectic Recurrent Neural Networks

by Zhengdao Chen et al.

We propose Symplectic Recurrent Neural Networks (SRNNs) as learning algorithms that capture the dynamics of physical systems from observed trajectories. An SRNN models the Hamiltonian function of the system by a neural network and furthermore leverages symplectic integration, multiple-step training and initial state optimization to address the challenging numerical issues associated with Hamiltonian systems. We show SRNNs succeed reliably on complex and noisy Hamiltonian systems. We also show how to augment the SRNN integration scheme in order to handle stiff dynamical systems such as bouncing billiards.
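The symplectic integration step that SRNNs build on can be sketched with the classic leapfrog scheme for a separable Hamiltonian H(q, p) = K(p) + V(q). This minimal example is illustrative only: a known harmonic-oscillator Hamiltonian stands in for the learned neural network, and the function names, step size, and step count are assumptions, not the paper's implementation.

```python
def leapfrog_step(q, p, grad_V, grad_K, dt):
    """One leapfrog step for a separable Hamiltonian H(q, p) = K(p) + V(q).

    The update is symplectic: it preserves phase-space volume, so the
    energy error stays bounded over long rollouts instead of drifting.
    """
    p_half = p - 0.5 * dt * grad_V(q)            # half kick from the potential
    q_next = q + dt * grad_K(p_half)             # full drift from the kinetic term
    p_next = p_half - 0.5 * dt * grad_V(q_next)  # closing half kick
    return q_next, p_next

# Known Hamiltonian standing in for a learned one: the harmonic oscillator
# H(q, p) = p**2 / 2 + q**2 / 2, so grad_V(q) = q and grad_K(p) = p.
grad_V = lambda q: q
grad_K = lambda p: p

q, p = 1.0, 0.0
energy0 = 0.5 * (q**2 + p**2)
for _ in range(10_000):
    q, p = leapfrog_step(q, p, grad_V, grad_K, dt=0.01)
energy = 0.5 * (q**2 + p**2)
# Energy remains close to its initial value even after 10,000 steps.
```

In an SRNN, `grad_V` and `grad_K` would be gradients of neural networks trained so that multi-step leapfrog rollouts match observed trajectories; the integrator itself is unchanged.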


Mastering high-dimensional dynamics with Hamiltonian neural networks

We detail how incorporating physics into neural network design can signi...

Port-Hamiltonian Neural Networks for Learning Explicit Time-Dependent Dynamical Systems

Accurately learning the temporal behavior of dynamical systems requires ...

Learning Trajectories of Hamiltonian Systems with Neural Networks

Modeling of conservative systems with neural networks is an area of acti...

On Robust Classification using Contractive Hamiltonian Neural ODEs

Deep neural networks can be fragile and sensitive to small input perturb...

Port-Hamiltonian Neural Networks with State Dependent Ports

Hybrid machine learning based on Hamiltonian formulations has recently b...

Symplectically Integrated Symbolic Regression of Hamiltonian Dynamical Systems

Here we present Symplectically Integrated Symbolic Regression (SISR), a ...

Constructing Gradient Controllable Recurrent Neural Networks Using Hamiltonian Dynamics

Recurrent neural networks (RNNs) have gained a great deal of attention i...