Disentangling ODE parameters from dynamics in VAEs

08/26/2021
by Stathi Fotiadis, et al.

Deep networks have attracted growing interest for dynamical system prediction, but generalization remains elusive. In this work, we consider the physical parameters of ODEs as factors of variation of the data-generating process. By leveraging ideas from supervised disentanglement in VAEs, we aim to separate the ODE parameters from the dynamics in the latent space. Experiments show that supervised disentanglement allows VAEs to capture the variability in the dynamics and to extrapolate better to ODE parameter spaces that were not present in the training data.
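The idea of supervising part of the latent space with known ODE parameters can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy, not the paper's implementation: the encoder and decoder are stand-in linear maps, the dimensions and loss weights (`beta`, `gamma`) are hypothetical, and the key point is only the loss structure, where the first latent units are regressed onto the known physical parameters while the rest are left free to encode the dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (hypothetical, not from the paper)
x_dim, z_param_dim, z_dyn_dim = 16, 2, 6
z_dim = z_param_dim + z_dyn_dim

# Toy linear "encoder" and "decoder" weights standing in for deep networks
W_enc_mu = rng.normal(0, 0.1, (z_dim, x_dim))
W_enc_logvar = rng.normal(0, 0.1, (z_dim, x_dim))
W_dec = rng.normal(0, 0.1, (x_dim, z_dim))

def vae_loss(x, ode_params, beta=1.0, gamma=1.0):
    """ELBO terms plus a supervised penalty tying the first
    z_param_dim latent units to the known ODE parameters."""
    mu = W_enc_mu @ x
    logvar = W_enc_logvar @ x
    eps = rng.normal(size=z_dim)
    z = mu + np.exp(0.5 * logvar) * eps        # reparameterization trick
    x_hat = W_dec @ z                          # reconstruction of the input

    recon = np.mean((x - x_hat) ** 2)          # reconstruction error
    kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))  # KL to N(0, I)
    # Supervised disentanglement term: the parameter subspace of z
    # must predict the ground-truth ODE parameters.
    supervised = np.mean((z[:z_param_dim] - ode_params) ** 2)
    return recon + beta * kl + gamma * supervised

x = rng.normal(size=x_dim)          # a flattened trajectory snippet
ode_params = np.array([1.5, 0.3])   # e.g. stiffness and damping of an oscillator
loss = vae_loss(x, ode_params)
print(loss)
```

Because the supervised term only touches `z[:z_param_dim]`, gradient descent on this loss pushes the physical parameters into a dedicated latent subspace, leaving the remaining units to model the trajectory itself.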


