Learning Stochastic Dynamical Systems as an Implicit Regularization with Graph Neural Networks

07/12/2023
by Jin Guo, et al.

Stochastic Gumbel graph networks (S-GGNs) are proposed to learn high-dimensional time series whose observed dimensions are often spatially correlated. To that end, the observed randomness and spatial correlations are captured by learning the drift and diffusion terms of a stochastic differential equation, respectively, together with a Gumbel matrix embedding. In particular, this novel framework enables us to investigate the implicit regularization effect of the noise terms in S-GGNs. We provide a theoretical guarantee for the proposed S-GGNs by deriving the difference between the two corresponding loss functions in a small neighborhood of the weights. We then employ the Kuramoto model to generate data and compare the spectral densities of the Hessian matrices of the two loss functions. Experimental results on real-world data demonstrate that S-GGNs exhibit superior convergence, robustness, and generalization compared with state-of-the-art methods.
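The abstract's two main ingredients — an SDE whose drift and diffusion terms are learned, and a latent graph sampled via a Gumbel matrix embedding — can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions: the linear graph-coupled drift, the constant diffusion, and all function names are illustrative choices, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_adjacency(logits, tau=1.0):
    """Sample a soft adjacency matrix with the Gumbel-softmax trick.

    logits: (n, n, 2) unnormalized log-probabilities for each edge
    being absent (index 0) or present (index 1).
    Returns an (n, n) matrix of soft edge weights in [0, 1].
    """
    # Standard Gumbel noise: -log(-log(U)), with small offsets for stability.
    g = -np.log(-np.log(rng.uniform(size=logits.shape) + 1e-20) + 1e-20)
    y = np.exp((logits + g) / tau)
    y /= y.sum(axis=-1, keepdims=True)
    return y[..., 1]  # keep the "edge present" probability mass

def euler_maruyama(x0, drift, diffusion, dt=0.01, steps=100):
    """Simulate dX = drift(X) dt + diffusion(X) dW by Euler-Maruyama."""
    x = x0.copy()
    traj = [x.copy()]
    for _ in range(steps):
        dw = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x = x + drift(x) * dt + diffusion(x) * dw
        traj.append(x.copy())
    return np.stack(traj)

n = 5
logits = rng.normal(size=(n, n, 2))
A = gumbel_softmax_adjacency(logits, tau=0.5)

# Illustrative drift: graph-coupled linear dynamics with self-decay;
# illustrative diffusion: small constant noise on every dimension.
drift = lambda x: A @ x - x
diffusion = lambda x: 0.1 * np.ones_like(x)

x0 = rng.normal(size=n)
traj = euler_maruyama(x0, drift, diffusion)
print(traj.shape)  # (101, 5): initial state plus 100 integration steps
```

In the actual S-GGN setting, the drift and diffusion would be neural networks trained on observed trajectories, and the Gumbel samples would let gradients flow through the discrete graph structure; the sketch above only shows how the two pieces fit together in a forward simulation.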


