Graph Neural Ordinary Differential Equations

11/18/2019
by Michael Poli, et al.

We extend the framework of graph neural networks (GNNs) to continuous time. Graph neural ordinary differential equations (GDEs) are introduced as the counterpart to GNNs in which the input–output relationship is determined by a continuum of GNN layers. The GDE framework is shown to be compatible with the majority of commonly used GNN models with minimal modification to the original formulations. We evaluate the effectiveness of GDEs on both static and dynamic datasets: the results demonstrate their general effectiveness even when the data is not generated by a continuous-time process.
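The core idea of the abstract can be sketched in a few lines: instead of stacking discrete GNN layers, a GDE lets node features evolve under an ODE dH/dt = f(H) whose vector field is a single GNN-style layer, with "depth" replaced by the integration interval. The sketch below is an illustrative assumption, not the paper's implementation: it uses a GCN-style vector field and a plain forward Euler integrator, and all names and parameter values are invented for the example.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gde_euler(H0, A_norm, W, t1=1.0, steps=20):
    """Integrate dH/dt = tanh(A_norm @ H @ W) from t=0 to t=t1 with forward Euler.

    A continuum of GCN layers: each Euler step plays the role of one
    (infinitesimally thin) message-passing layer.
    """
    H = H0.copy()
    dt = t1 / steps
    for _ in range(steps):
        H = H + dt * np.tanh(A_norm @ H @ W)
    return H

# Toy graph: 4 nodes on a path, 3-dimensional node features (illustrative).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H0 = rng.normal(size=(4, 3))
W = 0.1 * rng.normal(size=(3, 3))

H1 = gde_euler(H0, normalized_adjacency(A), W)
print(H1.shape)  # (4, 3)
```

In practice one would replace the Euler loop with an adaptive ODE solver and learn W by backpropagating through (or adjoint-differentiating) the solve; the sketch only shows how a discrete GNN stack turns into a continuous-depth flow.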


Related research

06/22/2021 · Continuous-Depth Neural Models for Dynamic Graph Prediction
  We introduce the framework of continuous-depth graph neural networks (GN...

08/23/2023 · Graph Neural Stochastic Differential Equations
  We present a novel model Graph Neural Stochastic Differential Equations ...

05/31/2022 · Continuous Temporal Graph Networks for Event-Based Graph Data
  There has been an increasing interest in modeling continuous-time dynami...

04/06/2023 · Unconstrained Parametrization of Dissipative and Contracting Neural Ordinary Differential Equations
  In this work, we introduce and study a class of Deep Neural Networks (DN...

02/08/2021 · Enhance Information Propagation for Graph Neural Network by Heterogeneous Aggregations
  Graph neural networks are emerging as continuation of deep learning succ...

02/04/2022 · Graph-Coupled Oscillator Networks
  We propose Graph-Coupled Oscillator Networks (GraphCON), a novel framewo...

06/03/2022 · Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules
  Neural ordinary differential equations (ODEs) have attracted much attent...
