Invertible Neural Networks for Graph Prediction

06/02/2022
by   Chen Xu, et al.

In this work, we address conditional generation using deep invertible neural networks: the problem of inferring the most probable inputs X given outcomes Y. We call our method the invertible graph neural network (iGNN), reflecting its primary focus on generating node features on graph data. A notable feature of our method is that during training we revise the loss objective typically used for normalizing flows and add a Wasserstein-2 regularization term to facilitate training. Algorithmically, we adopt end-to-end training, since our goal is to handle both prediction (the forward process) and generation (the backward process) in a single model. Theoretically, we characterize conditions for the identifiability of the true mapping, the existence and invertibility of the mapping, and the expressiveness of iGNN in learning the mapping. Experimentally, we verify the performance of iGNN on both simulated and real datasets, and demonstrate through extensive numerical experiments that iGNN clearly improves over competing conditional generation benchmarks on high-dimensional and/or non-convex data.
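To illustrate the shape of such an objective, here is a minimal, hypothetical sketch of a normalizing-flow loss with a Wasserstein-2-style regularizer, using a 1-D invertible affine map. The function name `flow_w2_loss`, the affine parameterization, and the weight `lam` are all assumptions for illustration; the paper's iGNN operates on graph node features with a far richer invertible architecture.

```python
import numpy as np

def flow_w2_loss(x, a, b, lam=0.1):
    """Hypothetical sketch: negative log-likelihood of f(x) = a*x + b
    under a standard normal base density, plus a W2-style penalty
    (the squared transport cost ||f(x) - x||^2) weighted by lam."""
    z = a * x + b                       # forward map f(x)
    log_det = np.log(np.abs(a))         # log |det Jacobian| of f
    # change-of-variables NLL: -log p_Z(f(x)) - log|det J|
    nll = 0.5 * z**2 + 0.5 * np.log(2 * np.pi) - log_det
    w2 = lam * (z - x) ** 2             # transport-cost regularizer
    return np.mean(nll + w2)

# Usage: for the identity map (a=1, b=0) the W2 penalty vanishes and
# only the base-density NLL remains.
x = np.random.default_rng(0).normal(size=1000)
print(flow_w2_loss(x, a=1.0, b=0.0))
```

The regularizer discourages the learned map from straying far from the identity in transport cost, which is one plausible reading of how a Wasserstein-2 term stabilizes flow training.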


