Graph Normalizing Flows

05/30/2019
by Jenny Liu et al.

We introduce graph normalizing flows: a new, reversible graph neural network model for prediction and generation. On supervised tasks, graph normalizing flows perform similarly to message passing neural networks, but at a significantly reduced memory footprint, allowing them to scale to larger graphs. In the unsupervised case, we combine graph normalizing flows with a novel graph auto-encoder to create a generative model of graph structures. Our model is permutation-invariant, generating entire graphs with a single feed-forward pass, and achieves results competitive with state-of-the-art auto-regressive models, while being better suited to parallel computing architectures.
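The memory savings come from reversibility: a coupling-style layer updates one half of the node features using messages computed from the other half, so hidden activations can be recomputed during the backward pass instead of stored. The sketch below illustrates this idea with a toy mean-aggregation message function; the function names (`message_pass`, `gnf_forward`, `gnf_inverse`) and the specific message function are illustrative assumptions, not the paper's actual architecture, which uses learned transformations.

```python
import numpy as np

def message_pass(h, adj):
    # Toy message function (an assumption for illustration):
    # mean over neighbors, followed by a tanh nonlinearity.
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    return np.tanh(adj @ h / deg)

def gnf_forward(h1, h2, adj):
    # Additive coupling: each half of the node features is updated
    # with messages computed from the other half, so the map is
    # exactly invertible regardless of what message_pass computes.
    h1_new = h1 + message_pass(h2, adj)
    h2_new = h2 + message_pass(h1_new, adj)
    return h1_new, h2_new

def gnf_inverse(h1_new, h2_new, adj):
    # Invert without storing activations: recompute the messages
    # and subtract them in reverse order.
    h2 = h2_new - message_pass(h1_new, adj)
    h1 = h1_new - message_pass(h2, adj)
    return h1, h2

# Tiny example: a 3-node path graph with 2+2 feature channels per node.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h1, h2 = rng.normal(size=(3, 2)), rng.normal(size=(3, 2))
f1, f2 = gnf_forward(h1, h2, adj)
r1, r2 = gnf_inverse(f1, f2, adj)
assert np.allclose(r1, h1) and np.allclose(r2, h2)
```

Because the inverse recovers the inputs exactly, only the layer outputs need to be kept in memory during training, which is what lets the model scale to larger graphs than a standard message passing network.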


Related research

- 06/13/2023: Vector-Quantized Graph Auto-Encoder. In this work, we address the problem of modeling distributions of grap...
- 06/04/2020: Auto-decoding Graphs. We present an approach to synthesizing new graph structures from empiric...
- 03/25/2022: SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks. While (message-passing) graph neural networks have clear limitations in ...
- 03/02/2021: Autobahn: Automorphism-based Graph Neural Nets. We introduce Automorphism-based graph neural networks (Autobahn), a new ...
- 06/07/2023: Permutation Equivariant Graph Framelets for Heterophilous Semi-supervised Learning. The nature of heterophilous graphs is significantly different from that ...
- 07/19/2022: XG-BoT: An Explainable Deep Graph Neural Network for Botnet Detection and Forensics. In this paper, we propose XG-BoT, an explainable deep graph neural netw...
- 08/30/2021: Adversarial Stein Training for Graph Energy Models. Learning distributions over graph-structured data is a challenging task ...
