Efficient Graph Generation with Graph Recurrent Attention Networks

10/02/2019
by Renjie Liao, et al.

We propose a new family of efficient and expressive deep generative models of graphs, called Graph Recurrent Attention Networks (GRANs). Our model generates graphs one block of nodes and associated edges at a time. The block size and sampling stride allow us to trade off sample quality for efficiency. Compared to previous RNN-based graph generative models, our framework better captures the auto-regressive conditioning between the already-generated and to-be-generated parts of the graph using Graph Neural Networks (GNNs) with attention. This not only reduces the dependency on node ordering but also bypasses the long-term bottleneck caused by the sequential nature of RNNs. Moreover, we parameterize the output distribution per block using a mixture of Bernoulli, which captures the correlations among generated edges within the block. Finally, we propose to handle node orderings in generation by marginalizing over a family of canonical orderings. On standard benchmarks, we achieve state-of-the-art time efficiency and sample quality compared to previous models. Additionally, we show our model is capable of generating large graphs of up to 5K nodes with good quality. To the best of our knowledge, GRAN is the first deep graph generative model that can scale to this size. Our code is released at: https://github.com/lrjconan/GRAN.
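The abstract's key output-distribution idea is a mixture of Bernoullis over the edges generated in each block: each mixture component factorizes over edges, but mixing several components lets the model express correlations among edges that a single factorized Bernoulli cannot. The sketch below illustrates that likelihood computation in pure Python; the function name and the list-based parameterization are illustrative assumptions, not the paper's implementation (which uses GNN-predicted logits).

```python
import math

def mixture_bernoulli_loglik(edges, alpha, theta):
    """Log-likelihood of a binary edge vector under a mixture of Bernoullis.

    edges : list of 0/1 indicators for the candidate edges in one block
    alpha : mixture weights (must sum to 1), one per component
    theta : per-component edge probabilities, theta[k][i] for component k, edge i

    Illustrative sketch only; in GRAN both alpha and theta are predicted
    by the GNN from node representations, not supplied as constants.
    """
    comp = []
    for a, probs in zip(alpha, theta):
        # Each component factorizes over edges (independent Bernoullis).
        ll = math.log(a)
        for e, p in zip(edges, probs):
            ll += math.log(p) if e else math.log(1.0 - p)
        comp.append(ll)
    # Log-sum-exp over components for numerical stability.
    m = max(comp)
    return m + math.log(sum(math.exp(c - m) for c in comp))
```

For example, a two-component mixture with one component favoring "both edges present" and the other "both edges absent" assigns high probability to correlated edge patterns that no single factorized Bernoulli can represent.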


Related research

Building LEGO Using Deep Generative Models of Graphs (12/21/2020)
Generative models are now used to create a variety of high-quality digit...

Edge-based sequential graph generation with recurrent neural networks (01/31/2020)
Graph generation with Machine Learning is an open problem with applicati...

Order Matters: Probabilistic Modeling of Node Sequence for Graph Generation (06/11/2021)
A graph generative model defines a distribution over graphs. One type of...

TIGGER: Scalable Generative Modelling for Temporal Interaction Graphs (03/07/2022)
There has been a recent surge in learning generative models for graphs. ...

NVDiff: Graph Generation through the Diffusion of Node Vectors (11/19/2022)
Learning to generate graphs is challenging as a graph is a set of pairwi...

GraphGen-Redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation (07/18/2021)
The problem of labeled graph generation is gaining attention in the Deep...

DeepNC: Deep Generative Network Completion (07/17/2019)
Most network data are collected from only partially observable networks ...
