Autoregressive Models for Sequences of Graphs

03/18/2019
by Daniele Zambon, et al.

This paper proposes an autoregressive (AR) model for sequences of graphs, which generalises traditional AR models. A first novelty consists in formalising the AR model for a very general family of graphs, characterised by a variable topology and by attributes associated with nodes and edges. A graph neural network (GNN) is also proposed to learn the AR function associated with the graph-generating process (GGP) and, subsequently, to predict the next graph in a sequence. The proposed method is compared with four baselines on synthetic GGPs, showing significantly better performance on all considered problems.
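As a rough illustration of the generalisation (the notation below is assumed for this sketch, not quoted from the paper): a classical AR(p) process over real-valued observations takes the form

\[ x_t = \sum_{i=1}^{p} a_i\, x_{t-i} + \epsilon_t, \]

whereas a graph-valued counterpart replaces the linear combination with an AR function \(\phi\) acting directly on graphs,

\[ g_t = H\bigl(\phi(g_{t-1}, \dots, g_{t-p}),\, \eta_t\bigr), \]

where each \(g_t\) is a graph with variable topology and node/edge attributes, \(\eta_t\) is a graph-valued noise term, and \(H\) combines the prediction with the noise. Under this reading, the GNN is trained to approximate \(\phi\), so that the next graph can be predicted from the \(p\) most recent graphs in the sequence.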
