Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

04/03/2018
by   Kun Xu, et al.

The celebrated Sequence-to-Sequence (Seq2Seq) model and its fruitful variants are powerful models that achieve excellent performance on tasks mapping sequences to sequences. However, there are many machine learning tasks whose inputs are naturally represented in the form of graphs, which poses significant challenges for existing Seq2Seq models: converting the graph form to a sequence without information loss. In this work, we present a general end-to-end graph-to-sequence approach that maps an input graph to a sequence of vectors and then uses an attention-based LSTM to decode the target sequence from these vectors. Specifically, to address the otherwise inevitable information loss of this data conversion, we introduce a novel graph-to-sequence neural network model that follows the encoder-decoder architecture. Our method first uses an improved graph-based neural network to generate node and graph embeddings, employing a novel aggregation strategy that incorporates edge-direction information into the node embeddings. We also propose an attention-based mechanism that aligns node embeddings with the decoded sequence to better cope with large graphs. Experimental results on the bAbI task, the Shortest Path task, and a Natural Language Generation task demonstrate that our model achieves state-of-the-art performance and significantly outperforms other baselines. We also show that, with the proposed aggregation strategy, our model converges quickly to good performance.
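The direction-aware aggregation described above can be illustrated with a minimal NumPy sketch (not the authors' code): neighbor information is pooled separately along forward (outgoing) and backward (incoming) edges, and the two results are concatenated so each node embedding reflects edge direction. Mean aggregation, max pooling for the graph embedding, and the fixed hop count are simplifications for illustration; the paper uses learned aggregators and an attention-based LSTM decoder.

```python
import numpy as np

def aggregate(node_vecs, edges, num_hops=2):
    """Toy bidirectional neighborhood aggregation.

    node_vecs: (N, D) array of initial node features.
    edges:     list of (src, dst) directed pairs.
    Returns direction-aware node embeddings and a pooled graph embedding.
    """
    n = node_vecs.shape[0]
    fwd = [[] for _ in range(n)]  # out-neighbors of each node
    bwd = [[] for _ in range(n)]  # in-neighbors of each node
    for s, t in edges:
        fwd[s].append(t)
        bwd[t].append(s)

    h = node_vecs
    for _ in range(num_hops):
        d = h.shape[1]
        # Pool neighbors reached via forward edges and via backward
        # edges separately (zeros when a node has no such neighbors).
        h_fwd = np.stack([h[fwd[i]].mean(axis=0) if fwd[i] else np.zeros(d)
                          for i in range(n)])
        h_bwd = np.stack([h[bwd[i]].mean(axis=0) if bwd[i] else np.zeros(d)
                          for i in range(n)])
        # Concatenating keeps the two directions distinguishable.
        h = np.concatenate([h_fwd, h_bwd], axis=1)

    # Graph-level embedding via element-wise max pooling over nodes.
    g = h.max(axis=0)
    return h, g

# Tiny chain graph 0 -> 1 -> 2 with one-hot node features.
node_vecs = np.eye(3)
h, g = aggregate(node_vecs, [(0, 1), (1, 2)], num_hops=2)
```

In this sketch the embedding width doubles per hop (D, 2D, 4D, ...) because of the concatenation; a learned model would typically project back down to a fixed width after each hop.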


research
09/14/2016

Neural Machine Transliteration: Preliminary Results

Machine transliteration is the process of automatically transforming the...
research
06/09/2020

Graph-Aware Transformer: Is Attention All Graphs Need?

Graphs are the natural data structure to represent relational and struct...
research
05/18/2023

Attention-based Encoder-Decoder Network for End-to-End Neural Speaker Diarization with Target Speaker Attractor

This paper proposes a novel Attention-based Encoder-Decoder network for ...
research
05/09/2018

A Click Sequence Model for Web Search

Getting a better understanding of user behavior is important for advanci...
research
04/22/2020

Supervised Grapheme-to-Phoneme Conversion of Orthographic Schwas in Hindi and Punjabi

Hindi grapheme-to-phoneme (G2P) conversion is mostly trivial, with one e...
research
05/26/2020

How to Grow a (Product) Tree: Personalized Category Suggestions for eCommerce Type-Ahead

In an attempt to balance precision and recall in the search page, leadin...
research
03/04/2023

Seq-HyGAN: Sequence Classification via Hypergraph Attention Network

Sequence classification has a wide range of real-world applications in d...
