Self-supervised Graph Masking Pre-training for Graph-to-Text Generation

10/19/2022
by Jiuzhou Han et al.

Large-scale pre-trained language models (PLMs) have advanced Graph-to-Text (G2T) generation by processing a linearised version of the graph. However, linearisation is known to discard structural information. Moreover, PLMs are typically pre-trained on free text, which introduces a domain mismatch between pre-training and downstream G2T generation tasks. To address these shortcomings, we propose graph masking pre-training strategies that neither require supervision signals nor adjust the architecture of the underlying pre-trained encoder-decoder model. When used with a pre-trained T5, our approach achieves new state-of-the-art results on the WebNLG+2020 and EventNarrative G2T generation datasets. Our method also proves highly effective in the low-resource setting.
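To make the idea concrete, here is a minimal sketch of how component-level graph masking could be paired with T5's span-denoising objective. The <H>/<R>/<T> linearisation markers, the mask_graph_components helper, and the masking rate are illustrative assumptions rather than the authors' exact recipe.

import random

# T5 sentinel tokens used for span denoising
SENTINELS = [f"<extra_id_{i}>" for i in range(100)]

def linearise(graph):
    """Linearise (head, relation, tail) triples into a flat token sequence."""
    tokens = []
    for head, rel, tail in graph:
        tokens += ["<H>", head, "<R>", rel, "<T>", tail]
    return tokens

def mask_graph_components(graph, mask_prob=0.15, seed=None):
    """Build a T5-style denoising pair by masking whole graph components
    (entities or relations) in the linearised graph, leaving the
    <H>/<R>/<T> structure markers intact."""
    rng = random.Random(seed)
    source, target = [], []
    sentinel = 0
    for tok in linearise(graph):
        is_component = tok not in ("<H>", "<R>", "<T>")
        if is_component and sentinel < len(SENTINELS) - 1 and rng.random() < mask_prob:
            source.append(SENTINELS[sentinel])          # replace component with sentinel
            target += [SENTINELS[sentinel], tok]        # target recovers the component
            sentinel += 1
        else:
            source.append(tok)
    target.append(SENTINELS[sentinel])  # closing sentinel, per T5 convention
    return " ".join(source), " ".join(target)

graph = [("Alan Bean", "occupation", "astronaut"),
         ("Alan Bean", "mission", "Apollo 12")]
src, tgt = mask_graph_components(graph, mask_prob=0.3, seed=0)
print(src)  # masked linearised graph (model input)
print(tgt)  # masked-out components (model target)

Masking whole entities and relations rather than arbitrary subword spans keeps the denoising objective graph-aware while leaving the encoder-decoder architecture untouched, consistent with the abstract's claim that no architectural changes are required.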


Related research

03/15/2022 - Graph Pre-training for AMR Parsing and Generation
  Abstract meaning representation (AMR) highlights the core semantic infor...

04/27/2022 - DialogVED: A Pre-trained Latent Variable Encoder-Decoder Model for Dialog Response Generation
  Dialog response generation in open domain is an important research topic...

12/20/2022 - Pre-trained Language Models for Keyphrase Generation: A Thorough Empirical Study
  Neural models that do not rely on pre-training have excelled in the keyp...

07/31/2017 - Low-Resource Neural Headline Generation
  Recent neural headline generation models have shown great results, but a...

06/19/2021 - JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs
  Existing pre-trained models for knowledge-graph-to-text (KG-to-text) gen...

11/25/2019 - Importance-Aware Learning for Neural Headline Editing
  Many social media news writers are not professionally trained. Therefore...

09/12/2019 - UER: An Open-Source Toolkit for Pre-training Models
  Existing works, including ELMO and BERT, have revealed the importance of...
