Structural Adapters in Pretrained Language Models for AMR-to-text Generation

03/16/2021
by Leonardo F. R. Ribeiro, et al.

Previous work on text generation from graph-structured data relies on pretrained language models (PLMs) and utilizes graph linearization heuristics rather than explicitly considering the graph structure. Efficiently encoding the graph structure in PLMs is challenging because they were pretrained on natural language, and modeling structured data may lead to catastrophic forgetting of distributional knowledge. In this paper, we propose StructAdapt, an adapter method to encode graph structure into PLMs. In contrast to prior work, StructAdapt effectively models interactions among the nodes based on the graph connectivity, training only graph structure-aware adapter parameters. In this way, we avoid catastrophic forgetting while preserving the topological structure of the graph. We empirically show the benefits of explicitly encoding graph structure into PLMs using adapters, achieving state-of-the-art results on two AMR-to-text datasets while training only 5.1% of the PLM parameters.
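As a rough illustration of the idea, the PyTorch sketch below shows an adapter whose bottleneck projection aggregates node features over the graph adjacency before projecting back, with a residual connection so the frozen PLM's representations pass through unchanged. All class and parameter names here are assumptions for illustration, not the authors' released code, and a plain GCN step stands in for the paper's graph convolution.

```python
import torch
import torch.nn as nn


class StructAdapter(nn.Module):
    """Minimal sketch of a structure-aware adapter (illustrative, hypothetical).

    The usual adapter down-projection is combined with a graph-convolution
    step over the AMR graph's adjacency, so node representations interact
    only along graph edges while the host PLM stays frozen.
    """

    def __init__(self, hidden_size: int, bottleneck: int):
        super().__init__()
        self.layer_norm = nn.LayerNorm(hidden_size)
        self.graph_down = nn.Linear(hidden_size, bottleneck)  # shared GCN weight
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, num_nodes, hidden_size)
        # adj:    (batch, num_nodes, num_nodes), row-normalized adjacency of
        #         the input graph, with self-loops added.
        x = self.layer_norm(hidden)
        x = torch.bmm(adj, self.graph_down(x))  # aggregate neighbors (GCN step)
        x = self.up(self.act(x))
        return hidden + x  # residual: the frozen PLM representation passes through


# Only the adapter parameters are trained; the PLM itself is kept frozen, e.g.:
# for p in plm.parameters():
#     p.requires_grad = False
# for p in adapter.parameters():
#     p.requires_grad = True
```

Freezing the PLM and updating only these small structure-aware modules is what keeps the trainable-parameter count low and avoids overwriting the distributional knowledge acquired during pretraining.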


Related research

07/16/2020 · Investigating Pretrained Language Models for Graph-to-Text Generation
Graph-to-text generation, a subtask of data-to-text generation, aims to ...

09/15/2022 · Graph-to-Text Generation with Dynamic Structure Pruning
Most graph-to-text works are built on the encoder-decoder framework with...

10/23/2018 · Deep Graph Convolutional Encoders for Structured Data to Text Generation
Most previous work on neural text generation from graph-structured data ...

12/31/2020 · Promoting Graph Awareness in Linearized Graph-to-Text Generation
Generating text from structured inputs, such as meaning representations ...

06/15/2012 · Improving the Asymmetric TSP by Considering Graph Structure
Recent works on cost based relaxations have improved Constraint Programm...

02/02/2022 · Understanding Knowledge Integration in Language Models with Graph Convolutions
Pretrained language models (LMs) do not capture factual knowledge very w...

02/12/2023 · Investigating the Effect of Relative Positional Embeddings on AMR-to-Text Generation with Structural Adapters
Text generation from Abstract Meaning Representation (AMR) has substanti...
