Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing

10/29/2021
by Jiawei Zhou, et al.

Predicting linearized Abstract Meaning Representation (AMR) graphs using pre-trained sequence-to-sequence Transformer models has recently led to large improvements on AMR parsing benchmarks. These parsers are simple and avoid explicit modeling of structure but lack desirable properties such as graph well-formedness guarantees or built-in graph-sentence alignments. In this work we explore the integration of general pre-trained sequence-to-sequence language models and a structure-aware transition-based approach. We depart from a pointer-based transition system and propose a simplified transition set, designed to better exploit pre-trained language models for structured fine-tuning. We also explore modeling the parser state within the pre-trained encoder-decoder architecture and different vocabulary strategies for the same purpose. We provide a detailed comparison with recent progress in AMR parsing and show that the proposed parser retains the desirable properties of previous transition-based approaches, while being simpler and reaching the new parsing state of the art for AMR 2.0, without the need for graph re-categorization.
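To make the approach concrete, below is a minimal sketch of the core recipe the abstract describes: treat transition actions as target-side tokens of a pre-trained sequence-to-sequence model and fine-tune it to emit an action sequence that a deterministic state machine can replay into a well-formed graph. Everything in the sketch (the action inventory, the oracle action sequence, and the choice of "facebook/bart-large" as the base checkpoint) is an illustrative assumption, not the authors' released code.

```python
# Sketch: fine-tune a pre-trained seq2seq model to predict transition
# actions instead of a linearized graph. The action set below is
# hypothetical; a real system derives oracle actions from aligned AMRs.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# Hypothetical simplified transition set: cursor moves, node-generating
# actions, and edge actions whose pointer targets are realized as tokens.
actions = ["<SHIFT>", "<COPY>", "<NODE(boy)>", "<NODE(want-01)>",
           "<LA(:ARG0)>", "<RA(:ARG1)>", "<PTR(0)>", "<PTR(1)>"]
tokenizer.add_tokens(actions)
model.resize_token_embeddings(len(tokenizer))  # embeddings for new actions

# One (sentence, oracle action sequence) training pair.
src = tokenizer("The boy wants to sleep", return_tensors="pt")
tgt = tokenizer("<SHIFT> <NODE(boy)> <SHIFT> <NODE(want-01)> "
                "<LA(:ARG0)> <PTR(0)> <SHIFT> <SHIFT>",
                return_tensors="pt")

# Standard fine-tuning step: cross-entropy over the extended vocabulary.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
out = model(input_ids=src.input_ids,
            attention_mask=src.attention_mask,
            labels=tgt.input_ids)
out.loss.backward()
optimizer.step()
```

At decoding time, structure-awareness would come from masking the logits so that only actions valid in the current parser state can be generated; any sequence accepted under such a mask replays into a well-formed graph, which is the guarantee plain linearized-graph decoders lack.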


Related research

10/05/2020 · Improving AMR Parsing with Sequence-to-Sequence Pre-training
In the literature, the research on abstract meaning representation (AMR)...

05/26/2023 · Slide, Constrain, Parse, Repeat: Synchronous Sliding Windows for Document AMR Parsing
The sliding window approach provides an elegant way to handle contexts o...

09/09/2021 · Graph-Based Decoding for Task Oriented Semantic Parsing
The dominant paradigm for semantic parsing in recent years is to formula...

06/13/2022 · Transition-based Abstract Meaning Representation Parsing with Contextual Embeddings
The ability to understand and generate languages sets human cognition ap...

03/15/2021 · A Transition-based Parser for Unscoped Episodic Logical Forms
"Episodic Logic: Unscoped Logical Form" (EL-ULF) is a semantic representa...

12/15/2021 · Learning to Transpile AMR into SPARQL
We propose a transition-based system to transpile Abstract Meaning Repre...

10/15/2021 · Hierarchical Curriculum Learning for AMR Parsing
Abstract Meaning Representation (AMR) parsing translates sentences to th...
