AMR Parsing as Sequence-to-Graph Transduction

05/21/2019
by   Sheng Zhang, et al.

We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our proposed parser is aligner-free and can be effectively trained with limited amounts of labeled AMR data. Our experimental results outperform all previously reported SMATCH scores on both AMR 2.0 (76.3% F1) and AMR 1.0 (70.2% F1).
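The sequence-to-graph idea can be illustrated with a toy sketch: at each decoding step the parser emits a concept node and attaches it to one previously generated node, so the graph is built incrementally rather than via a separately aligned derivation. The scoring function below is a dummy stand-in, not the paper's attention-based model, and all names (`attach_score`, `decode`) are illustrative assumptions.

```python
# Illustrative sketch of sequence-to-graph transduction (NOT the paper's model):
# each decoding step emits a concept node and picks a head among the nodes
# generated so far, growing the graph one node and one edge at a time.

def attach_score(head, node):
    # Toy head-selection score: prefer attaching to the most recent node.
    # A real parser would score (head, node) pairs with learned attention.
    return head["step"]

def decode(concepts):
    """Greedily build a graph: each concept becomes a node whose head is
    the highest-scoring previously generated node (the first node is the root)."""
    nodes, edges = [], []
    for step, concept in enumerate(concepts):
        node = {"step": step, "concept": concept}
        if nodes:  # every non-root node attaches to some earlier node
            head = max(nodes, key=lambda h: attach_score(h, node))
            edges.append((head["step"], node["step"]))
        nodes.append(node)
    return nodes, edges

nodes, edges = decode(["want-01", "boy", "go-02"])
# With the toy score, every node attaches to the most recently emitted node.
```

The key property this sketch shares with the paper's formulation is that graph construction is driven purely by the decoder's own predictions, with no externally supplied word-to-node alignments.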


Related research

09/05/2019
Broad-Coverage Semantic Parsing as Transduction
We unify different broad-coverage semantic parsing tasks under a transdu...

04/12/2020
AMR Parsing via Graph-Sequence Iterative Inference
We propose a new end-to-end model that treats AMR parsing as a series of...

09/09/2021
Graph-Based Decoding for Task Oriented Semantic Parsing
The dominant paradigm for semantic parsing in recent years is to formula...

09/12/2018
Knowledge-Aware Conversational Semantic Parsing Over Web Tables
Conversational semantic parsing over tables requires knowledge acquiring...

10/05/2020
Improving AMR Parsing with Sequence-to-Sequence Pre-training
In the literature, the research on abstract meaning representation (AMR)...

10/06/2020
On the Role of Supervision in Unsupervised Constituency Parsing
We analyze several recent unsupervised constituency parsing models, whic...

07/25/2023
Holistic Exploration on Universal Decompositional Semantic Parsing: Architecture, Data Augmentation, and LLM Paradigm
In this paper, we conduct a holistic exploration of the Universal Decomp...
