Addressing the Data Sparsity Issue in Neural AMR Parsing

02/16/2017
by Xiaochang Peng, et al.

Neural attention models have achieved great success in different NLP tasks. However, they have not fulfilled their promise on the AMR parsing task due to the data sparsity issue. In this paper, we describe a sequence-to-sequence model for AMR parsing and present different ways to tackle the data sparsity problem. We show that our methods achieve significant improvement over a baseline neural attention model and our results are also competitive against state-of-the-art systems that do not use extra linguistic resources.
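For orientation, the sketch below shows a generic encoder-decoder with soft attention of the kind the abstract refers to. It is a minimal illustration, not the authors' implementation: the layer choices, hyperparameters, and toy vocabulary sizes are assumptions, and AMR-specific steps such as graph linearization and the paper's sparsity-reduction techniques are omitted.

```python
# Minimal sketch of an attention-based sequence-to-sequence model (PyTorch).
# All sizes below are illustrative placeholders, not the paper's configuration.
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(emb_dim + 2 * hid_dim, 2 * hid_dim)
        self.attn = nn.Linear(2 * hid_dim, 2 * hid_dim, bias=False)
        self.out = nn.Linear(4 * hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source token sequence with a bidirectional LSTM.
        enc_states, _ = self.encoder(self.src_emb(src))        # (B, S, 2H)
        batch = src.size(0)
        h = enc_states.new_zeros(batch, enc_states.size(-1))   # decoder hidden state
        c = enc_states.new_zeros(batch, enc_states.size(-1))   # decoder cell state
        context = enc_states.mean(dim=1)                       # initial context vector
        logits = []
        for t in range(tgt.size(1)):
            # Teacher forcing: feed the previous gold target token plus context.
            step_in = torch.cat([self.tgt_emb(tgt[:, t]), context], dim=-1)
            h, c = self.decoder(step_in, (h, c))
            # Soft attention over encoder states.
            scores = torch.bmm(enc_states, self.attn(h).unsqueeze(-1)).squeeze(-1)
            weights = torch.softmax(scores, dim=-1)            # (B, S)
            context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
            logits.append(self.out(torch.cat([h, context], dim=-1)))
        return torch.stack(logits, dim=1)                      # (B, T, tgt_vocab)

# Usage with toy tensors:
model = Seq2SeqAttention(src_vocab=5000, tgt_vocab=3000)
src = torch.randint(0, 5000, (2, 12))   # two source sentences of length 12
tgt = torch.randint(0, 3000, (2, 20))   # two linearized target sequences of length 20
print(model(src, tgt).shape)            # torch.Size([2, 20, 3000])
```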

Related research:

- Structural generalization is hard for sequence-to-sequence models (10/24/2022): Sequence-to-sequence (seq2seq) models have been successful across many N...
- Neural AMR: Sequence-to-Sequence Models for Parsing and Generation (04/26/2017): Sequence-to-sequence models have shown strong performance across a broad...
- O(n) Connections are Expressive Enough: Universal Approximability of Sparse Transformers (06/08/2020): Transformer networks use pairwise attention to compute contextual embedd...
- Unleashing the True Potential of Sequence-to-Sequence Models for Sequence Tagging and Structure Parsing (02/05/2023): Sequence-to-Sequence (S2S) models have achieved remarkable success on va...
- The Impact of Edge Displacement Vaserstein Distance on UD Parsing Performance (09/15/2022): We contribute to the discussion on parsing performance in NLP by introdu...
- On Structured Sparsity of Phonological Posteriors for Linguistic Parsing (01/21/2016): The speech signal conveys information on different time scales from shor...
- A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks (11/05/2016): Transfer and multi-task learning have traditionally focused on either a ...
