Encoder-Decoder Shift-Reduce Syntactic Parsing

06/24/2017
by Jiangming Liu, et al.

Starting from neural machine translation (NMT), encoder-decoder neural networks have been applied to many NLP problems. Graph-based models and transition-based models that borrow the encoder component achieve state-of-the-art performance on dependency parsing and constituent parsing, respectively. However, there has been no work empirically studying encoder-decoder neural networks for transition-based parsing. We apply a simple encoder-decoder to this end, achieving comparable results to the parser of Dyer et al. (2015) on standard dependency parsing, and outperforming the parser of Vinyals et al. (2015) on constituent parsing.
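
To make the setting concrete, here is a minimal sketch (in Python, not from the paper) of the arc-standard transition system that shift-reduce dependency parsers are typically built on; the function name and example sentence are illustrative assumptions. In the paper's encoder-decoder setup, the decoder would predict each such action in turn, conditioned on the encoded sentence, rather than receive a gold sequence.

def parse(words, actions):
    """Apply shift-reduce actions to derive head indices for each token.

    words:   tokens, indexed from 1 (index 0 is the artificial root).
    actions: sequence of "SHIFT", "LEFT-ARC", or "RIGHT-ARC" strings.
    Returns a dict mapping each token index to its head's index.
    """
    stack, buffer = [0], list(range(1, len(words) + 1))
    heads = {}
    for action in actions:
        if action == "SHIFT":        # move the next buffer token onto the stack
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":   # second-top stack token becomes dependent of top
            dependent = stack.pop(-2)
            heads[dependent] = stack[-1]
        elif action == "RIGHT-ARC":  # top stack token becomes dependent of second-top
            dependent = stack.pop()
            heads[dependent] = stack[-1]
    return heads

# "He sleeps": shift both tokens, attach "He" to "sleeps", then "sleeps" to root.
print(parse(["He", "sleeps"], ["SHIFT", "SHIFT", "LEFT-ARC", "RIGHT-ARC"]))
# {1: 2, 2: 0}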


Related research

Deep Biaffine Attention for Neural Dependency Parsing (11/06/2016)
This paper builds off recent work from Kiperwasser & Goldberg (2016) usi...

Transition-based Semantic Dependency Parsing with Pointer Networks (05/27/2020)
Transition-based parsers implemented with Pointer Networks have become t...

Shift-Reduce Constituent Parsing with Neural Lookahead Features (12/02/2016)
Transition-based models can be fast and accurate for constituent parsing...

Hierarchical Pointer Net Parsing (08/30/2019)
Transition-based top-down parsing with pointer networks has achieved sta...

Rethinking Self-Attention: An Interpretable Self-Attentive Encoder-Decoder Parser (11/10/2019)
Attention mechanisms have improved the performance of NLP tasks while pr...

Something Old, Something New: Grammar-based CCG Parsing with Transformer Models (09/21/2021)
This report describes the parsing problem for Combinatory Categorial Gra...

Technical notes: Syntax-aware Representation Learning With Pointer Networks (03/17/2019)
This is a work-in-progress report, which aims to share preliminary resul...
