Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set

08/30/2017
by   Tianze Shi, et al.

We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features. We plug our minimal feature set into the dynamic-programming framework of Huang and Sagae (2010) and Kuhlmann et al. (2011) to produce the first implementation of worst-case O(n^3) exact decoders for arc-hybrid and arc-eager transition systems. With our minimal features, we also present O(n^3) global training methods. Finally, using ensembles including our new parsers, we achieve the best unlabeled attachment score reported (to our knowledge) on the Chinese Treebank and the "second-best-in-class" result on the English Penn Treebank.
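To make the setting concrete, here is a minimal sketch of the arc-hybrid transition system (Kuhlmann et al., 2011) that the abstract builds on. The bi-directional LSTM feature scoring and the O(n^3) dynamic-programming exact decoder are omitted; instead, transitions are driven by a simple static oracle against gold heads, purely for illustration. The `parse` function and its sentence encoding (1-indexed words, head 0 for the root) are assumptions of this sketch, not the paper's implementation.

```python
# Arc-hybrid transition system sketch: a configuration is (stack, buffer, arcs).
# SHIFT moves the buffer front onto the stack; LEFT-ARC attaches the stack top
# to the buffer front; RIGHT-ARC attaches the stack top to the item below it.

def parse(gold_heads):
    """Parse a projective sentence given gold heads (1-indexed words;
    head 0 means the artificial root). Returns (head, dependent) arcs
    recovered by oracle-driven transitions."""
    n = len(gold_heads)
    stack = [0]                      # 0 is the artificial root
    buffer = list(range(1, n + 1))   # word indices 1..n
    arcs = []

    def head(i):                     # gold head of word i (1-indexed)
        return gold_heads[i - 1]

    while buffer or len(stack) > 1:
        s0 = stack[-1] if stack else None
        b0 = buffer[0] if buffer else None
        # LEFT-ARC: (sigma|s0, b0|beta) -> (sigma, b0|beta), arc b0 -> s0
        if s0 not in (None, 0) and b0 is not None and head(s0) == b0:
            arcs.append((b0, stack.pop()))
        # RIGHT-ARC: (sigma|s1|s0, beta) -> (sigma|s1, beta), arc s1 -> s0,
        # taken only once s0 has collected all of its own dependents
        elif (len(stack) >= 2 and head(s0) == stack[-2]
              and all(head(j) != s0 for j in buffer)):
            arcs.append((stack[-2], stack.pop()))
        # SHIFT: (sigma, b0|beta) -> (sigma|b0, beta)
        else:
            stack.append(buffer.pop(0))
    return sorted(arcs)

# "She ate fish": heads = [2, 0, 2] (ate is the root; She, fish depend on ate)
print(parse([2, 0, 2]))  # -> [(0, 2), (2, 1), (2, 3)]
```

The paper's contribution is to replace this greedy, oracle-driven loop with exact dynamic-programming decoding over the same transition system, made tractable in worst-case O(n^3) by the minimal feature set.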


Related research

- Global Transition-based Non-projective Dependency Parsing (07/04/2018) — Shi, Huang, and Lee (2017) obtained state-of-the-art results for English...
- Incremental Parsing with Minimal Features Using Bi-Directional LSTM (06/21/2016) — Recently, neural network approaches for parsing have largely automated t...
- Greedy Transition-Based Dependency Parsing with Discrete and Continuous Supertag Features (07/09/2020) — We study the effect of rich supertag features in greedy transition-based...
- Elimination of Spurious Ambiguity in Transition-Based Dependency Parsing (06/28/2012) — We present a novel technique to remove spurious ambiguity from transitio...
- Hybrid Oracle: Making Use of Ambiguity in Transition-based Chinese Dependency Parsing (11/28/2017) — In the training of transition-based dependency parsers, an oracle is use...
- Improving Coverage and Runtime Complexity for Exact Inference in Non-Projective Transition-Based Dependency Parsers (04/27/2018) — We generalize Cohen, Gómez-Rodríguez, and Satta's (2011) parser to a fam...
- Bi-directional Attention with Agreement for Dependency Parsing (08/06/2016) — We develop a novel bi-directional attention model for dependency parsing...
