Dependency Parsing with LSTMs: An Empirical Evaluation

04/22/2016
by Adhiguna Kuncoro, et al.

We propose a transition-based dependency parser using Recurrent Neural Networks with Long Short-Term Memory (LSTM) units. This extends the feedforward neural network parser of Chen and Manning (2014) and enables modelling of entire sequences of shift/reduce transition decisions. On the Google Web Treebank, our LSTM parser is competitive with the best feedforward parser on overall accuracy, and notably achieves an improvement of more than 3% on long-range dependencies, which have proved difficult for previous transition-based parsers owing to error propagation and limited context information. Our findings additionally suggest that dropout regularisation on the embedding layer is crucial for improving the LSTM's generalisation.
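To make the "sequences of shift/reduce transition decisions" concrete, here is a minimal sketch of the arc-standard transition system that such parsers score with a neural network. The toy sentence, labels, and oracle transition sequence are illustrative assumptions, not taken from the paper; a real parser would predict each transition with the LSTM rather than follow a gold sequence.

```python
def parse(words, transitions):
    """Apply a sequence of arc-standard transitions.

    words: list of tokens; index 0 is an artificial ROOT.
    transitions: tuples ("SHIFT",), ("LEFT", label), or ("RIGHT", label).
    Returns arcs as (head_index, label, dependent_index) triples.
    """
    stack, buffer, arcs = [0], list(range(1, len(words))), []
    for t in transitions:
        if t[0] == "SHIFT":        # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif t[0] == "LEFT":       # second-from-top is a dependent of the top
            dep = stack.pop(-2)
            arcs.append((stack[-1], t[1], dep))
        elif t[0] == "RIGHT":      # top is a dependent of the second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], t[1], dep))
    return arcs

# Hypothetical oracle sequence for "she ate fish":
words = ["ROOT", "she", "ate", "fish"]
gold = [("SHIFT",), ("SHIFT",), ("LEFT", "nsubj"),   # she <- ate
        ("SHIFT",), ("RIGHT", "obj"),                # ate -> fish
        ("RIGHT", "root")]                           # ROOT -> ate
print(parse(words, gold))
# → [(2, 'nsubj', 1), (2, 'obj', 3), (0, 'root', 2)]
```

An LSTM parser in this style replaces the oracle with a network that reads the parser state (and the history of transitions) and outputs the next action, which is what lets it condition on the entire decision sequence rather than a fixed feature window.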


Related research

- Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs (08/04/2015)
- A Simple LSTM Model for Transition-based Dependency Parsing (08/29/2017)
- Arc-swift: A Novel Transition System for Dependency Parsing (05/12/2017)
- Transition-Based Dependency Parsing with Stack Long Short-Term Memory (05/29/2015)
- Non-Projective Dependency Parsing with Non-Local Transitions (10/25/2017)
- Recursive Subtree Composition in LSTM-Based Dependency Parsing (02/26/2019)
- Structured Training for Neural Network Transition-Based Parsing (06/19/2015)
