A Simple LSTM model for Transition-based Dependency Parsing

08/29/2017
by Mohab Elkaref et al.

We present a simple LSTM-based transition-based dependency parser. Our model replaces the hidden layer of the usual feed-forward network architecture with a single LSTM hidden layer. We also propose a new initialization method that uses the pre-trained weights of a feed-forward neural network to initialize our LSTM-based model, and we show that applying dropout to the input layer has a positive effect on performance. Our final parser achieves a 93.06 unlabeled and 91.01 labeled attachment score. We additionally replace the LSTMs with GRUs and Elman units in our model, and explore the effectiveness of our initialization method on the individual gates constituting all three types of RNN units.
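The abstract does not spell out how feed-forward weights are mapped onto LSTM gates, so the sketch below shows one plausible mapping, assuming the pre-trained feed-forward weights (`W_ff`, `b_ff`) are copied into the cell-candidate gate, the recurrent weights start at zero, and the gate biases are saturated so the first LSTM step roughly reproduces the feed-forward activation. The gate layout and bias values are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    """One LSTM step with input (i), forget (f), output (o) gates
    and a cell-candidate (g) transform."""
    i = sigmoid(p["W_i"] @ x + p["U_i"] @ h_prev + p["b_i"])
    f = sigmoid(p["W_f"] @ x + p["U_f"] @ h_prev + p["b_f"])
    o = sigmoid(p["W_o"] @ x + p["U_o"] @ h_prev + p["b_o"])
    g = np.tanh(p["W_g"] @ x + p["U_g"] @ h_prev + p["b_g"])
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

def init_from_feedforward(W_ff, b_ff):
    """Hypothetical initialization: the pre-trained feed-forward layer
    computes tanh(W_ff @ x + b_ff); we copy its weights into the
    cell-candidate gate, zero all recurrent weights, and saturate the
    gate biases (input/output open, forget closed) so the first step
    approximately reproduces the feed-forward activation."""
    n_hid, n_in = W_ff.shape
    p = {}
    for gate in ("i", "f", "o", "g"):
        p[f"W_{gate}"] = np.zeros((n_hid, n_in))
        p[f"U_{gate}"] = np.zeros((n_hid, n_hid))
        p[f"b_{gate}"] = np.zeros(n_hid)
    p["W_g"], p["b_g"] = W_ff.copy(), b_ff.copy()
    p["b_i"] += 10.0   # input gate ~ 1: let the candidate through
    p["b_f"] -= 10.0   # forget gate ~ 0: ignore the (zero) initial cell
    p["b_o"] += 10.0   # output gate ~ 1: expose the cell state
    return p

# Pre-trained feed-forward hidden layer (stand-in weights for the demo).
rng = np.random.default_rng(0)
n_in, n_hid = 8, 4
W_ff = rng.standard_normal((n_hid, n_in)) * 0.1
b_ff = np.zeros(n_hid)

params = init_from_feedforward(W_ff, b_ff)
x = rng.standard_normal(n_in)
h, c = lstm_step(x, np.zeros(n_hid), np.zeros(n_hid), params)

# With this initialization, c ~= tanh(W_ff @ x + b_ff), the feed-forward
# activation, and h ~= tanh(c) -- close to it for small activations.
ff = np.tanh(W_ff @ x + b_ff)
```

The same idea extends to GRU and Elman units, which have fewer (or no) gates; the abstract's gate-level ablation asks which of these gate initializations actually matters.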


