A non-projective greedy dependency parser with bidirectional LSTMs

07/11/2017
by David Vilares, et al.

The LyS-FASTPARSE team presents BIST-COVINGTON, a neural implementation of the Covington (2001) algorithm for non-projective dependency parsing. The bidirectional LSTM approach of Kiperwasser and Goldberg (2016) is used to train a greedy parser with a dynamic oracle that mitigates error propagation. The model participated in the CoNLL 2017 UD Shared Task. Despite using no ensemble methods and relying on the baseline segmentation and PoS tagging, the parser obtained good results on both macro-average LAS and UAS in the big treebanks category (55 languages), ranking 7th out of 33 teams. In the all treebanks category it ranked 16th on LAS and 12th on UAS. The gap between the all and big categories is mainly due to poor performance on the four parallel PUD treebanks, suggesting that some 'suffixed' treebanks (e.g. Spanish-AnCora) perform poorly in cross-treebank settings, which does not occur with the corresponding 'unsuffixed' treebanks (e.g. Spanish). Using the 'unsuffixed' models for the PUD treebanks instead yields the 11th best LAS among all runs (official and unofficial). The code is available at https://github.com/CoNLL-UD-2017/LyS-FASTPARSE
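
To illustrate the transition system the parser is built on, the sketch below implements the list-based variant of Covington's (2001) non-projective algorithm. It is a minimal reconstruction under stated assumptions, not the authors' code: the covington_parse function and the score_transition callback are hypothetical names, and the callback stands in for the BiLSTM-based scorer described in the abstract.

# Minimal sketch of the list-based Covington (2001) transition system for
# non-projective dependency parsing. The score_transition callback is a
# hypothetical stand-in: in BIST-COVINGTON the choice among legal transitions
# would be made from BiLSTM features of the words involved.

def covington_parse(words, score_transition):
    """words: sentence tokens; index 0 plays the role of an artificial root."""
    n = len(words)
    lambda1 = [0]                    # processed words to the left of the focus word
    lambda2 = []                     # words already considered for the current focus word
    beta = list(range(1, n + 1))     # unprocessed (focus) words
    heads = {}                       # dependent index -> head index

    def creates_cycle(head, dep):
        # Adding head -> dep closes a cycle iff dep is already an ancestor of head.
        h = head
        while h in heads:
            h = heads[h]
            if h == dep:
                return True
        return False

    while beta:
        j = beta[0]                  # current focus word
        legal = ["SHIFT"]
        if lambda1:
            i = lambda1[-1]
            legal.append("NO-ARC")
            if i != 0 and i not in heads and not creates_cycle(j, i):
                legal.append("LEFT-ARC")     # would add arc j -> i
            if j not in heads and not creates_cycle(i, j):
                legal.append("RIGHT-ARC")    # would add arc i -> j
        action = score_transition(lambda1, lambda2, beta, heads, legal)
        if action == "SHIFT":
            # Move everything back to lambda1 and advance the focus word.
            lambda1.extend(lambda2)
            lambda2.clear()
            lambda1.append(j)
            beta.pop(0)
        else:
            if action == "LEFT-ARC":
                heads[lambda1[-1]] = j
            elif action == "RIGHT-ARC":
                heads[j] = lambda1[-1]
            # NO-ARC, LEFT-ARC and RIGHT-ARC all move the top of lambda1 to lambda2.
            lambda2.insert(0, lambda1.pop())
    return heads

In the Kiperwasser and Goldberg (2016) setup the paper builds on, each legal transition would be scored with a small feed-forward network over BiLSTM vectors of a few focus words; that scoring component is deliberately left abstract here, as is the dynamic oracle used during training.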

Related research:

Universal Dependency Parsing with a General Transition-Based DAG Parser (08/28/2018)
This paper presents our experiments with applying TUPA to the CoNLL 2018...

Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations (03/14/2016)
We present a simple and effective scheme for dependency parsing which is...

Tackling Error Propagation through Reinforcement Learning: A Case of Greedy Dependency Parsing (02/22/2017)
Error propagation is a common problem in NLP. Reinforcement learning exp...

Arc-Standard Spinal Parsing with Stack-LSTMs (09/01/2017)
We present a neural transition-based parser for spinal trees, a dependen...

Yara Parser: A Fast and Accurate Dependency Parser (03/23/2015)
Dependency parsers are among the most crucial tools in natural language ...

Training with Exploration Improves a Greedy Stack-LSTM Parser (03/11/2016)
We adapt the greedy Stack-LSTM dependency parser of Dyer et al. (2015) t...

Masked Part-Of-Speech Model: Does Modeling Long Context Help Unsupervised POS-tagging? (06/30/2022)
Previous Part-Of-Speech (POS) induction models usually assume certain in...
