Transition-Based Dependency Parsing using Perceptron Learner

Syntactic parsing with dependency structures has become a standard technique in natural language processing, supported by many parsing models, in particular data-driven models that can be trained on syntactically annotated corpora. In this paper, we tackle transition-based dependency parsing with a perceptron learner. Our proposed model, which adds richer, more relevant features to the perceptron learner, outperforms a baseline arc-standard parser and surpasses the unlabeled attachment score (UAS) of the MALT and LSTM parsers. We also discuss possible ways to address the parsing of non-projective trees.
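As a rough illustration of the arc-standard transition system the abstract refers to, the sketch below applies a sequence of SHIFT / LEFT-ARC / RIGHT-ARC actions to a sentence and returns the resulting arcs. This is a minimal, assumed formulation for exposition, not the paper's implementation; in the full parser, a perceptron would score the candidate actions at each step instead of receiving a gold action sequence.

```python
def arc_standard(words, actions):
    """Apply arc-standard actions; return (head, dependent) arcs.

    Words are indexed from 1; index 0 is the artificial ROOT.
    """
    stack = [0]                               # starts with ROOT
    buffer = list(range(1, len(words) + 1))   # 1-based word indices
    arcs = []
    for act in actions:
        if act == "SHIFT":                    # move next word onto the stack
            stack.append(buffer.pop(0))
        elif act == "LEFT-ARC":               # top of stack heads second-top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT-ARC":              # second-top heads top of stack
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "He eats fish": "eats" heads "He" and "fish"; ROOT heads "eats".
arcs = arc_standard(
    ["He", "eats", "fish"],
    ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"],
)
```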

