Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs

08/04/2015
by Miguel Ballesteros, et al.

We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long short-term memory (LSTM) recurrent neural networks to learn representations of the parser state, we replace lookup-based word representations with representations constructed from the orthographic forms of the words, also using LSTMs. This allows statistical sharing across word forms that are similar on the surface. Experiments on morphologically rich languages show that the parsing model benefits from incorporating the character-based encodings of words.
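The core idea, composing a word's representation from its characters with forward and backward LSTMs rather than looking it up in an embedding table, can be sketched as follows. This is a minimal illustrative implementation in pure Python, not the authors' code: the dimensions, the single-layer LSTM cell, and the names (`word_vector`, `LSTMCell`) are all assumptions for demonstration.

```python
import math
import random

random.seed(0)

HID = 8    # hidden size (illustrative; real models use larger values)
DIM = 4    # character embedding size (illustrative)
CHARS = "abcdefghijklmnopqrstuvwxyz"

def rand_mat(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

# Character embedding table: characters, not words, are the lookup unit.
emb = {ch: [random.uniform(-0.1, 0.1) for _ in range(DIM)] for ch in CHARS}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """A standard single-layer LSTM cell over Python lists."""
    def __init__(self, in_dim, hid):
        self.hid = hid
        # One weight matrix and bias per gate: input, forget, output, candidate.
        self.W = {g: rand_mat(hid, in_dim + hid) for g in "ifoc"}
        self.b = {g: [0.0] * hid for g in "ifoc"}

    def step(self, x, h, c):
        z = x + h  # concatenate input with previous hidden state
        gates = {}
        for g in "ifoc":
            pre = [sum(w * v for w, v in zip(row, z)) + b
                   for row, b in zip(self.W[g], self.b[g])]
            act = math.tanh if g == "c" else sigmoid
            gates[g] = [act(p) for p in pre]
        c_new = [f * cc + i * cand for f, cc, i, cand
                 in zip(gates["f"], c, gates["i"], gates["c"])]
        h_new = [o * math.tanh(cv) for o, cv in zip(gates["o"], c_new)]
        return h_new, c_new

fwd = LSTMCell(DIM, HID)   # reads the word left to right
bwd = LSTMCell(DIM, HID)   # reads the word right to left

def word_vector(word):
    """Compose a word representation from the final states of both LSTMs."""
    h, c = [0.0] * HID, [0.0] * HID
    for ch in word:
        h, c = fwd.step(emb[ch], h, c)
    h2, c2 = [0.0] * HID, [0.0] * HID
    for ch in reversed(word):
        h2, c2 = bwd.step(emb[ch], h2, c2)
    return h + h2  # concatenation of both directions
```

Because `word_vector` is built from characters, morphologically related forms such as "parse", "parser", and "parsing" share most of their input, which is the source of the statistical sharing across similar surface forms described in the abstract; a table-lookup model would treat them as unrelated entries.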


research
05/30/2017

Character Composition Model with Convolutional Neural Networks for Dependency Parsing on Morphologically Rich Languages

We present a transition-based dependency parser that uses a convolutiona...
research
04/22/2016

Dependency Parsing with LSTMs: An Empirical Evaluation

We propose a transition-based dependency parser using Recurrent Neural N...
research
07/18/2019

What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?

This article is a linguistic investigation of a neural parser. We look a...
research
05/29/2017

On Multilingual Training of Neural Dependency Parsers

We show that a recently proposed neural dependency parser can be improve...
research
05/04/2020

From SPMRL to NMRL: What Did We Learn (and Unlearn) in a Decade of Parsing Morphologically-Rich Languages (MRLs)?

It has been exactly a decade since the first establishment of SPMRL, a r...
research
08/27/2018

Parameter sharing between dependency parsers for related languages

Previous work has suggested that parameter sharing between transition-ba...
research
11/15/2017

A Sequential Neural Encoder with Latent Structured Description for Modeling Sentences

In this paper, we propose a sequential neural encoder with latent struct...
