Easy-First Dependency Parsing with Hierarchical Tree LSTMs

03/01/2016
by Eliyahu Kiperwasser, et al.

We suggest a compositional vector representation of parse trees that relies on a recursive combination of recurrent neural-network encoders. To demonstrate its effectiveness, we use the representation as the backbone of a greedy, bottom-up dependency parser, achieving state-of-the-art accuracies for English and Chinese, without relying on external word embeddings. The parser's implementation is available for download at the first author's webpage.
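To make the two ingredients the abstract names concrete, here is a minimal PyTorch sketch, not the authors' implementation: all names (SubtreeEncoder, EasyFirstParser, the adjacent-pair scorer, the merge layer) are illustrative assumptions. It shows a subtree vector built by running one LSTM over the head plus its left dependents and another over the head plus its right dependents, and a greedy easy-first loop that repeatedly performs the highest-scoring attachment between adjacent partial trees and re-encodes the grown subtree.

```python
import torch
import torch.nn as nn


class SubtreeEncoder(nn.Module):
    """Encode a subtree from its head word vector and the vectors of its
    already-built dependent subtrees, using one LSTM per side."""

    def __init__(self, dim):
        super().__init__()
        self.left = nn.LSTM(dim, dim, batch_first=True)
        self.right = nn.LSTM(dim, dim, batch_first=True)
        self.merge = nn.Linear(2 * dim, dim)

    def forward(self, head, left_deps, right_deps):
        # Left LSTM reads leftmost ... nearest left dependent, then the head;
        # right LSTM reads the head, then nearest ... rightmost dependent.
        l_seq = torch.stack(left_deps + [head]).unsqueeze(0)
        r_seq = torch.stack([head] + right_deps).unsqueeze(0)
        _, (lh, _) = self.left(l_seq)
        _, (rh, _) = self.right(r_seq)
        return torch.tanh(self.merge(torch.cat([lh[-1, 0], rh[-1, 0]])))


class Subtree:
    """A partial tree in the pending list: head word, dependents, vector."""

    def __init__(self, idx, word_vec):
        self.head_idx, self.word_vec = idx, word_vec
        self.left_deps, self.right_deps = [], []
        self.vec = word_vec  # vector of the (initially one-word) subtree


class EasyFirstParser(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.enc = SubtreeEncoder(dim)
        # Scores the two possible attachments between adjacent subtrees.
        self.scorer = nn.Linear(2 * dim, 2)

    def parse(self, word_vecs):
        pending = [Subtree(i, v) for i, v in enumerate(word_vecs)]
        arcs = []  # (head index, dependent index) pairs
        while len(pending) > 1:
            # Greedy "easy-first" step: take the single most confident
            # attachment among all adjacent pairs, in either direction.
            best = None
            for i in range(len(pending) - 1):
                s = self.scorer(torch.cat([pending[i].vec, pending[i + 1].vec]))
                for d in (0, 1):  # 0: right item under left head; 1: reverse
                    if best is None or s[d].item() > best[0]:
                        best = (s[d].item(), i, d)
            _, i, d = best
            if d == 0:
                head, dep = pending[i], pending.pop(i + 1)
                head.right_deps.append(dep.vec)    # farther right each time
            else:
                head, dep = pending[i + 1], pending.pop(i)
                head.left_deps.insert(0, dep.vec)  # farther left each time
            arcs.append((head.head_idx, dep.head_idx))
            # Recursive composition: re-encode the grown subtree, so its
            # vector is built from its dependents' (composed) vectors.
            head.vec = self.enc(head.word_vec, head.left_deps, head.right_deps)
        return arcs


# Toy usage: parse five random 50-dimensional "word vectors".
parser = EasyFirstParser(dim=50)
print(parser.parse(list(torch.randn(5, 50))))
```

The recursion is the key point: a dependent's vector is itself the output of the same encoder, so subtree representations compose hierarchically. The paper's parser is naturally richer than this sketch, with trained word and POS embeddings, a more elaborate scoring function, and a structured training procedure.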


Related research

Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations (03/14/2016)
We present a simple and effective scheme for dependency parsing which is...

A Re-ranking Model for Dependency Parser with Recursive Convolutional Neural Network (05/21/2015)
In this work, we address the problem of modeling all the nodes (words or ph...

Joint RNN-Based Greedy Parsing and Word Composition (12/22/2014)
This paper introduces a greedy parser based on neural networks, which le...

Effective Subtree Encoding for Easy-First Dependency Parsing (11/08/2018)
Easy-first parsing relies on subtree re-ranking to build the complete pa...

Training with Exploration Improves a Greedy Stack-LSTM Parser (03/11/2016)
We adapt the greedy Stack-LSTM dependency parser of Dyer et al. (2015) t...

What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions? (07/18/2019)
This article is a linguistic investigation of a neural parser. We look a...

A Practical Chinese Dependency Parser Based on A Large-scale Dataset (09/02/2020)
Dependency parsing is a longstanding natural language processing task, w...
