Arc-Standard Spinal Parsing with Stack-LSTMs

09/01/2017
by Miguel Ballesteros et al.

We present a neural transition-based parser for spinal trees, a dependency representation of constituent trees. The parser uses Stack-LSTMs that compose constituent nodes with dependency-based derivations. In experiments, we show that this model adapts to different styles of dependency relations, but this choice has little effect on predicting constituent structure, suggesting that LSTMs induce useful states by themselves.
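The abstract centers on an arc-standard transition system whose configurations are scored by Stack-LSTMs. The sketch below is not the authors' implementation: it shows only a plain, unlabeled arc-standard transition loop, with the Stack-LSTM scorer replaced by a placeholder, and the spinal composition of constituent nodes omitted. The names Configuration, legal_actions, apply_action, and score_actions are illustrative assumptions.

# A minimal arc-standard sketch, assuming a generic dependency setting; the
# Stack-LSTM scorer described in the paper is stubbed out by `score_actions`.
from dataclasses import dataclass, field

@dataclass
class Configuration:
    stack: list                                # indices of partially built subtrees
    buffer: list                               # indices of unread input tokens
    arcs: list = field(default_factory=list)   # collected (head, dependent) pairs

def legal_actions(c):
    acts = []
    if c.buffer:
        acts.append("SHIFT")
    if len(c.stack) >= 2:
        acts.extend(["LEFT-ARC", "RIGHT-ARC"])
    return acts

def apply_action(c, action):
    if action == "SHIFT":                      # move the next token onto the stack
        c.stack.append(c.buffer.pop(0))
    elif action == "LEFT-ARC":                 # second-top becomes dependent of top
        dep = c.stack.pop(-2)
        c.arcs.append((c.stack[-1], dep))
    elif action == "RIGHT-ARC":                # top becomes dependent of second-top
        dep = c.stack.pop()
        c.arcs.append((c.stack[-1], dep))

def score_actions(c, acts):
    # Placeholder for the Stack-LSTM scorer: uniform scores, so ties go to the
    # first legal action. A trained model would encode stack, buffer and history.
    return {a: 0.0 for a in acts}

def parse(tokens):
    c = Configuration(stack=[], buffer=list(range(len(tokens))))
    while c.buffer or len(c.stack) > 1:
        acts = legal_actions(c)
        scores = score_actions(c, acts)
        apply_action(c, max(acts, key=scores.get))
    return c.arcs

Called as parse("the cat sat".split()), the loop shifts every token and then, under the uniform placeholder scores, attaches everything to the final token; a trained Stack-LSTM scorer would instead rank the legal actions at each configuration.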


Related research:

- AMR Parsing using Stack-LSTMs (07/24/2017): We present a transition-based AMR parser that directly generates AMR par...
- Transition-based Parsing with Context Enhancement and Future Reward Reranking (12/15/2016): This paper presents a novel reranking model, future reward reranking, to...
- Biaffine Discourse Dependency Parsing (01/12/2022): We provide a study of using the biaffine model for neural discourse depe...
- What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions? (07/18/2019): This article is a linguistic investigation of a neural parser. We look a...
- A non-projective greedy dependency parser with bidirectional LSTMs (07/11/2017): The LyS-FASTPARSE team presents BIST-COVINGTON, a neural implementation ...
- Replicating and Extending "Because Their Treebanks Leak": Graph Isomorphism, Covariants, and Parser Performance (06/01/2021): Søgaard (2020) obtained results suggesting the fraction of trees occurri...
- A Simple LSTM model for Transition-based Dependency Parsing (08/29/2017): We present a simple LSTM-based transition-based dependency parser. Our m...
