Constituency Parsing with a Self-Attentive Encoder

05/02/2018
by Nikita Kitaev and Dan Klein

We demonstrate that replacing an LSTM encoder with a self-attentive architecture can improve a state-of-the-art discriminative constituency parser. The use of attention makes explicit the manner in which information is propagated between different locations in the sentence, which we use both to analyze our model and to propose potential improvements. For example, we find that separating positional and content information in the encoder can lead to improved parsing accuracy. Additionally, we evaluate different approaches for lexical representation. Our parser achieves new state-of-the-art results for single models trained on the Penn Treebank: 93.55 F1 without the use of any external data, and 95.13 F1 when using pre-trained word representations. Our parser also outperforms the previously best published accuracies on 8 of the 9 languages in the SPMRL dataset.
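The "separating positional and content information" idea can be sketched as a self-attention head whose score matrix is the sum of a content-only term and a position-only term, so the two streams never mix inside a single query/key comparison. The sketch below is a minimal NumPy illustration under assumed toy dimensions and randomly initialized projection matrices; it is not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def factored_attention(content, position, params):
    """One self-attention head with separated content/position streams.

    Scores are the sum of a content-to-content term and a
    position-to-position term, each with its own query/key projections,
    so positional cues never leak into content comparisons.
    """
    Wq_c, Wk_c, Wq_p, Wk_p, Wv = params
    d = content.shape[-1]
    scores = ((content @ Wq_c) @ (content @ Wk_c).T
              + (position @ Wq_p) @ (position @ Wk_p).T) / np.sqrt(d)
    weights = softmax(scores)            # (T, T): one distribution per word
    return weights @ (content @ Wv), weights

rng = np.random.default_rng(0)
T, d = 5, 16                             # toy sentence length / model width
params = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(5)]
out, w = factored_attention(rng.standard_normal((T, d)),
                            rng.standard_normal((T, d)), params)
print(out.shape)                         # (5, 16)
print(np.allclose(w.sum(axis=1), 1.0))   # each row sums to 1
```

In a full encoder this head would be multi-headed and stacked with feed-forward layers; the only point of the sketch is that the additive score decomposition keeps the two information channels explicit.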

