Global Neural CCG Parsing with Optimality Guarantees

07/05/2016 ∙ by Kenton Lee, et al.

We introduce the first global recursive neural parsing model with optimality guarantees during decoding. To support global features, we give up dynamic programs and instead search directly in the space of all possible subtrees. Although this space is exponentially large in the sentence length, we show it is possible to learn an efficient A* parser. We augment existing parsing models, which have informative bounds on the outside score, with a global model that has loose bounds but only needs to model non-local phenomena. The global model is trained with a new objective that encourages the parser to explore a tiny fraction of the search space. The approach is applied to CCG parsing, improving state-of-the-art accuracy by 0.4 F1. The parser finds the optimal parse for 99.9% of held-out sentences.
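The core mechanism the abstract describes is agenda-based A* parsing: subtrees are popped from a priority queue ordered by their inside score plus an admissible upper bound on the outside score, so the first full-sentence parse popped is provably optimal. The sketch below illustrates that ordering on a toy problem; the grammar, scores, and function names are invented for illustration, and unlike the paper's global model (which scores whole subtrees with a recursive neural network), this toy keeps only the best item per span.

```python
import heapq

def astar_parse(n, unary_scores, combine_score, outside_bound):
    """Toy agenda-based A* parser over tokens 0..n-1 (hypothetical API).

    Items are binary subtrees over spans (i, j) with an inside score s.
    The agenda pops items in order of s + outside_bound(i, j); if the
    bound is an admissible upper bound on the true outside score, the
    first full-sentence item popped is the optimal parse.
    """
    agenda = []  # max-heap via negated priorities
    chart = {}   # best finished item per span (toy simplification)

    # Initialize the agenda with single-token subtrees.
    for i in range(n):
        s = unary_scores[i]
        heapq.heappush(agenda, (-(s + outside_bound(i, i + 1)), s, (i, i + 1), (i,)))

    while agenda:
        _, s, (i, j), tree = heapq.heappop(agenda)
        if (i, j) in chart:
            continue           # a better item for this span already finished
        chart[(i, j)] = (s, tree)
        if (i, j) == (0, n):
            return s, tree     # optimality guarantee from the admissible bound

        # Combine the new item with every finished adjacent item.
        for (k, l), (s2, t2) in list(chart.items()):
            if l == i:         # finished left neighbor
                ns, span, nt = s2 + s + combine_score(t2, tree), (k, j), (t2, tree)
            elif k == j:       # finished right neighbor
                ns, span, nt = s + s2 + combine_score(tree, t2), (i, l), (tree, t2)
            else:
                continue
            if span not in chart:
                heapq.heappush(agenda, (-(ns + outside_bound(*span)), ns, span, nt))
    return None

# Trivial bound h(i, j) = 0 is admissible when all scores are <= 0 or,
# as here, when we only care about correctness of the best total score.
score, tree = astar_parse(3, [1.0, 2.0, 0.5],
                          combine_score=lambda a, b: 0.0,
                          outside_bound=lambda i, j: 0.0)
```

The paper's contribution is making the bound for the global model loose but cheap while the base supertagging model supplies tight bounds; the interplay of the two is what keeps the explored fraction of the exponential search space tiny.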

Code Repositories

neuralccg

Codebase for Global Neural CCG Parsing with Optimality Guarantees

