Two Local Models for Neural Constituent Parsing

08/14/2018
by Zhiyang Teng, et al.

Non-local features have been exploited by syntactic parsers to capture dependencies between output sub-structures. Such features have been a key to the success of state-of-the-art statistical parsers. With the rise of deep learning, however, it has been shown that local output decisions can give highly competitive accuracies, thanks to the power of dense neural input representations that embody global syntactic information. We investigate two conceptually simple local neural models for constituent parsing, which make local decisions on constituent spans and CFG rules, respectively. Consistent with previous findings along this line, our best model gives highly competitive results, achieving labeled bracketing F1 scores of 92.4% on PTB and 87.3% on CTB 5.1.
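As a rough illustration of the span-based local model (a minimal sketch, not the authors' implementation), each span of the sentence can be scored independently by an MLP over pooled encoder states, with a reserved null label for non-constituents. PyTorch, the class name LocalSpanModel, the pooling choice, and all dimensions below are assumptions made for the example.

```python
import torch
import torch.nn as nn

class LocalSpanModel(nn.Module):
    """Hypothetical local span classifier: every span (i, j) is labeled
    independently, with label 0 meaning "not a constituent"."""

    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=250):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM encoder producing a contextual state per token
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # MLP that maps a span representation to label scores
        self.scorer = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_labels),
        )

    def forward(self, tokens):
        # tokens: (1, n) tensor of word ids for one sentence
        h, _ = self.encoder(self.embed(tokens))   # (1, n, 2 * hidden_dim)
        n = h.size(1)
        span_scores = {}
        for i in range(n):
            for j in range(i + 1, n + 1):
                # mean-pooled span feature; a simplification of the
                # richer span representations used in span-based parsers
                feat = h[0, i:j].mean(dim=0)
                span_scores[(i, j)] = self.scorer(feat)  # (num_labels,)
        return span_scores

# Example: score all spans of a 7-word sentence (random ids for illustration)
model = LocalSpanModel(vocab_size=10000, num_labels=28)
scores = model(torch.randint(0, 10000, (1, 7)))
```

Under this reading, training would minimize per-span cross-entropy against gold labels, and a tree could be assembled at test time by a CKY-style search over the locally scored spans; the rule-based model would analogously make local decisions over CFG productions rather than span labels.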

Related research

07/17/2017
In-Order Transition-based Constituent Parsing
Both bottom-up and top-down strategies have been used for neural transit...

12/16/2022
Fast Rule-Based Decoding: Revisiting Syntactic Rules in Neural Constituency Parsing
Most recent studies on neural constituency parsing focus on encoder stru...

12/20/2016
Span-Based Constituency Parsing with a Structure-Label System and Provably Optimal Dynamic Oracles
Parsing accuracy using efficient greedy transition systems has improved ...

10/20/2021
Discontinuous Grammar as a Foreign Language
In order to achieve deep natural language understanding, syntactic const...

11/18/2019
Deep and Dense Sarcasm Detection
Recent work in automated sarcasm detection has placed a heavy focus on c...

05/26/2019
SemBleu: A Robust Metric for AMR Parsing Evaluation
Evaluating AMR parsing accuracy involves comparing pairs of AMR graphs. ...
