On Parsing as Tagging

11/14/2022
by Afra Amini et al.

There have been many proposals in the literature to reduce constituency parsing to tagging. To better understand what these approaches have in common, we cast several existing proposals into a unifying pipeline consisting of three steps: linearization, learning, and decoding. In particular, we show how to reduce tetratagging, a state-of-the-art constituency tagger, to shift–reduce parsing by performing a right-corner transformation on the grammar and making a specific independence assumption. Furthermore, we empirically evaluate our taxonomy of tagging pipelines with different choices of linearizers, learners, and decoders. Based on results for English and a set of 8 typologically diverse languages, we conclude that the linearization of the derivation tree and its alignment with the input sequence is the most critical factor in achieving accurate taggers.
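As a concrete illustration of the linearization step, here is a minimal Python sketch of a tetratag-style linearizer, assuming a binarized tree encoded as nested ("label", left, right) tuples with word strings at the leaves. The function name, tree representation, and tag symbols are illustrative choices for this sketch, not the paper's implementation.

    # Tetratag-style linearization (sketch): an in-order traversal that
    # emits one of four tags per node, recording whether each leaf ('l'/'r')
    # and each internal node ('L'/'R') is a left or right child of its parent.
    def tetratag(tree, is_left_child=True, tags=None):
        if tags is None:
            tags = []
        if isinstance(tree, str):                       # leaf = a word
            tags.append("l" if is_left_child else "r")
        else:
            _label, left, right = tree
            tetratag(left, True, tags)                  # left subtree first
            tags.append("L" if is_left_child else "R")  # internal node, in order
            tetratag(right, False, tags)
        return tags

    # Binarized tree for "the cat sleeps": (S (NP the cat) sleeps)
    tree = ("S", ("NP", "the", "cat"), "sleeps")
    print(tetratag(tree))  # ['l', 'L', 'r', 'L', 'r']

For n words this yields 2n-1 tags (one per word plus one per internal node), so the linearization aligns naturally with the input sequence: a tagger can predict the tags word by word, and the decoding step inverts the traversal to recover the tree.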
