Is POS Tagging Necessary or Even Helpful for Neural Dependency Parsing?

03/06/2020
by Yu Zhang, et al.

In the pre-deep-learning era, part-of-speech (POS) tags were considered indispensable ingredients for feature engineering in dependency parsing, owing to their important role in alleviating the data sparseness of purely lexical features, and quite a few works focused on joint tagging and parsing models to avoid error propagation. In contrast, recent studies suggest that POS tagging has become much less important, or even useless, for neural parsing, especially when character-based word representations such as CharLSTM are used. Yet a full and systematic investigation of this interesting issue, both empirical and linguistic, is still lacking. To answer this question, we design four typical multi-task learning frameworks (i.e., Share-Loose, Share-Tight, Stack-Discrete, Stack-Hidden) for joint tagging and parsing based on the state-of-the-art biaffine parser. Considering that annotating POS tags is much cheaper than annotating parse trees, we also investigate the utilization of large-scale heterogeneous POS-tag data. We conduct experiments on both English and Chinese datasets, and the results clearly show that POS tagging (both homogeneous and heterogeneous) can still significantly improve parsing performance when the Stack-Hidden joint framework is used. We conduct detailed analysis and gain more insights from the linguistic aspect.
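The biaffine parser the abstract refers to scores every (dependent, head) word pair with a bilinear form plus linear bias terms, then picks the highest-scoring head for each word. The sketch below is a minimal illustrative version in NumPy, not the paper's implementation: the function name, dimensions, and random inputs are all assumptions, and a real parser would compute `h_dep`/`h_head` from a neural encoder (in the Stack-Hidden setting, one that also consumes the tagger's hidden states) rather than from random vectors.

```python
import numpy as np

def biaffine_scores(h_dep, h_head, W, b_dep, b_head):
    """Illustrative biaffine arc scoring.

    score[i, j] = h_dep[i] @ W @ h_head[j] + h_dep[i] @ b_dep + h_head[j] @ b_head
    i.e., how plausible it is that word j is the head of word i.
    """
    bilinear = h_dep @ W @ h_head.T          # (n, n) pairwise bilinear term
    dep_bias = h_dep @ b_dep[:, None]        # (n, 1), broadcast over columns
    head_bias = (h_head @ b_head)[None, :]   # (1, n), broadcast over rows
    return bilinear + dep_bias + head_bias

# Toy example: 5 words, hidden size 8 (random stand-ins for encoder outputs).
rng = np.random.default_rng(0)
n, d = 5, 8
h_dep = rng.normal(size=(n, d))
h_head = rng.normal(size=(n, d))
scores = biaffine_scores(h_dep, h_head,
                         rng.normal(size=(d, d)),
                         rng.normal(size=d),
                         rng.normal(size=d))
heads = scores.argmax(axis=1)  # greedy head choice for each word
```

In the full parser the greedy argmax is replaced by a tree-constrained decoder (e.g., MST decoding), and a second biaffine classifier scores the dependency labels.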


Related research

12/30/2018
A neural joint model for Vietnamese word segmentation, POS tagging and dependency parsing
We propose the first joint model for Vietnamese word segmentation, part-...

05/16/2017
A Novel Neural Network Model for Joint POS Tagging and Graph-based Dependency Parsing
We present a novel neural network model that learns POS tagging and grap...

04/25/2017
Joint POS Tagging and Dependency Parsing with Transition-based Neural Networks
While part-of-speech (POS) tagging and dependency parsing are observed t...

02/28/2019
Better, Faster, Stronger Sequence Tagging Constituent Parsers
Sequence tagging models for constituent parsing are faster, but less acc...

03/21/2016
Stack-propagation: Improved Representation Learning for Syntax
Traditional syntax models typically leverage part-of-speech (POS) inform...

06/08/2023
Hexatagging: Projective Dependency Parsing as Tagging
We introduce a novel dependency parser, the hexatagger, that constructs ...

08/27/2018
An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing
We provide a comprehensive analysis of the interactions between pre-trai...
