Read, Tag, and Parse All at Once, or Fully-neural Dependency Parsing

09/12/2016
by Jan Chorowski et al.

We present a dependency parser implemented as a single deep neural network that reads orthographic representations of words and directly generates dependencies and their labels. Unlike typical approaches to parsing, the model does not require part-of-speech (POS) tagging of the sentences. With proper regularization and additional supervision achieved through multitask learning, we reach state-of-the-art performance on Slavic languages from the Universal Dependencies treebank: with no linguistic features other than characters, our parser is as accurate as a transition-based system trained on perfect POS tags.
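
To make the architecture the abstract describes more concrete, here is a minimal PyTorch sketch: a character-level BiLSTM builds word vectors (so no POS or lexical features are needed), a sentence-level BiLSTM adds context, a bilinear scorer picks heads, a classifier assigns dependency labels, and an auxiliary POS head supplies the extra multitask supervision. All module names, dimensions, and design details below are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a fully-neural, character-based dependency parser.
# Component names (CharWordEncoder, FullyNeuralParser) and hyperparameters
# are invented for illustration; they are not taken from the paper.
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    """Builds a word vector from its characters, so the parser needs
    no POS tags or lexical features (assumed component)."""
    def __init__(self, n_chars, char_dim=32, word_dim=128):
        super().__init__()
        self.embed = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.rnn = nn.LSTM(char_dim, word_dim // 2,
                           bidirectional=True, batch_first=True)

    def forward(self, chars):                      # chars: (n_words, max_word_len)
        _, (h, _) = self.rnn(self.embed(chars))    # h: (2, n_words, word_dim // 2)
        return torch.cat([h[0], h[1]], dim=-1)     # (n_words, word_dim)

class FullyNeuralParser(nn.Module):
    def __init__(self, n_chars, n_labels, n_pos, word_dim=128, hid=256):
        super().__init__()
        self.words = CharWordEncoder(n_chars, word_dim=word_dim)
        self.context = nn.LSTM(word_dim, hid // 2,
                               bidirectional=True, batch_first=True)
        self.arc_dep = nn.Linear(hid, hid)         # dependent-side projection
        self.arc_head = nn.Linear(hid, hid)        # head-side projection
        self.label = nn.Linear(2 * hid, n_labels)  # dependency label classifier
        self.pos = nn.Linear(hid, n_pos)           # auxiliary POS head (multitask)

    def forward(self, chars):
        w = self.words(chars).unsqueeze(0)         # (1, n_words, word_dim)
        h, _ = self.context(w)
        h = h.squeeze(0)                           # (n_words, hid)
        # arc_scores[i, j] scores word j as the head of word i
        arc_scores = self.arc_dep(h) @ self.arc_head(h).t()
        heads = arc_scores.argmax(dim=-1)          # greedy; an MST would be used at decode time
        label_scores = self.label(torch.cat([h, h[heads]], dim=-1))
        return arc_scores, label_scores, self.pos(h)

parser = FullyNeuralParser(n_chars=100, n_labels=40, n_pos=17)
chars = torch.randint(1, 100, (6, 10))             # a 6-word sentence, <=10 chars per word
arc_scores, label_scores, pos_scores = parser(chars)
```

During training, cross-entropy losses over arc_scores (against gold heads), label_scores, and pos_scores would be summed; making the network predict POS tags as a side task, rather than consuming them as input, is one common way to realize the multitask supervision the abstract mentions.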

Related research

07/11/2018 · An improved neural network model for joint POS tagging and dependency parsing
We propose a novel neural network model for joint part-of-speech (POS) t...

05/29/2017 · On Multilingual Training of Neural Dependency Parsers
We show that a recently proposed neural dependency parser can be improve...

03/21/2016 · Stack-propagation: Improved Representation Learning for Syntax
Traditional syntax models typically leverage part-of-speech (POS) inform...

11/26/2016 · Fill it up: Exploiting partial dependency annotations in a minimum spanning tree parser
Unsupervised models of dependency parsing typically require large amount...

07/16/2021 · POS tagging, lemmatization and dependency parsing of West Frisian
We present a lemmatizer/POS-tagger/dependency parser for West Frisian us...

06/01/2020 · Distilling Neural Networks for Greener and Faster Dependency Parsing
The carbon footprint of natural language processing research has been in...

03/18/2015 · Learning to Search for Dependencies
We demonstrate that a dependency parser can be built using a credit assi...
