A Modest Pareto Optimisation Analysis of Dependency Parsers in 2021

06/08/2021
by Mark Anderson, et al.

We evaluate three leading dependency parsing systems from different paradigms on a small yet diverse subset of languages in terms of their accuracy-efficiency Pareto front. As we are interested in efficiency, we evaluate the core parsers without pretrained language models (which are typically huge networks that would dominate compute time) or other augmentations that can be applied transversally to any of them. Biaffine parsing emerges as a well-balanced default choice, with sequence-labelling parsing preferable when inference speed (but not training energy cost) is the priority.
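To make the notion of an accuracy-efficiency Pareto front concrete, the sketch below filters a set of parser measurements down to the non-dominated ones. The parser names echo the paradigms mentioned in the abstract, but all accuracy and speed numbers are invented for illustration and are not results from the paper.

```python
# Hypothetical illustration of a Pareto front over (accuracy, speed).
# A parser is Pareto-optimal if no other parser is at least as good on
# both axes and strictly better on at least one.

def pareto_front(points):
    """Return the (name, accuracy, speed) tuples not dominated by any other."""
    front = []
    for name, acc, speed in points:
        dominated = any(
            a >= acc and s >= speed and (a > acc or s > speed)
            for n, a, s in points
            if n != name
        )
        if not dominated:
            front.append((name, acc, speed))
    return front

# Made-up numbers: accuracy (LAS, %) and speed (sentences/second).
parsers = [
    ("biaffine", 92.0, 300),
    ("transition-based", 91.5, 250),
    ("sequence-labelling", 91.0, 500),
]

print(pareto_front(parsers))
```

Under these toy numbers, the transition-based parser is dominated by the biaffine one (lower accuracy and lower speed), so the front contains only the biaffine and sequence-labelling parsers, mirroring the trade-off the abstract describes.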



