Parameter sharing between dependency parsers for related languages

08/27/2018
by   Miryam de Lhoneux, et al.

Previous work has suggested that parameter sharing between transition-based neural dependency parsers for related languages can lead to better performance, but there is no consensus on which parameters to share. We present an evaluation of 27 different parameter sharing strategies across 10 languages, representing five pairs of related languages, each pair from a different language family. We find that sharing transition classifier parameters always helps, whereas the usefulness of sharing word and/or character LSTM parameters varies. Based on this result, we propose an architecture where the transition classifier is shared, and the sharing of word and character parameters is controlled by a parameter that can be tuned on validation data. This model is linguistically motivated and obtains significant improvements over a monolingually trained baseline. We also find that sharing transition classifier parameters helps when training a parser on unrelated language pairs, but that, for unrelated languages, sharing too many parameters does not help.
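To make the proposed sharing scheme concrete, below is a minimal PyTorch sketch, not the authors' implementation. It shows a parser pair in which the transition classifier is always shared, while word embeddings are soft-shared between a shared table and a language-specific table through an interpolation weight. The class name SharedPairParser, the weight lam, and all layer sizes are illustrative assumptions; the character LSTM and the transition system's configuration features are omitted for brevity.

```python
# Minimal sketch (not the authors' code) of selective parameter sharing
# between parsers for a pair of languages.
import torch
import torch.nn as nn

class SharedPairParser(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hid_dim=125,
                 n_transitions=4, lam=0.5):
        super().__init__()
        # lam in [0, 1]: 0 = fully language-specific word parameters,
        # 1 = fully shared; tuned on validation data.
        self.lam = lam
        # Shared and language-specific word embedding tables.
        self.shared_emb = nn.Embedding(vocab_size, emb_dim)
        self.lang_emb = nn.ModuleList(
            [nn.Embedding(vocab_size, emb_dim) for _ in range(2)])
        # BiLSTM sentence encoder (shared here for brevity).
        self.word_lstm = nn.LSTM(emb_dim, hid_dim, bidirectional=True,
                                 batch_first=True)
        # Transition classifier (MLP), always shared: the abstract reports
        # that sharing it consistently helps.
        self.classifier = nn.Sequential(
            nn.Linear(2 * hid_dim, hid_dim), nn.Tanh(),
            nn.Linear(hid_dim, n_transitions))

    def forward(self, word_ids, lang):
        # Soft-share embeddings: convex combination of the shared and
        # the language-specific table.
        emb = (self.lam * self.shared_emb(word_ids)
               + (1 - self.lam) * self.lang_emb[lang](word_ids))
        states, _ = self.word_lstm(emb)
        # Score transitions from each token's BiLSTM state. A real
        # transition-based parser would instead build features from the
        # parser configuration (stack and buffer); this is simplified.
        return self.classifier(states)

# Example: score transitions for a batch of 2 sentences from language 0.
model = SharedPairParser(vocab_size=5000)
ids = torch.randint(0, 5000, (2, 8))
print(model(ids, lang=0).shape)  # torch.Size([2, 8, 4])
```

In this sketch, lam plays the role of the tunable sharing parameter described in the abstract: it would be selected on validation data separately for each language pair.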


Related research

05/30/2017
Character Composition Model with Convolutional Neural Networks for Dependency Parsing on Morphologically Rich Languages
We present a transition-based dependency parser that uses a convolutiona...

09/01/2018
Parameter Sharing Methods for Multilingual Self-Attentional Translation Models
In multilingual neural machine translation, it has been shown that shari...

08/04/2015
Improved Transition-Based Parsing by Modeling Characters instead of Words with LSTMs
We present extensions to a continuous-state dependency parsing method th...

08/27/2018
An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing
We provide a comprehensive analysis of the interactions between pre-trai...

06/15/2023
Understanding Parameter Sharing in Transformers
Parameter sharing has proven to be a parameter-efficient approach. Previ...

10/12/2020
HUJI-KU at MRP 2020: Two Transition-based Neural Parsers
This paper describes the HUJI-KU system submission to the shared task on...

07/18/2019
What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions?
This article is a linguistic investigation of a neural parser. We look a...
