Do We Need Word Order Information for Cross-lingual Sequence Labeling?

01/30/2020
by Zihan Liu, et al.

Most recent work on cross-lingual adaptation does not consider word order differences across languages. We hypothesize that cross-lingual models that fit the source-language word order may fail to handle target languages whose word order differs. To test this conjecture, we build an order-agnostic model for cross-lingual sequence labeling tasks. Our model does not encode the word order of the input sequence; instead, the prediction for each token is based on attention over the whole sequence. Experimental results on dialogue natural language understanding, part-of-speech tagging, and named entity recognition show that discarding word order information achieves better zero-shot cross-lingual performance than the baseline models.
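
To make the idea concrete, here is a minimal PyTorch sketch of one way such an order-agnostic tagger could look (not the authors' exact architecture): a Transformer encoder with no positional encoding, so each token's label is predicted from attention over the whole, unordered sequence. The class name, the plain embedding lookup (standing in for cross-lingual embeddings), and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn


class OrderAgnosticTagger(nn.Module):
    """Sequence labeler whose per-token predictions come from attention over
    the whole sequence, with no positional encoding added to the embeddings."""

    def __init__(self, vocab_size, num_labels, d_model=256, num_heads=4, num_layers=2):
        super().__init__()
        # Plain embedding here; a cross-lingual embedding space is assumed in practice.
        self.embed = nn.Embedding(vocab_size, d_model)
        # No positional encoding: the encoder sees an unordered set of token embeddings.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_labels)

    def forward(self, token_ids, padding_mask=None):
        # token_ids: (batch, seq_len); padding_mask: True at padded positions.
        x = self.embed(token_ids)                                  # no position information added
        h = self.encoder(x, src_key_padding_mask=padding_mask)    # attention over the whole sequence
        return self.classifier(h)                                  # (batch, seq_len, num_labels)


# Usage: per-token label logits for a toy batch.
model = OrderAgnosticTagger(vocab_size=1000, num_labels=9)
tokens = torch.randint(0, 1000, (2, 7))
logits = model(tokens)  # shape: (2, 7, 9)
```

Because no positional signal is injected, the encoder is permutation-equivariant: shuffling the source-language word order does not change any token's prediction, which is the property the paper tests for cross-lingual transfer.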


Related research

11/11/2019
Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables
Despite the surging demands for multilingual task-oriented dialog system...

04/28/2020
Self-Attention with Cross-Lingual Position Representation
Position encoding (PE), an essential part of self-attention networks (SA...

10/23/2020
Unsupervised Cross-lingual Adaptation for Sequence Tagging and Beyond
Cross-lingual adaptation with multilingual pre-trained language models (...

02/22/2023
Impact of Subword Pooling Strategy on Cross-lingual Event Detection
Pre-trained multilingual language models (e.g., mBERT, XLM-RoBERTa) have...

03/20/2016
Multi-Task Cross-Lingual Sequence Tagging from Scratch
We present a deep hierarchical recurrent neural network for sequence tag...

01/09/2016
Empirical Gaussian priors for cross-lingual transfer learning
Sequence model learning algorithms typically maximize log-likelihood min...

08/18/2021
Contributions of Transformer Attention Heads in Multi- and Cross-lingual Tasks
This paper studies the relative importance of attention heads in Transfo...
