Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

10/19/2020
by Elliot Schumacher et al.

Cross-language entity linking grounds mentions expressed in multiple languages to a single-language knowledge base. We propose a neural ranking architecture for this task that scores candidate entities using multilingual BERT representations of the mention and its context. We find that BERT's multilingual capability yields robust performance in both monolingual and multilingual settings. We further explore zero-shot language transfer, where performance remains surprisingly robust. Investigating the zero-shot degradation that does occur, we find that it can be partially mitigated by a proposed auxiliary training objective, and that the remaining error is best attributed to domain shift rather than language transfer.
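To make the described architecture concrete, below is a minimal sketch of how an mBERT-based mention-entity ranker might be assembled. This is not the authors' implementation: the class name MentionEntityRanker, the feed-forward scorer over concatenated vectors, and the use of the [CLS] token as a pooled representation are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class MentionEntityRanker(nn.Module):
    """Scores (mention-in-context, candidate entity) pairs with mBERT encodings."""

    def __init__(self, model_name="bert-base-multilingual-cased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Assumed scorer: a small feed-forward network over the concatenated
        # mention and entity vectors (illustrative, not the paper's exact design).
        self.scorer = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def encode(self, texts, tokenizer):
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        # Use the [CLS] vector as a pooled representation (an assumption,
        # not necessarily the paper's pooling strategy).
        return self.encoder(**batch).last_hidden_state[:, 0]

    def forward(self, mention_vec, entity_vecs):
        # mention_vec: (hidden,); entity_vecs: (num_candidates, hidden)
        pairs = torch.cat([mention_vec.expand_as(entity_vecs), entity_vecs], dim=-1)
        return self.scorer(pairs).squeeze(-1)  # one score per candidate


tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = MentionEntityRanker()
with torch.no_grad():
    mention = model.encode(["[Paris] is the capital of France."], tokenizer)[0]
    candidates = model.encode(
        ["Paris, capital city of France", "Paris, prince of Troy"], tokenizer
    )
    scores = model(mention, candidates)
print("best candidate:", scores.argmax().item())
```

At inference time, the mention encoding is scored against a description of each candidate entity and the highest-scoring candidate is selected; a ranker of this shape would typically be trained with a margin or cross-entropy loss over the candidate set, optionally alongside an auxiliary objective such as the one the paper proposes.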


