mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models

10/15/2021
by Ryokan Ri, et al.

Recent studies have shown that multilingual pretrained language models can be effectively improved with cross-lingual alignment information from Wikipedia entities. However, existing methods only exploit entity information in pretraining and do not explicitly use entities in downstream tasks. In this study, we explore the effectiveness of leveraging entity representations for downstream cross-lingual tasks. We train a multilingual language model in 24 languages with entity representations and show that it consistently outperforms word-based pretrained models in various cross-lingual transfer tasks. Our analysis further shows that incorporating entity representations into the input allows the model to extract more language-agnostic features. We also evaluate the model on a multilingual cloze prompt task with the mLAMA dataset, and show that entity-based prompts are more likely to elicit correct factual knowledge than prompts that use only word representations.
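To make the input format concrete, the sketch below shows how entity mentions can be supplied alongside the word sequence. It assumes the publicly released mLUKE checkpoint studio-ousia/mluke-base and the MLukeTokenizer/LukeModel classes from the Hugging Face transformers library; the example sentence and character spans are illustrative and not taken from the paper.

# Minimal sketch: feeding entity spans to mLUKE alongside the word sequence.
# Assumes the studio-ousia/mluke-base checkpoint and the transformers library;
# the sentence and spans below are illustrative, not from the paper.
from transformers import MLukeTokenizer, LukeModel

tokenizer = MLukeTokenizer.from_pretrained("studio-ousia/mluke-base")
model = LukeModel.from_pretrained("studio-ousia/mluke-base")

text = "Tokyo is the capital of Japan."
# Character-level spans marking the entity mentions "Tokyo" and "Japan".
entity_spans = [(0, 5), (24, 29)]

inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

# Contextualized representations of the word tokens ...
word_states = outputs.last_hidden_state
# ... and of the two marked entities.
entity_states = outputs.entity_last_hidden_state
print(word_states.shape, entity_states.shape)

The returned entity_last_hidden_state holds one vector per marked entity; these are the kind of entity representations the study argues are more language-agnostic and leverages for downstream cross-lingual tasks.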
