Priberam Labs at the NTCIR-15 SHINRA2020-ML: Classification Task

05/12/2021
by Ruben Cardoso, et al.

Wikipedia is an online encyclopedia available in 285 languages. It constitutes a highly relevant Knowledge Base (KB) that automatic systems could leverage for many purposes. However, the structure and organisation of this information do not lend themselves to automatic parsing and understanding, so it is necessary to structure this knowledge. The goal of the current SHINRA2020-ML task is to leverage Wikipedia pages in order to categorise their corresponding entities across 268 hierarchical categories belonging to the Extended Named Entity (ENE) ontology. In this work, we propose three distinct models based on the contextualised embeddings yielded by Multilingual BERT. We explore the performance of a linear layer, with and without explicit use of the ontology's hierarchy, and of a Gated Recurrent Units (GRU) layer. We also test several pooling strategies for leveraging BERT's embeddings, as well as label-selection criteria based on the labels' scores. We achieve good performance across a wide variety of languages, including languages not seen during fine-tuning (zero-shot languages).
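The pipeline the abstract describes (pool contextualised token embeddings, score all 268 ENE categories with a linear head, then select labels from the scores) can be sketched as below. This is a minimal illustration, not the authors' implementation: the mean-pooling variant, the sigmoid scoring, and the fixed 0.5 threshold are assumptions, and the random arrays stand in for Multilingual BERT outputs.

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average contextual token embeddings over non-padding positions.

    token_embeddings: (batch, seq_len, hidden), e.g. Multilingual BERT outputs.
    attention_mask:   (batch, seq_len) with 1 for real tokens, 0 for padding.
    """
    mask = attention_mask[..., None].astype(float)
    return (token_embeddings * mask).sum(axis=1) / np.clip(mask.sum(axis=1), 1e-9, None)

def classify(pooled, W, b, threshold=0.5):
    """Linear head over pooled embeddings with score-based label selection.

    Returns per-category sigmoid scores and, for each example, the indices of
    the ENE categories whose score passes the (assumed) threshold.
    """
    scores = 1.0 / (1.0 + np.exp(-(pooled @ W + b)))
    selected = [np.nonzero(row >= threshold)[0].tolist() for row in scores]
    return scores, selected

# Toy stand-in for BERT embeddings: batch of 2 pages, 6 tokens, hidden size 768.
rng = np.random.default_rng(0)
emb = rng.normal(size=(2, 6, 768))
mask = np.ones((2, 6), dtype=int)
mask[1, 4:] = 0  # second example has two padding tokens

pooled = mean_pool(emb, mask)
W = rng.normal(scale=0.02, size=(768, 268))  # 268 ENE categories
b = np.zeros(268)
scores, labels = classify(pooled, W, b)
```

Swapping `mean_pool` for the `[CLS]` vector, or the linear head for a GRU over the token sequence, changes only the pooling/head step; the score-then-select structure stays the same.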


