Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources

11/14/2019
by   Qianhui Wu, et al.

For languages with no annotated resources, transferring knowledge from resource-rich languages is an effective solution for named entity recognition (NER). While existing methods directly apply the model learned on the source language to a target language, in this paper we propose to fine-tune the learned model with a few similar examples given a test case, which benefits prediction by leveraging the structural and semantic information conveyed in such similar examples. To this end, we present a meta-learning algorithm that finds a good model parameter initialization capable of fast adaptation to a given test case, and we propose to construct multiple pseudo-NER tasks for meta-training by computing sentence similarities. To further improve the model's generalization ability across different languages, we introduce a masking scheme and augment the loss function with an additional maximum term during meta-training. We conduct extensive experiments on cross-lingual NER with minimal resources over five target languages. The results show that our approach significantly outperforms existing state-of-the-art methods across the board.
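
The sketch below illustrates the general idea of the meta-training loop described in the abstract: pseudo-tasks are built by pairing each source-language sentence with its most similar sentences, a copy of the model is adapted on the similar (support) sentences, and the loss on the original (query) sentence drives the update of the shared initialization. This is a minimal first-order MAML-style sketch, not the authors' implementation; the helper names (TaggerHead, build_pseudo_tasks, meta_train_step), the use of cosine similarity over mean-pooled features, and the first-order gradient approximation are all illustrative assumptions.

# First-order MAML-style sketch of meta-training with pseudo-NER tasks.
# All names and design choices here are assumptions for illustration only.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaggerHead(nn.Module):
    """Toy token-level tagger: a linear layer over precomputed token features."""
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, feats):                  # feats: (batch, seq_len, hidden_dim)
        return self.classifier(feats)          # logits: (batch, seq_len, num_labels)

def build_pseudo_tasks(sent_reprs, k=2):
    """Pair each sentence (query) with its k most similar sentences (support),
    measured by cosine similarity of sentence representations."""
    sims = F.cosine_similarity(sent_reprs.unsqueeze(1), sent_reprs.unsqueeze(0), dim=-1)
    sims.fill_diagonal_(-1.0)                  # exclude the sentence itself
    topk = sims.topk(k, dim=-1).indices        # (num_sents, k)
    return [(topk[i], i) for i in range(sent_reprs.size(0))]

def meta_train_step(model, features, labels, tasks, inner_lr=1e-2, meta_lr=1e-3):
    meta_opt = torch.optim.SGD(model.parameters(), lr=meta_lr)
    meta_opt.zero_grad()
    for support_idx, query_idx in tasks:
        # Inner loop: adapt a copy of the model on the similar (support) sentences.
        adapted = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        support_loss = F.cross_entropy(adapted(features[support_idx]).flatten(0, 1),
                                       labels[support_idx].flatten())
        inner_opt.zero_grad()
        support_loss.backward()
        inner_opt.step()
        # Outer loop: evaluate the adapted model on the query sentence and
        # accumulate first-order gradients into the shared initialization.
        query_loss = F.cross_entropy(adapted(features[query_idx:query_idx + 1]).flatten(0, 1),
                                     labels[query_idx:query_idx + 1].flatten())
        grads = torch.autograd.grad(query_loss, adapted.parameters())
        for p, g in zip(model.parameters(), grads):
            p.grad = g if p.grad is None else p.grad + g
    meta_opt.step()

if __name__ == "__main__":
    hidden, num_labels, num_sents, seq_len = 32, 5, 8, 10
    feats = torch.randn(num_sents, seq_len, hidden)            # stand-in for encoder output
    labs = torch.randint(0, num_labels, (num_sents, seq_len))  # stand-in for NER tags
    model = TaggerHead(hidden, num_labels)
    tasks = build_pseudo_tasks(feats.mean(dim=1), k=2)         # sentence reprs by mean pooling
    meta_train_step(model, feats, labs, tasks)

At test time the same inner-loop adaptation would be run on the retrieved similar examples before predicting on the given test sentence; the masking scheme and maximum-loss term mentioned in the abstract are omitted from this sketch.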

research
11/22/2019

Zero-Resource Cross-Lingual Named Entity Recognition

Recently, neural methods have achieved state-of-the-art (SOTA) results i...
research
09/09/2019

What Matters for Neural Cross-Lingual Named Entity Recognition: An Empirical Analysis

Building named entity recognition (NER) models for languages that do not...
research
08/29/2018

Neural Cross-Lingual Named Entity Recognition with Minimal Resources

For languages with no annotated resources, unsupervised transfer of natu...
research
07/15/2020

UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data

Prior works in cross-lingual named entity recognition (NER) with no/litt...
research
12/07/2022

Wider & Closer: Mixture of Short-channel Distillers for Zero-shot Cross-lingual Named Entity Recognition

Zero-shot cross-lingual named entity recognition (NER) aims at transferr...
research
08/23/2019

A Little Annotation does a Lot of Good: A Study in Bootstrapping Low-resource Named Entity Recognizers

Most state-of-the-art models for named entity recognition (NER) rely on ...
research
08/17/2023

mCL-NER: Cross-Lingual Named Entity Recognition via Multi-view Contrastive Learning

Cross-lingual named entity recognition (CrossNER) faces challenges stemm...
