Enhancing Cross-lingual Natural Language Inference by Soft Prompting with Multilingual Verbalizer

05/22/2023
by Shuang Li et al.

Cross-lingual natural language inference (XNLI) is a fundamental problem in cross-lingual language understanding. Many recent works have applied prompt learning to compensate for the lack of annotated parallel corpora in XNLI. However, these methods adopt discrete prompting, simply translating the templates into the target language, and require external expert knowledge to design the templates. Moreover, discrete prompts built from human-designed template words are not trainable vectors and cannot be flexibly transferred to target languages at inference time. In this paper, we propose a novel Soft prompt learning framework with a Multilingual Verbalizer (SoftMV) for XNLI. SoftMV first constructs a cloze-style question with soft prompts for the input sample. It then leverages bilingual dictionaries to generate an augmented multilingual question from the original question. SoftMV adopts a multilingual verbalizer to align the representations of the original and augmented questions in the same semantic space via consistency regularization. Experimental results on XNLI demonstrate that SoftMV achieves state-of-the-art performance, significantly outperforming previous methods under both few-shot and full-shot cross-lingual transfer settings.
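To make the two core ingredients of the abstract concrete, the minimal sketch below shows (a) swapping a fraction of tokens in the original cloze question for bilingual-dictionary translations to form the augmented multilingual question, and (b) a consistency term that pulls the verbalizer-token distributions of the two questions together. The dictionary entries, the `code_switch` ratio, the example template string, and the symmetric-KL loss form are illustrative assumptions, not the authors' released implementation; the trainable soft prompt vectors themselves are omitted.

```python
import random

import torch
import torch.nn.functional as F

# Hypothetical bilingual dictionary (e.g. MUSE-style word lists):
# English word -> translation in one target language.
BILINGUAL_DICT = {"movie": "película", "good": "bueno", "people": "gente"}


def code_switch(tokens, ratio=0.5):
    """Swap a fraction of tokens for their dictionary translations to
    build an augmented multilingual version of the cloze question."""
    return [BILINGUAL_DICT[t] if t in BILINGUAL_DICT and random.random() < ratio else t
            for t in tokens]


def consistency_loss(logits_orig, logits_aug):
    """Symmetric KL divergence between the label-word (verbalizer)
    distributions predicted for the original and augmented questions."""
    log_p = F.log_softmax(logits_orig, dim=-1)
    log_q = F.log_softmax(logits_aug, dim=-1)
    return 0.5 * (F.kl_div(log_p, log_q, reduction="batchmean", log_target=True)
                  + F.kl_div(log_q, log_p, reduction="batchmean", log_target=True))


# Usage sketch: one cloze-style question and its code-switched augmentation.
question = "premise ? <mask> , the movie was good".split()
augmented = code_switch(question)

# Logits over the verbalizer's label words (e.g. yes / maybe / no), one row per example.
logits_orig = torch.randn(4, 3)
logits_aug = torch.randn(4, 3)
print(augmented)
print(consistency_loss(logits_orig, logits_aug))
```

The symmetric KL term is one common choice for consistency regularization; the paper may use a different divergence or distance, but the intent is the same: predictions for the original and dictionary-augmented questions should agree over the multilingual verbalizer's label words.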


