A deep Natural Language Inference predictor without language-specific training data

09/06/2023
by Lorenzo Corradi, et al.

In this paper we present an NLP technique to tackle the problem of recognizing the inference relation (NLI) between pairs of sentences in a target language of choice, without a language-specific training dataset. We exploit a generic, manually translated parallel dataset together with two instances of the same pre-trained model: the first generates sentence embeddings for the source language, while the second is fine-tuned on the target language to mimic the first. This technique is known as Knowledge Distillation. The model has been evaluated on a machine-translated Stanford NLI test set, a machine-translated Multi-Genre NLI test set, and the manually translated RTE3-ITA test set. We also test the proposed architecture on different tasks to empirically demonstrate the generality of the NLI task. The model has been evaluated on the native Italian ABSITA dataset, on the tasks of Sentiment Analysis, Aspect-Based Sentiment Analysis, and Topic Recognition. We emphasise the generality and exploitability of the Knowledge Distillation technique, which outperforms other methodologies based on machine translation even though, unlike them, it was not directly trained on the data it was tested on.
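As a rough illustration of the distillation setup described in the abstract, the sketch below clones one pre-trained sentence encoder into a frozen teacher and a trainable student, then trains the student so that its embeddings of a source sentence and of its target-language translation both match the teacher's source-language embedding. The checkpoint name, the English/Italian sentence pair, and the hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of cross-lingual Knowledge Distillation for sentence embeddings.
# Checkpoint, sentences, and hyperparameters are illustrative assumptions.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

base_name = "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2"  # assumed checkpoint

tokenizer = AutoTokenizer.from_pretrained(base_name)
teacher = AutoModel.from_pretrained(base_name).eval()    # instance 1: frozen, source-language embeddings
student = AutoModel.from_pretrained(base_name).train()   # instance 2: fine-tuned to mimic the teacher
for p in teacher.parameters():
    p.requires_grad_(False)

def embed(model, sentences):
    """Mean-pool token embeddings into one sentence embedding per input."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state                # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()     # (batch, tokens, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)      # (batch, dim)

optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)
mse = nn.MSELoss()

# One distillation step on a toy parallel pair: the student is pushed to
# reproduce the teacher's source-language embedding for both the source
# sentence and its translation, aligning the two languages in one space.
source = ["A man is playing a guitar."]
target = ["Un uomo sta suonando una chitarra."]  # Italian translation

with torch.no_grad():
    teacher_emb = embed(teacher, source)

loss = mse(embed(student, source), teacher_emb) + mse(embed(student, target), teacher_emb)
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"distillation loss: {loss.item():.4f}")
```

In practice this loop would run over the full manually translated parallel dataset; the aligned student embeddings could then feed a downstream NLI classifier for the target language.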


