A Multi-task Multi-stage Transitional Training Framework for Neural Chat Translation

01/27/2023
by   Chulun Zhou, et al.

Neural chat translation (NCT) aims to translate a cross-lingual chat between speakers of different languages. Existing context-aware NMT models cannot achieve satisfactory performance due to the following inherent problems: 1) limited resources of annotated bilingual dialogues; 2) the neglect of modelling conversational properties; 3) a training discrepancy between different stages. To address these issues, in this paper, we propose a multi-task multi-stage transitional (MMT) training framework, in which an NCT model is trained using a bilingual chat translation dataset and additional monolingual dialogues. We elaborately design two auxiliary tasks, namely utterance discrimination and speaker discrimination, to introduce the modelling of dialogue coherence and speaker characteristics into the NCT model. The training process consists of three stages: 1) sentence-level pre-training on a large-scale parallel corpus; 2) intermediate training with the auxiliary tasks using additional monolingual dialogues; 3) context-aware fine-tuning with a gradual transition. In particular, the second stage serves as an intermediate phase that alleviates the training discrepancy between the pre-training and fine-tuning stages. Moreover, to make the stage transition smoother, we train the NCT model with a gradual transition strategy, i.e., gradually transitioning from monolingual to bilingual dialogues. Extensive experiments on two language pairs demonstrate the effectiveness and superiority of our proposed training framework.
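The gradual transition strategy described above can be sketched as a data-sampling schedule that shifts probability mass from monolingual to bilingual dialogues over the course of fine-tuning. The linear schedule and the function names below are illustrative assumptions; the abstract does not specify the exact transition curve.

```python
import random

def bilingual_prob(step, total_steps):
    """Probability of drawing a bilingual dialogue at a given training step.

    A linear schedule (an assumption, not necessarily the paper's choice):
    starts at 0.0 (all monolingual dialogues) and reaches 1.0
    (all bilingual chat-translation data) by the end of the transition.
    """
    return min(step / total_steps, 1.0)

def sample_source(step, total_steps, mono_data, bi_data, rng=random):
    """Choose the data source for this step according to the schedule."""
    if rng.random() < bilingual_prob(step, total_steps):
        return bi_data    # bilingual chat-translation dialogues
    return mono_data      # monolingual dialogues (used by the auxiliary tasks)
```

Under this sketch, early fine-tuning steps mostly reuse the monolingual dialogues from the intermediate stage, so the model's input distribution changes smoothly rather than abruptly at the stage boundary.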


