Towards Reliable Neural Machine Translation with Consistency-Aware Meta-Learning

03/20/2023
by Rongxiang Weng, et al.

Neural machine translation (NMT) has achieved remarkable success in producing high-quality translations. However, current NMT systems suffer from a lack of reliability, as their outputs are often affected by lexical or syntactic changes in inputs, resulting in large variations in quality. This limitation hinders the practicality and trustworthiness of NMT. A contributing factor to this problem is that NMT models trained with the one-to-one paradigm struggle to handle the source diversity phenomenon, where inputs with the same meaning can be expressed differently. In this work, we treat this problem as a bilevel optimization problem and present a consistency-aware meta-learning (CAML) framework, derived from the model-agnostic meta-learning (MAML) algorithm, to address it. Specifically, the NMT model with CAML (named CoNMT) first learns a consistent meta representation of semantically equivalent sentences in the outer loop. Subsequently, a mapping from the meta representation to the output sentence is learned in the inner loop, allowing the NMT model to translate semantically equivalent sentences to the same target sentence. We conduct experiments on the NIST Chinese-to-English task, three WMT translation tasks, and the TED M2O task. The results demonstrate that CoNMT effectively improves overall translation quality and reliably handles diverse inputs.
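The bilevel structure described above can be illustrated with a minimal first-order toy sketch: a linear "encoder" produces a shared representation for two paraphrases, the inner loop adapts a linear "decoder" with one gradient step, and the outer loop combines a consistency term over the two representations with the post-adaptation task loss. All names, shapes, and the quadratic losses here are illustrative assumptions for exposition, not the paper's actual objective or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: x1 and x2 stand in for two paraphrases (semantically
# equivalent inputs) that share the same target y.
d_in, d_rep, d_out = 8, 4, 3
x1, x2 = rng.normal(size=d_in), rng.normal(size=d_in)
y = rng.normal(size=d_out)

W = rng.normal(scale=0.1, size=(d_rep, d_in))   # encoder: input -> meta representation (outer loop)
V = rng.normal(scale=0.1, size=(d_out, d_rep))  # decoder: representation -> output (inner loop)
inner_lr, outer_lr = 0.1, 0.02

init_consistency = float(np.sum((W @ x1 - W @ x2) ** 2))
init_task = float(np.sum((V @ (W @ x2) - y) ** 2))

for _ in range(500):
    r1, r2 = W @ x1, W @ x2

    # Inner loop: one gradient step adapting the decoder on paraphrase 1,
    # i.e. learning the mapping from meta representation to output.
    V_adapted = V - inner_lr * np.outer(V @ r1 - y, r1)

    # Outer loop (first-order MAML approximation): a consistency term
    # pulls the two paraphrase representations together, while the
    # post-adaptation task loss on paraphrase 2 keeps the shared
    # representation predictive of the target.
    err2 = V_adapted @ r2 - y
    grad_W = np.outer(r1 - r2, x1 - x2) + np.outer(V_adapted.T @ err2, x2)
    W -= outer_lr * grad_W
    V -= outer_lr * np.outer(err2, r2)  # carry the adapted gradient back to V

consistency = float(np.sum((W @ x1 - W @ x2) ** 2))
task_err = float(np.sum((V @ (W @ x2) - y) ** 2))
```

After training, the two paraphrases map to nearly the same representation (the consistency objective) and that shared representation decodes to the target (the task objective), which is the mechanism CoNMT relies on to translate semantically equivalent sentences to the same output.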

Related research:

11/07/2016 · Neural Machine Translation with Reconstruction
Although end-to-end Neural Machine Translation (NMT) has achieved remark...

01/11/2019 · ParaBank: Monolingual Bitext Generation and Sentential Paraphrasing via Lexically-constrained Neural Machine Translation
We present ParaBank, a large-scale English paraphrase dataset that surpa...

08/25/2018 · Meta-Learning for Low-Resource Neural Machine Translation
In this paper, we propose to extend the recently introduced model-agnost...

06/12/2018 · Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model
Neural machine translation (NMT) systems are usually trained on a large ...

11/25/2022 · Competency-Aware Neural Machine Translation: Can Machine Translation Know its Own Translation Quality?
Neural machine translation (NMT) is often criticized for failures that h...

10/04/2018 · AutoLoss: Learning Discrete Schedules for Alternate Optimization
Many machine learning problems involve iteratively and alternately optim...

10/09/2020 · Uncertainty-Aware Semantic Augmentation for Neural Machine Translation
As a sequence-to-sequence generation task, neural machine translation (N...
