
Specializing Multi-domain NMT via Penalizing Low Mutual Information

10/24/2022
by Jiyoung Lee, et al.
KAIST Department of Mathematical Sciences
NAVER Corp.

Multi-domain Neural Machine Translation (NMT) trains a single model on multiple domains. It is appealing because one model can handle several domains efficiently. Ideally, a multi-domain NMT model should learn the distinctive characteristics of each domain simultaneously; however, capturing these domain peculiarities is a non-trivial task. In this paper, we investigate domain-specific information through the lens of mutual information (MI) and propose a new objective that penalizes low MI, pushing it higher. Our method achieves state-of-the-art performance among competitive multi-domain NMT models. We also show empirically that our objective raises low MI, yielding a more domain-specialized multi-domain NMT model.
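The abstract does not spell out the objective, but one rough way to picture "penalizing low MI" is to estimate per-token MI as the gap between a domain-conditioned likelihood and a domain-agnostic one, then add a hinge-style penalty whenever that gap is small. The sketch below is an assumption-laden illustration, not the paper's implementation; the names `logits_with_domain`, `logits_without_domain`, `margin`, and `penalty_weight` are hypothetical.

```python
# Minimal sketch (not the authors' code): approximate per-token MI as
#   MI(y_t) ~ log p(y_t | x, d) - log p(y_t | x)
# and penalize tokens whose estimated MI falls below a margin.
import torch
import torch.nn.functional as F

def mi_penalty_loss(logits_with_domain, logits_without_domain, targets,
                    pad_id, margin=0.0, penalty_weight=1.0):
    """Cross-entropy plus a hinge penalty on low-MI tokens (illustrative only)."""
    logp_d = F.log_softmax(logits_with_domain, dim=-1)    # log p(y | x, d)
    logp   = F.log_softmax(logits_without_domain, dim=-1) # log p(y | x)

    tgt = targets.unsqueeze(-1)
    tok_logp_d = logp_d.gather(-1, tgt).squeeze(-1)       # log p(y_t | x, d)
    tok_logp   = logp.gather(-1, tgt).squeeze(-1)         # log p(y_t | x)

    mi = tok_logp_d - tok_logp                            # per-token MI estimate
    mask = (targets != pad_id).float()

    nll = -(tok_logp_d * mask).sum() / mask.sum()         # standard NMT loss
    low_mi_penalty = (F.relu(margin - mi) * mask).sum() / mask.sum()
    return nll + penalty_weight * low_mi_penalty
```

In this sketch the penalty is zero for tokens whose estimated MI already exceeds the margin, so only low-MI tokens contribute gradient, which matches the abstract's stated goal of promoting low MI to become higher.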


05/06/2018 · Multi-Domain Neural Machine Translation
We present an approach to neural machine translation (NMT) that supports...

10/20/2022 · Can Domains Be Transferred Across Languages in Multi-Domain Multilingual Neural Machine Translation?
Previous works mostly focus on either multilingual or multi-domain aspec...

11/08/2019 · Domain Robustness in Neural Machine Translation
Translating text that diverges from the training domain is a key challen...

11/07/2019 · Multi-Domain Neural Machine Translation with Word-Level Adaptive Layer-wise Domain Mixing
Many multi-domain neural machine translation (NMT) models achieve knowle...

11/22/2019 · Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks
The key challenge of multi-domain translation lies in simultaneously enc...

09/16/2021 · Translation Transformers Rediscover Inherent Data Domains
Many works proposed methods to improve the performance of Neural Machine...

10/13/2022 · DICTDIS: Dictionary Constrained Disambiguation for Improved NMT
Domain-specific neural machine translation (NMT) systems (e.g., in educa...