Multi-Domain Neural Machine Translation

05/06/2018
by Sander Tars, et al.

We present an approach to neural machine translation (NMT) that supports multiple domains in a single model and allows switching between the domains when translating. The core idea is to treat text domains as distinct languages and to use multilingual NMT methods to create multi-domain translation systems; we show that this approach yields significant translation quality gains over fine-tuning. We also explore whether knowledge of pre-specified text domains is necessary; it turns out that it is, but also that quite high translation quality can be reached even when the domain is not known in advance.
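A common way to realize "domains as languages" is the token trick from multilingual NMT: prepend a pseudo-token identifying the domain to each source sentence, so a single model learns domain-conditioned translation and the user switches domains at inference by changing the token. Below is a minimal preprocessing sketch; the tag format (`<2domain>`), the file handling, and the domain inventory are illustrative assumptions, not details taken from the paper.

```python
# Sketch: tag each source sentence with its domain, mirroring the
# target-language token used in multilingual NMT. Tag format and the
# domain inventory below are hypothetical.

DOMAINS = {"legal", "medical", "subtitles"}  # assumed domain inventory


def tag_source(line: str, domain: str) -> str:
    """Prefix a source sentence with its domain pseudo-token."""
    if domain not in DOMAINS:
        raise ValueError(f"unknown domain: {domain}")
    return f"<2{domain}> {line.strip()}"


def prepare_corpus(src_path: str, out_path: str, domain: str) -> None:
    """Write a domain-tagged copy of a source-side training file."""
    with open(src_path, encoding="utf-8") as src, \
            open(out_path, "w", encoding="utf-8") as out:
        for line in src:
            out.write(tag_source(line, domain) + "\n")


if __name__ == "__main__":
    print(tag_source("The patient was discharged.", "medical"))
    # -> <2medical> The patient was discharged.
```

At translation time the same tag is prepended to the input. When the true domain is not known in advance, the tag can instead be predicted automatically (e.g., by a text classifier over the input), which is the setting the second part of the abstract addresses.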


Related research

07/29/2022
Benchmarking Azerbaijani Neural Machine Translation
Little research has been done on Neural Machine Translation (NMT) for Az...

10/24/2022
Specializing Multi-domain NMT via Penalizing Low Mutual Information
Multi-domain Neural Machine Translation (NMT) trains a single model with...

10/11/2022
Checks and Strategies for Enabling Code-Switched Machine Translation
Code-switching is a common phenomenon among multilingual speakers, where...

10/31/2022
Domain Curricula for Code-Switched MT at MixMT 2022
In multilingual colloquial settings, it is a habitual occurrence to comp...

11/07/2019
Multi-Domain Neural Machine Translation with Word-Level Adaptive Layer-wise Domain Mixing
Many multi-domain neural machine translation (NMT) models achieve knowle...

11/22/2019
Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks
The key challenge of multi-domain translation lies in simultaneously enc...

12/02/2019
Merging External Bilingual Pairs into Neural Machine Translation
As neural machine translation (NMT) is not easily amenable to explicit c...
