Multi-Domain Neural Machine Translation

05/06/2018
by Sander Tars, et al.

We present an approach to neural machine translation (NMT) that supports multiple domains in a single model and allows switching between the domains when translating. The core idea is to treat text domains as distinct languages and use multilingual NMT methods to create multi-domain translation systems. We show that this approach results in significant translation quality gains over fine-tuning. We also explore whether knowledge of pre-specified text domains is necessary; it turns out that it is, but also that quite high translation quality can be reached even when the domain is not known in advance.
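The abstract does not spell out the mechanism, but a common way to apply multilingual NMT methods to domains is the token-tagging trick from multilingual NMT (Johnson et al., 2017): prepend a tag to each source sentence indicating the target "language" (here, the domain), so a single model learns all domains and can switch between them at inference time. Below is a minimal sketch of that idea in Python; the function name and token format are illustrative assumptions, not the paper's actual code.

```python
# Hypothetical preprocessing sketch: prepend a domain token to each source
# sentence, following the multilingual-NMT convention of tagging inputs with
# a token that tells the shared model which domain/"language" to produce.

def tag_with_domain(source_sentence: str, domain: str) -> str:
    """Prepend a domain token (e.g. '<2med>') so one NMT model can
    switch domains at translation time. Token format is illustrative."""
    return f"<2{domain}> {source_sentence}"

# Toy mixed-domain corpus: (source sentence, domain label) pairs.
corpus = [
    ("the patient was administered 5 mg daily", "med"),
    ("the defendant waived the right to counsel", "law"),
]

# Tagged training data for a single shared multi-domain model.
for sentence, domain in corpus:
    print(tag_with_domain(sentence, domain))
# <2med> the patient was administered 5 mg daily
# <2law> the defendant waived the right to counsel
```

At inference, the same tag would select the desired domain; when the domain is unknown, it could be predicted or left to a generic tag, which is the scenario the abstract says still reaches quite high quality.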
