Multi-Domain Neural Machine Translation with Word-Level Adaptive Layer-wise Domain Mixing

11/07/2019
by Haoming Jiang, et al.

Many multi-domain neural machine translation (NMT) models achieve knowledge transfer by forcing a single encoder to learn shared embeddings across domains. However, this design lacks adaptation to individual domains. To overcome this limitation, we propose a novel multi-domain NMT model that uses individual modules for each domain, to which we apply word-level, adaptive, layer-wise domain mixing. We first observe that words in a sentence are often related to multiple domains. Hence, we assume each word has a domain proportion, which indicates its domain preference. Word representations are then obtained by mixing their embeddings in individual domains according to their domain proportions. We show this can be achieved by carefully designing multi-head dot-product attention modules for different domains, and then taking weighted averages of their parameters according to word-level, layer-wise domain proportions. In this way, we achieve effective domain knowledge sharing while also capturing fine-grained domain-specific knowledge. Our experiments show that the proposed model outperforms existing ones on several NMT tasks.
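The abstract describes per-domain attention modules whose parameters are averaged, word by word and layer by layer, using learned domain proportions. Below is a minimal PyTorch sketch of that idea; the names (WordLevelDomainMixingAttention, DomainMixedProjection, num_domains) and the placement of the word-level domain classifier are illustrative assumptions, not the authors' released implementation. Because the projections are linear, mixing their outputs with the word-level domain proportions is equivalent to mixing the parameters themselves.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainMixedProjection(nn.Module):
    """A linear projection kept separately for each domain. For every word,
    the per-domain outputs are averaged with that word's domain proportions,
    which (by linearity) equals applying a word-specific weighted average of
    the per-domain parameters."""

    def __init__(self, d_model, num_domains):
        super().__init__()
        self.projs = nn.ModuleList([nn.Linear(d_model, d_model) for _ in range(num_domains)])

    def forward(self, x, domain_props):
        # x: (batch, seq, d_model); domain_props: (batch, seq, num_domains)
        outs = torch.stack([p(x) for p in self.projs], dim=-1)   # (B, T, D, K)
        return (outs * domain_props.unsqueeze(2)).sum(dim=-1)    # (B, T, D)


class WordLevelDomainMixingAttention(nn.Module):
    """Multi-head dot-product attention whose Q/K/V/output projections are
    word-level mixtures of per-domain projections (one mixing per layer)."""

    def __init__(self, d_model=512, num_heads=8, num_domains=3):
        super().__init__()
        self.num_heads, self.d_head = num_heads, d_model // num_heads
        # Word-level domain classifier producing the domain proportions.
        self.domain_logits = nn.Linear(d_model, num_domains)
        self.q_proj = DomainMixedProjection(d_model, num_domains)
        self.k_proj = DomainMixedProjection(d_model, num_domains)
        self.v_proj = DomainMixedProjection(d_model, num_domains)
        self.o_proj = DomainMixedProjection(d_model, num_domains)

    def forward(self, x):
        B, T, D = x.shape
        props = F.softmax(self.domain_logits(x), dim=-1)          # (B, T, K)
        q, k, v = self.q_proj(x, props), self.k_proj(x, props), self.v_proj(x, props)

        def split_heads(t):
            return t.view(B, T, self.num_heads, self.d_head).transpose(1, 2)

        q, k, v = split_heads(q), split_heads(k), split_heads(v)
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5     # (B, H, T, T)
        ctx = (F.softmax(scores, dim=-1) @ v).transpose(1, 2).reshape(B, T, D)
        return self.o_proj(ctx, props)


# Toy usage: a batch of 2 sentences, 10 tokens each, with 3 domains.
layer = WordLevelDomainMixingAttention(d_model=64, num_heads=4, num_domains=3)
print(layer(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```

In the full model, such a layer would replace each attention sublayer of the Transformer encoder and decoder; how the domain classifier is supervised is not spelled out in the abstract and is left out of this sketch.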

Related research

06/02/2019
Domain Adaptation of Neural Machine Translation by Lexicon Induction
It has been previously noted that neural machine translation (NMT) is ve...

05/06/2018
Multi-Domain Neural Machine Translation
We present an approach to neural machine translation (NMT) that supports...

05/06/2023
Label-Free Multi-Domain Machine Translation with Stage-wise Training
Most multi-domain machine translation models rely on domain-annotated da...

10/24/2022
Specializing Multi-domain NMT via Penalizing Low Mutual Information
Multi-domain Neural Machine Translation (NMT) trains a single model with...

11/22/2019
Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks
The key challenge of multi-domain translation lies in simultaneously enc...

11/02/2020
Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation
Neural machine translation (NMT) models usually suffer from catastrophic...

04/05/2022
Domain-Aware Contrastive Knowledge Transfer for Multi-domain Imbalanced Data
In many real-world machine learning applications, samples belong to a se...
