
Translation Transformers Rediscover Inherent Data Domains

by Maksym Del, et al.

Many works have proposed methods to improve the performance of Neural Machine Translation (NMT) models in domain and multi-domain adaptation scenarios. However, an understanding of how NMT baselines represent text domain information internally is still lacking. Here we analyze the sentence representations learned by NMT Transformers and show that these explicitly include information on text domains, even though the models see only input sentences without domain labels. Furthermore, we show that this internal information is enough to cluster sentences by their underlying domains without supervision. We show that NMT models produce clusters better aligned to the actual domains than pre-trained language models (LMs). Notably, when computed on the document level, NMT cluster-to-domain correspondence nears 100%. We combine these findings with an approach to NMT domain adaptation using automatically extracted domains. Whereas previous work relied on external LMs for text clustering, we propose re-using the NMT model itself as a source of unsupervised clusters. We perform an extensive experimental study comparing the two approaches across two data scenarios, three language pairs, and both sentence-level and document-level clustering, showing equal or significantly superior performance compared to LMs.
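The core idea, recovering data domains by clustering the model's own sentence representations, can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration, not the paper's actual pipeline: random Gaussian blobs stand in for mean-pooled NMT encoder states of sentences from three assumed domains, and k-means recovers the domain structure without supervision. Cluster quality is scored with purity (majority-vote cluster-to-domain alignment).

```python
# Sketch of unsupervised domain discovery from sentence representations.
# Assumption: sentence vectors have already been extracted (e.g. by mean-pooling
# an NMT encoder's hidden states); random blobs stand in for them here.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulate pooled representations of 50 sentences from each of 3 domains,
# each domain a well-separated Gaussian blob in a 16-dim representation space.
domains = [rng.normal(loc=c, scale=0.1, size=(50, 16)) for c in (0.0, 1.0, 2.0)]
reps = np.vstack(domains)
true_labels = np.repeat([0, 1, 2], 50)

# Cluster the representations; the paper's finding is that such clusters
# align closely with the underlying text domains.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(reps)

# Purity: assign each cluster its majority domain, count correct assignments.
correct = 0
for k in range(3):
    members = true_labels[kmeans.labels_ == k]
    correct += np.bincount(members).max()
purity = correct / len(true_labels)
print(purity)
```

Averaging sentence vectors over a whole document before clustering (the document-level setting) smooths out sentence-level noise, which is why the paper reports near-perfect cluster-to-domain correspondence in that setting.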




Domain Adaptation in Neural Machine Translation using a Qualia-Enriched FrameNet

In this paper we present Scylla, a methodology for domain adaptation of ...

Unsupervised Domain Clusters in Pretrained Language Models

The notion of "in-domain data" in NLP is often over-simplistic and vague...

Multi-Domain Adaptation in Neural Machine Translation Through Multidimensional Tagging

Many modern Neural Machine Translation (NMT) systems are trained on nonh...

Fast Domain Adaptation for Neural Machine Translation

Neural Machine Translation (NMT) is a new approach for automatic transla...

DaLC: Domain Adaptation Learning Curve Prediction for Neural Machine Translation

Domain Adaptation (DA) of Neural Machine Translation (NMT) model often r...

Master Thesis: Neural Sign Language Translation by Learning Tokenization

In this thesis, we propose a multitask learning based method to improve ...

Specializing Multi-domain NMT via Penalizing Low Mutual Information

Multi-domain Neural Machine Translation (NMT) trains a single model with...