Translation Transformers Rediscover Inherent Data Domains

09/16/2021
by Maksym Del et al.

Many works have proposed methods to improve the performance of Neural Machine Translation (NMT) models in domain and multi-domain adaptation scenarios. However, an understanding of how NMT baselines represent text domain information internally is still lacking. Here we analyze the sentence representations learned by NMT Transformers and show that these explicitly include information on text domains, even though the models only see input sentences without domain labels. Furthermore, we show that this internal information is enough to cluster sentences by their underlying domains without supervision. We show that NMT models produce clusters better aligned to the actual domains than pre-trained language models (LMs). Notably, when computed on the document level, NMT cluster-to-domain correspondence nears 100%. We combine these findings with an approach to NMT domain adaptation using automatically extracted domains. Whereas previous work relied on external LMs for text clustering, we propose re-using the NMT model as a source of unsupervised clusters. We perform an extensive experimental study comparing the two approaches across two data scenarios, three language pairs, and both sentence-level and document-level clustering, showing equal or significantly superior performance compared to LMs.
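The pipeline the abstract describes, i.e. clustering unlabeled sentence representations and checking how well the clusters align with the true domains, can be sketched with plain k-means and a cluster-purity score. This is a minimal illustration, not the paper's implementation: the synthetic Gaussian "embeddings" stand in for mean-pooled NMT encoder states, and the two-domain setup, dimensionality, and `cluster_purity` helper are all assumptions made for the example.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain k-means, standing in for clustering encoder representations."""
    rng = np.random.default_rng(seed)
    # Initialize centers from random data points.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_purity(pred, gold):
    """Fraction of items whose cluster's majority gold domain matches their own."""
    correct = 0
    for c in np.unique(pred):
        counts = np.bincount(gold[pred == c])
        correct += counts.max()
    return correct / len(gold)

# Synthetic stand-in for sentence embeddings from two text domains
# (e.g. medical vs. legal); purely illustrative, not real NMT states.
rng = np.random.default_rng(1)
dom0 = rng.normal(loc=0.0, scale=0.5, size=(100, 16))
dom1 = rng.normal(loc=3.0, scale=0.5, size=(100, 16))
X = np.vstack([dom0, dom1])
gold = np.array([0] * 100 + [1] * 100)

pred = kmeans(X, k=2)
print(cluster_purity(pred, gold))
```

On well-separated domains like these, purity approaches 1.0; the paper's finding is that representations taken from a trained NMT encoder yield similarly well-aligned clusters on real mixed-domain corpora, especially when sentence vectors are averaged per document.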

Related research:

04/20/2023 — Exploring Paracrawl for Document-level Neural Machine Translation
  Document-level neural machine translation (NMT) has outperformed sentenc...

04/05/2020 — Unsupervised Domain Clusters in Pretrained Language Models
  The notion of "in-domain data" in NLP is often over-simplistic and vague...

02/21/2022 — Domain Adaptation in Neural Machine Translation using a Qualia-Enriched FrameNet
  In this paper we present Scylla, a methodology for domain adaptation of ...

12/16/2019 — Iterative Dual Domain Adaptation for Neural Machine Translation
  Previous studies on the domain adaptation for neural machine translation...

04/20/2022 — DaLC: Domain Adaptation Learning Curve Prediction for Neural Machine Translation
  Domain Adaptation (DA) of Neural Machine Translation (NMT) model often r...

10/24/2022 — Specializing Multi-domain NMT via Penalizing Low Mutual Information
  Multi-domain Neural Machine Translation (NMT) trains a single model with...

02/28/2019 — Non-Parametric Adaptation for Neural Machine Translation
  Neural Networks trained with gradient descent are known to be susceptibl...
