Explicit Sentence Compression for Neural Machine Translation

12/27/2019
by   Zuchao Li, et al.

State-of-the-art Transformer-based neural machine translation (NMT) systems still follow the standard encoder-decoder framework, in which the source sentence representation is produced by an encoder with a self-attention mechanism. Although a Transformer-based encoder can effectively capture general information in its resulting source sentence representation, the backbone information, which stands for the gist of the sentence, is not given specific focus. In this paper, we propose an explicit sentence compression method to enhance the source sentence representation for NMT. In practice, an explicit sentence compression objective is used to learn the backbone information in a sentence. We propose three ways of integrating the compressed sentence into NMT: backbone source-side fusion, target-side fusion, and both-side fusion. Our empirical tests on the WMT English-to-French and English-to-German translation tasks show that the proposed sentence compression method significantly improves translation performance over strong baselines.
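To make the fusion idea concrete, here is a minimal numpy sketch of one plausible form of source-side fusion: the encoder states of the full sentence attend over the encoder states of the compressed (backbone) sentence, and the attended backbone context is added back residually. The function name, the attention-plus-residual formulation, and all dimensions are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def source_side_fusion(H, C):
    """Fuse backbone (compressed-sentence) states C into source states H.

    H: (n, d) encoder states of the full source sentence.
    C: (m, d) encoder states of the compressed sentence.
    Returns (n, d) fused source representation via scaled dot-product
    attention over the backbone tokens, plus a residual connection.
    """
    d = H.shape[-1]
    attn = softmax(H @ C.T / np.sqrt(d))  # (n, m) weights over backbone tokens
    return H + attn @ C                   # (n, d) backbone-enriched states

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 8))  # six source tokens, hidden size 8
C = rng.normal(size=(3, 8))  # three backbone tokens from the compressed sentence
fused = source_side_fusion(H, C)
```

Under this sketch, target-side fusion would instead let the decoder attend to `C` as an extra cross-attention memory, and both-side fusion would combine the two.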


research
04/08/2020

Explicit Reordering for Neural Machine Translation

In Transformer-based neural machine translation (NMT), the positional en...
research
02/11/2021

Text Compression-aided Transformer Encoding

Text encoding is one of the most important steps in Natural Language Pro...
research
05/23/2022

Towards Opening the Black Box of Neural Machine Translation: Source and Target Interpretations of the Transformer

In Neural Machine Translation (NMT), each token prediction is conditione...
research
12/12/2022

P-Transformer: Towards Better Document-to-Document Neural Machine Translation

Directly training a document-to-document (Doc2Doc) neural machine transl...
research
05/16/2018

Are BLEU and Meaning Representation in Opposition?

One of possible ways of obtaining continuous-space sentence representati...
research
11/01/2018

Towards Linear Time Neural Machine Translation with Capsule Networks

In this study, we first investigate a novel capsule network with dynamic...
research
12/10/2020

Rewriter-Evaluator Framework for Neural Machine Translation

Encoder-decoder architecture has been widely used in neural machine tran...
