Source-side Prediction for Neural Headline Generation

12/22/2017
by Shun Kiyono, et al.

The encoder-decoder model is widely used in natural language generation tasks. However, it sometimes suffers from repeated redundant generation, misses important phrases, and includes irrelevant entities. To address these problems, we propose a novel source-side token prediction module. Our method jointly estimates the probability distributions over the source and target vocabularies to capture the correspondence between source and target tokens. Experiments show that the proposed model outperforms the current state-of-the-art method on the headline generation task. Additionally, we show that our method can learn a reasonable token-wise correspondence without access to any true alignments.
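The core idea of joint estimation can be sketched as a decoder step that projects a single hidden state onto two separate vocabularies and normalizes each with a softmax. This is a minimal illustrative sketch, not the paper's actual architecture: the function names, dimensions, and random projection matrices here are all hypothetical placeholders.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def joint_step(hidden, W_tgt, W_src):
    """One decoding step (hypothetical sketch): the same hidden state
    is projected onto the target and the source vocabulary, yielding
    two probability distributions that can be trained jointly."""
    p_tgt = softmax(W_tgt @ hidden)  # distribution over target vocabulary
    p_src = softmax(W_src @ hidden)  # distribution over source vocabulary
    return p_tgt, p_src

# Toy dimensions: hidden size 8, target vocab 10, source vocab 12.
rng = np.random.default_rng(0)
d, v_tgt, v_src = 8, 10, 12
hidden = rng.normal(size=d)
p_tgt, p_src = joint_step(hidden,
                          rng.normal(size=(v_tgt, d)),
                          rng.normal(size=(v_src, d)))
```

In training, the two distributions would each receive a cross-entropy loss, so the shared hidden state is pushed to encode which source token corresponds to the target token being generated.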


Related research:

11/02/2020
Focus on the present: a regularization method for the ASR source-target attention layer
This paper introduces a novel method to diagnose the source-target atten...

10/20/2022
Wait-info Policy: Balancing Source and Target at Information Level for Simultaneous Machine Translation
Simultaneous machine translation (SiMT) outputs the translation while re...

05/23/2022
Towards Opening the Black Box of Neural Machine Translation: Source and Target Interpretations of the Transformer
In Neural Machine Translation (NMT), each token prediction is conditione...

10/22/2022
Information-Transport-based Policy for Simultaneous Translation
Simultaneous translation (ST) outputs translation while receiving the so...

09/09/2021
HintedBT: Augmenting Back-Translation with Quality and Transliteration Hints
Back-translation (BT) of target monolingual corpora is a widely used dat...

08/06/2020
FastLR: Non-Autoregressive Lipreading Model with Integrate-and-Fire
Lipreading is an impressive technique and there has been a definite impr...

10/24/2022
Mutual Information Alleviates Hallucinations in Abstractive Summarization
Despite significant progress in the quality of language generated from a...
