Generating News Headlines with Recurrent Neural Networks

12/05/2015
by Konstantin Lopyrev, et al.

We describe an application of an encoder-decoder recurrent neural network with LSTM units and attention to generating headlines from the text of news articles. We find that the model is quite effective at concisely paraphrasing news articles. Furthermore, we study how the neural network decides which input words to pay attention to, and specifically we identify the function of the different neurons in a simplified attention mechanism. Interestingly, our simplified attention mechanism performs better than the more complex attention mechanism on a held-out set of articles.
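The abstract only names the architecture, so the sketch below shows, in broad strokes, how an encoder-decoder LSTM with a simple dot-product attention mechanism can be wired up for headline generation. It assumes PyTorch; the class, parameter names, and dimensions are illustrative, and it is not the paper's implementation (the paper's attention variants, training setup, and decoding differ).

```python
# Minimal sketch (not the paper's implementation) of an encoder-decoder LSTM
# with dot-product attention for headline generation, assuming PyTorch.
# Vocabulary handling, the training loop, and beam-search decoding are omitted.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeadlineGenerator(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Project [decoder state; attention context] to vocabulary logits.
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, article_ids, headline_ids):
        # Encode the article; keep all hidden states for attention.
        enc_out, (h, c) = self.encoder(self.embed(article_ids))
        # Decode the teacher-forced headline, starting from the encoder state.
        dec_out, _ = self.decoder(self.embed(headline_ids), (h, c))
        # Dot-product attention: each decoder step attends over encoder steps.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, T_dec, T_enc)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                 # (B, T_dec, H)
        logits = self.out(torch.cat([dec_out, context], dim=-1))
        return logits, weights  # weights show which input words were attended to


# Illustrative usage with random token ids (shapes only).
model = HeadlineGenerator(vocab_size=10000)
article = torch.randint(0, 10000, (2, 50))   # batch of 2 articles, 50 tokens each
headline = torch.randint(0, 10000, (2, 8))   # batch of 2 headlines, 8 tokens each
logits, attn = model(article, headline)
print(logits.shape, attn.shape)              # (2, 8, 10000), (2, 8, 50)
```

Returning the attention weights alongside the logits mirrors the kind of inspection the abstract describes: for each decoder step, the weights indicate which input words the model attended to.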


