Bengali Abstractive News Summarization (BANS): A Neural Attention Approach

12/03/2020
by   Prithwiraj Bhattacharjee, et al.

Abstractive summarization is the process of generating novel sentences based on the information extracted from the original text document while retaining its context. Due to the underlying complexities of abstractive summarization, most past research has focused on the extractive approach. Nevertheless, with the success of the sequence-to-sequence (seq2seq) model, abstractive summarization has become more viable. Although a significant number of notable studies on abstractive summarization have been conducted for English, only a couple of works exist on Bengali abstractive news summarization (BANS). In this article, we present a seq2seq-based Long Short-Term Memory (LSTM) network model with attention at the encoder-decoder. Our proposed system deploys a local attention-based model that produces long word sequences forming lucid, human-like sentences that convey the noteworthy information of the original document. We also prepared a dataset of more than 19k articles and corresponding human-written summaries collected from bangla.bdnews24.com, which is to date the most extensive dataset for Bengali news document summarization and is publicly available on Kaggle. We evaluated our model qualitatively and quantitatively and compared it with other published results. It showed significant improvement in human evaluation scores over state-of-the-art approaches for BANS.
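The abstract describes an attentional encoder-decoder: at each decoding step, the decoder state is compared against the encoder states to produce alignment weights and a context vector. The paper's exact (local) attention variant is not reproduced here; the sketch below shows a minimal dot-product (Luong-style global) attention step in NumPy, with all names and the toy dimensions being illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(decoder_state, encoder_states):
    # alignment scores: dot product between the current decoder
    # hidden state and each encoder hidden state
    scores = encoder_states @ decoder_state
    weights = softmax(scores)
    # context vector: attention-weighted sum of encoder states
    context = weights @ encoder_states
    return context, weights

# toy example: 4 source positions, hidden size 3 (illustrative only)
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))   # stand-in for encoder LSTM outputs
dec = rng.normal(size=(3,))     # stand-in for a decoder hidden state
ctx, w = dot_product_attention(dec, enc)
```

In a full model the context vector `ctx` would be concatenated with the decoder state to predict the next summary token; a local attention variant, as used in the paper, would restrict the softmax to a window of source positions around a predicted alignment point rather than the whole sequence.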


