Toward Extractive Summarization of Online Forum Discussions via Hierarchical Attention Networks

05/25/2018
by Sansiri Tarnpradab, et al.

Forum threads are lengthy and rich in content. Concise thread summaries would benefit both newcomers seeking information and those already participating in the discussion. Few studies, however, have examined the task of forum thread summarization. In this work we make the first attempt to adapt hierarchical attention networks to thread summarization. The model draws on recent developments in neural attention mechanisms to build sentence and thread representations and uses them for summarization. Our results indicate that the proposed approach outperforms a range of competitive baselines. Further, a redundancy removal step proves crucial for achieving strong results.
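The abstract describes the architecture only at a high level, so the sketch below is a minimal, hypothetical PyTorch rendering of the idea, not the authors' released code. A word-level bidirectional GRU with additive attention pools each sentence into a vector, a sentence-level BiGRU with attention pools the thread, and each sentence is scored for extraction against the thread representation. All layer sizes, the `HANExtractor` and `greedy_select` names, and the cosine-threshold redundancy filter are assumptions for illustration; the paper only reports that some redundancy removal step is crucial.

```python
import torch
import torch.nn as nn

class AttnPool(nn.Module):
    """Additive attention pooling over a sequence of hidden states."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.ctx = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                              # h: (batch, seq, dim)
        w = torch.softmax(self.ctx(torch.tanh(self.proj(h))), dim=1)
        return (w * h).sum(dim=1)                      # (batch, dim)

class HANExtractor(nn.Module):
    """Hierarchical attention: words -> sentence vectors -> thread vector,
    then one salience score per sentence for extractive summarization."""
    def __init__(self, vocab_size, emb=100, hid=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb, padding_idx=0)
        self.word_rnn = nn.GRU(emb, hid, bidirectional=True, batch_first=True)
        self.word_attn = AttnPool(2 * hid)
        self.sent_rnn = nn.GRU(2 * hid, hid, bidirectional=True, batch_first=True)
        self.sent_attn = AttnPool(2 * hid)
        self.score = nn.Linear(4 * hid, 1)             # [sentence; thread] -> salience

    def forward(self, thread):                         # thread: (n_sents, n_words) token ids
        words, _ = self.word_rnn(self.embed(thread))   # (n_sents, n_words, 2*hid)
        sent_vecs = self.word_attn(words)              # (n_sents, 2*hid)
        sents, _ = self.sent_rnn(sent_vecs.unsqueeze(0))
        thread_vec = self.sent_attn(sents).squeeze(0)  # (2*hid,)
        sents = sents.squeeze(0)                       # (n_sents, 2*hid)
        feats = torch.cat([sents, thread_vec.expand_as(sents)], dim=-1)
        return torch.sigmoid(self.score(feats)).squeeze(-1), sents

def greedy_select(scores, sent_vecs, k=3, max_sim=0.7):
    """Take sentences in order of salience, skipping any too similar to
    those already chosen -- a simple stand-in for redundancy removal."""
    chosen = []
    for i in scores.argsort(descending=True).tolist():
        sim = (torch.cosine_similarity(sent_vecs[i:i + 1], sent_vecs[chosen], dim=-1)
               if chosen else torch.zeros(1))
        if sim.max() < max_sim:
            chosen.append(i)
        if len(chosen) == k:
            break
    return sorted(chosen)                              # summary in document order

# Usage on a toy thread of 12 sentences, 30 tokens each:
model = HANExtractor(vocab_size=5000)
thread = torch.randint(1, 5000, (12, 30))
with torch.no_grad():
    scores, sent_vecs = model(thread)
summary_ids = greedy_select(scores, sent_vecs)
```

Training such a scorer would typically use binary cross-entropy against sentence-level extraction labels; the greedy cosine filter at the end is one plausible reading of the redundancy removal step the abstract highlights.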


