Improving Online Forums Summarization via Unifying Hierarchical Attention Networks with Convolutional Neural Networks

03/25/2021
by   Sansiri Tarnpradab, et al.

Online discussion forums are prevalent and easily accessible, allowing people to share ideas and opinions by posting messages in discussion threads. Threads that grow significantly in length can become difficult for participants, newcomers and regulars alike, to follow. This study aims to create an automatic text summarizer for online forums to mitigate this problem. We present a framework based on hierarchical attention networks that unifies a Bidirectional Long Short-Term Memory (Bi-LSTM) network and a Convolutional Neural Network (CNN) to build sentence and thread representations for forum summarization. In this scheme, the Bi-LSTM derives a representation that captures information from the whole sentence and the whole thread, whereas the CNN recognizes high-level patterns of dominant units with respect to the sentence and thread context. An attention mechanism is applied on top of the CNN to further highlight the high-level representations that capture units contributing to a desirable summary. Extensive evaluation on three datasets, two of which are real-life online forums and one a news dataset, reveals that the proposed model outperforms several competitive baselines.
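The attention pooling at the core of such hierarchical attention networks can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the weight names (`W`, `b`, `u`), dimensions, and random inputs are illustrative assumptions. It shows how per-unit hidden states (e.g., CNN feature vectors for words or sentences) are scored against a learned context vector and combined into a single weighted representation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, W, b, u):
    """Attention pooling over n hidden states H (n, d).

    W (d, d_a), b (d_a,) project each state; u (d_a,) is a learned
    context vector that scores how informative each unit is.
    Returns the (d,) context vector and the (n,) attention weights.
    """
    M = np.tanh(H @ W + b)   # (n, d_a) projected hidden states
    scores = M @ u           # (n,) alignment with context vector u
    alpha = softmax(scores)  # (n,) attention weights, sum to 1
    return alpha @ H, alpha  # weighted sum of hidden states

# Toy example: 5 units with hidden dim 8, attention dim 4 (assumed sizes)
rng = np.random.default_rng(0)
n, d, d_a = 5, 8, 4
H = rng.standard_normal((n, d))
W = rng.standard_normal((d, d_a))
b = rng.standard_normal(d_a)
u = rng.standard_normal(d_a)

ctx, alpha = attention_pool(H, W, b, u)
```

In the hierarchical setting, this pooling would be applied twice: once over word-level representations to form each sentence vector, and once over sentence-level representations to form the thread vector used for summary selection.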
