Question Generation from Paragraphs: A Tale of Two Hierarchical Models

11/08/2019
by Vishwajeet Kumar, et al.

Automatic question generation from paragraphs is an important and challenging problem, made particularly difficult by the long context a paragraph provides. In this paper, we propose and study two hierarchical models for the task of question generation from paragraphs: (a) a novel hierarchical BiLSTM model with selective attention and (b) a novel hierarchical Transformer architecture, both of which learn hierarchical representations of paragraphs. We model a paragraph in terms of its constituent sentences, and a sentence in terms of its constituent words. While the introduction of the selective attention mechanism benefits the hierarchical BiLSTM model, the hierarchical Transformer, with its inherent attention and positional encoding mechanisms, also outperforms the flat Transformer model. We conducted empirical evaluation on the widely used SQuAD and MS MARCO datasets using standard metrics. The results demonstrate the overall effectiveness of the hierarchical models over their flat counterparts. Qualitatively, our hierarchical models are able to generate fluent and relevant questions.
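To make the word-level/sentence-level modeling concrete, here is a minimal sketch of a hierarchical BiLSTM paragraph encoder in PyTorch: a word-level BiLSTM produces sentence vectors, and a sentence-level BiLSTM with an attention layer produces the paragraph representation. The module name, dimensions, mean-pooling, and attention scoring are illustrative assumptions, not the authors' exact configuration.

```python
# A minimal sketch of a hierarchical paragraph encoder (assumed PyTorch implementation).
# Hyperparameters and the sentence-attention scoring are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn


class HierarchicalParagraphEncoder(nn.Module):
    """Encodes a paragraph as sentences of words: a word-level BiLSTM builds
    sentence vectors, and a sentence-level BiLSTM with attention builds the
    paragraph representation."""

    def __init__(self, vocab_size, emb_dim=300, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_lstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.sent_lstm = nn.LSTM(2 * hidden_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # scores each sentence (selective attention over sentences)

    def forward(self, paragraph):
        # paragraph: (num_sentences, max_words) tensor of word ids for a single paragraph
        words = self.embed(paragraph)                           # (S, W, emb_dim)
        word_states, _ = self.word_lstm(words)                  # (S, W, 2*hidden) word-level states
        sent_vecs = word_states.mean(dim=1).unsqueeze(0)        # (1, S, 2*hidden) mean-pooled sentence vectors
        sent_states, _ = self.sent_lstm(sent_vecs)              # (1, S, 2*hidden) sentence-level states
        weights = torch.softmax(self.attn(sent_states), dim=1)  # (1, S, 1) attention weights over sentences
        paragraph_vec = (weights * sent_states).sum(dim=1)      # (1, 2*hidden) attended paragraph vector
        return word_states, sent_states, paragraph_vec


# Usage sketch: a paragraph of 3 sentences, each padded to 10 word ids from a 5k vocabulary.
encoder = HierarchicalParagraphEncoder(vocab_size=5000)
para = torch.randint(1, 5000, (3, 10))
word_states, sent_states, para_vec = encoder(para)
print(para_vec.shape)  # torch.Size([1, 512])
```

A question decoder could then attend over both the word-level and sentence-level states, which is the general idea behind the hierarchical models studied in the paper; the hierarchical Transformer variant replaces the BiLSTMs with self-attention layers and positional encodings.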


