A Gated Self-attention Memory Network for Answer Selection

09/13/2019
by   Tuan Lai, et al.

Answer selection is an important research problem with applications in many areas. Previous deep-learning approaches to the task mainly adopt the Compare-Aggregate architecture, which performs word-level comparison between the question and candidate answer, followed by aggregation. In this work, we depart from the popular Compare-Aggregate architecture and instead propose a new gated self-attention memory network for the task. Combined with a simple transfer learning technique using a large-scale online corpus, our model outperforms previous methods by a large margin, achieving new state-of-the-art results on two standard answer selection datasets: TrecQA and WikiQA.
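The abstract does not spell out how the gating interacts with self-attention, so the following is only a minimal sketch of one plausible gated self-attention layer: a scaled dot-product self-attention summary of the sentence is fused with the original representation through a learned sigmoid gate. All weight names (`Wq`, `Wk`, `Wv`, `Wg`, `Ug`) are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_self_attention(X, Wq, Wk, Wv, Wg, Ug):
    """Sketch of a gated self-attention layer (weight names are assumptions).

    X: (n, d) matrix of token representations for one sentence.
    Returns a (n, d) matrix mixing X with its self-attention summary.
    """
    d = X.shape[-1]
    # Scaled dot-product self-attention over the sentence's own tokens.
    scores = (X @ Wq) @ (X @ Wk).T / np.sqrt(d)      # (n, n)
    A = softmax(scores) @ (X @ Wv)                   # (n, d) attention summary
    # Element-wise gate decides how much of A vs. X to keep.
    g = 1.0 / (1.0 + np.exp(-(X @ Wg + A @ Ug)))     # (n, d) in (0, 1)
    return g * X + (1.0 - g) * A

# Toy usage with random weights.
rng = np.random.default_rng(0)
n, d = 4, 8
X = rng.standard_normal((n, d))
Ws = [rng.standard_normal((d, d)) * 0.1 for _ in range(5)]
Y = gated_self_attention(X, *Ws)
print(Y.shape)  # (4, 8)
```

The gate `g` here acts per dimension, so the layer can pass some features of a token through unchanged while overwriting others with contextual information, which is the usual motivation for gated (rather than additive) fusion.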

