Focused Hierarchical RNNs for Conditional Sequence Processing

06/12/2018
by Nan Rosemary Ke, et al.

Recurrent Neural Networks (RNNs) with attention mechanisms have obtained state-of-the-art results on many sequence processing tasks. Most of these models use a simple form of encoder with attention that looks over the entire sequence and assigns a weight to each token independently. We present a mechanism for focusing RNN encoders on sequence modelling tasks that allows them to attend to key parts of the input as needed. We formulate this using a multi-layer conditional sequence encoder that reads in one token at a time and makes a discrete decision on whether the token is relevant to the context or question being asked. The discrete gating mechanism takes the context embedding and the current hidden state as inputs and controls information flow into the layer above. We train it using policy gradient methods. We evaluate this method on several types of tasks with different attributes. First, we evaluate on synthetic tasks, which allow us to assess the model's generalization ability and to probe the behavior of the gates in more controlled settings. We then evaluate the approach on large-scale question answering tasks, including the challenging MS MARCO and SearchQA tasks. Our model shows consistent improvements on both tasks over prior work and our baselines, and also generalizes significantly better on the synthetic tasks.
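The gating mechanism described above lends itself to a compact sketch. Below is a minimal, hypothetical PyTorch illustration of the idea, not the authors' released code: a lower GRU reads one token at a time, a small MLP over the context embedding and the current lower hidden state samples a Bernoulli gate, and the upper GRU is updated only when the gate fires; the gate log-probabilities are retained so the discrete decisions can be trained with REINFORCE. All names (FocusedEncoder, gate_mlp) and sizes are illustrative assumptions.

```python
# Hypothetical sketch of a focused hierarchical encoder (names are illustrative).
import torch
import torch.nn as nn


class FocusedEncoder(nn.Module):
    """Two-layer RNN encoder with a discrete gate between the layers.

    The lower GRU reads one token per step. A gate MLP looks at the
    question/context embedding and the current lower hidden state and
    samples a Bernoulli decision ("is this token relevant?"); only when
    the gate fires is the lower state written into the upper GRU.
    """

    def __init__(self, emb_dim, hid_dim, ctx_dim):
        super().__init__()
        self.lower = nn.GRUCell(emb_dim, hid_dim)
        self.upper = nn.GRUCell(hid_dim, hid_dim)
        self.gate_mlp = nn.Sequential(
            nn.Linear(hid_dim + ctx_dim, hid_dim),
            nn.Tanh(),
            nn.Linear(hid_dim, 1),
        )

    def forward(self, tokens, context):
        # tokens: (seq_len, batch, emb_dim); context: (batch, ctx_dim)
        batch = tokens.size(1)
        h_lo = tokens.new_zeros(batch, self.lower.hidden_size)
        h_hi = tokens.new_zeros(batch, self.upper.hidden_size)
        gate_log_probs = []
        for x_t in tokens:
            h_lo = self.lower(x_t, h_lo)
            logit = self.gate_mlp(torch.cat([h_lo, context], dim=-1)).squeeze(-1)
            dist = torch.distributions.Bernoulli(logits=logit)
            g = dist.sample()                        # discrete relevance decision
            gate_log_probs.append(dist.log_prob(g))  # kept for REINFORCE
            h_hi_new = self.upper(h_lo, h_hi)
            # Update the upper layer only where the gate fired.
            h_hi = torch.where(g.unsqueeze(-1).bool(), h_hi_new, h_hi)
        return h_hi, torch.stack(gate_log_probs)


# Usage sketch: REINFORCE on the gates, with a toy stand-in for the QA loss.
enc = FocusedEncoder(emb_dim=32, hid_dim=64, ctx_dim=64)
tokens, context = torch.randn(10, 4, 32), torch.randn(4, 64)
summary, log_probs = enc(tokens, context)
task_loss = summary.pow(2).mean()                  # placeholder task objective
reward = -task_loss.detach()                       # lower loss -> higher reward
policy_loss = -(reward * log_probs).mean()         # REINFORCE term for the gates
(task_loss + policy_loss).backward()
```

Because the gate is sampled rather than relaxed, the relevance decision stays discrete at training time; the REINFORCE term pushes the gate policy toward decisions that lower the downstream task loss, mirroring the policy-gradient training described in the abstract.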

