Hierarchical Attention: What Really Counts in Various NLP Tasks

08/10/2018
by Zehao Dou, et al.

Attention mechanisms in sequence-to-sequence models have demonstrated strong performance on a variety of natural language processing (NLP) tasks, such as sentence embedding, text generation, machine translation, and machine reading comprehension. Unfortunately, existing attention mechanisms learn either only high-level or only low-level features. In this paper, we argue that this lack of hierarchy is a bottleneck for further improving attention-based models, and we propose a novel Hierarchical Attention Mechanism (Ham) based on a weighted sum of the outputs of the different layers of a multi-level attention stack. Ham achieves a state-of-the-art BLEU score of 0.26 on the Chinese poem generation task and a nearly 6.5% average improvement over existing machine reading comprehension models such as BiDAF and Match-LSTM. Furthermore, our experiments and theorems show that Ham has greater generalization and representation ability than existing attention mechanisms.
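To make the core idea concrete, below is a minimal sketch of a hierarchical attention module built as the abstract describes: several attention levels stacked on top of each other, with the final output taken as a learned weighted sum over all levels rather than only the last one. The scaled dot-product attention inside each level, the per-level linear projections, and the softmax-normalized mixing weights are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a Hierarchical Attention Mechanism (Ham)-style module:
# stack attention levels and combine their outputs with a learned weighted sum.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalAttention(nn.Module):
    def __init__(self, d_model: int, num_levels: int = 3):
        super().__init__()
        # One attention level = one set of query/key/value projections (assumed form).
        self.levels = nn.ModuleList(
            nn.ModuleDict({
                "q": nn.Linear(d_model, d_model),
                "k": nn.Linear(d_model, d_model),
                "v": nn.Linear(d_model, d_model),
            })
            for _ in range(num_levels)
        )
        # One learned scalar weight per level; softmax keeps them normalized.
        self.level_logits = nn.Parameter(torch.zeros(num_levels))
        self.scale = d_model ** 0.5

    def forward(self, query: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # query:   (batch, q_len, d_model), e.g. question tokens
        # context: (batch, c_len, d_model), e.g. passage tokens
        outputs = []
        x = query
        for level in self.levels:
            q, k, v = level["q"](x), level["k"](context), level["v"](context)
            attn = F.softmax(q @ k.transpose(-2, -1) / self.scale, dim=-1)
            x = attn @ v                  # output of this attention level
            outputs.append(x)             # keep both low- and high-level features
        # Hierarchical output: weighted sum over all levels, not just the last.
        weights = F.softmax(self.level_logits, dim=0)
        return sum(w * o for w, o in zip(weights, outputs))


if __name__ == "__main__":
    ham = HierarchicalAttention(d_model=64, num_levels=3)
    q = torch.randn(2, 5, 64)
    c = torch.randn(2, 20, 64)
    print(ham(q, c).shape)  # torch.Size([2, 5, 64])
```

In this sketch, dropping all but the last entry of `outputs` recovers an ordinary stacked attention model; the learned mixing weights are what let the model blend low-level and high-level features, which is the property the paper attributes to Ham.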
