Neural Attention Models for Sequence Classification: Analysis and Application to Key Term Extraction and Dialogue Act Detection

03/31/2016
by Sheng-Syun Shen, et al.

Recurrent neural network architectures combined with an attention mechanism, known as neural attention models, have recently shown promising performance on tasks including speech recognition, image caption generation, visual question answering, and machine translation. In this paper, a neural attention model is applied to two sequence classification tasks: dialogue act detection and key term extraction. In these tasks, the model input is a sequence, and the output is the label of that sequence. The major difficulty of sequence classification is that a long input sequence can include many noisy or irrelevant parts. If all information in the sequence is treated equally, these parts may degrade classification performance. The attention mechanism is helpful here because it can highlight the parts of the sequence that matter for the classification decision. The experimental results show that, with the attention mechanism, discernible improvements were achieved on the sequence classification tasks considered here. The roles of the attention mechanism in these tasks are further analyzed and visualized in this paper.
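The weighting idea described in the abstract can be sketched in a few lines: score each timestep of the encoded sequence, normalize the scores into attention weights, and classify from the weighted sum. This is a minimal illustrative sketch with randomly initialized parameters, not the paper's exact architecture; the function and variable names are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_classify(hidden_states, w_att, W_out, b_out):
    """Attention-weighted sequence classification (illustrative sketch).

    hidden_states: (T, d) encoder outputs for a length-T input sequence.
    w_att:         (d,)   learned attention query vector.
    W_out, b_out:  (C, d) and (C,) classifier parameters.
    """
    # Score each timestep by its alignment with the attention vector,
    # then normalize so noisy or irrelevant timesteps get small weight.
    scores = hidden_states @ w_att      # (T,)
    alphas = softmax(scores)            # attention weights, sum to 1
    context = alphas @ hidden_states    # (d,) weighted sum of states
    logits = W_out @ context + b_out    # (C,) class scores for the sequence
    return logits, alphas

# Toy usage: 5 timesteps, 4-dim hidden states, 3 classes.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
logits, alphas = attention_classify(h, rng.normal(size=4),
                                    rng.normal(size=(3, 4)), np.zeros(3))
```

Because the weights `alphas` sum to one over the sequence, they can be inspected directly to visualize which timesteps the classifier attended to, which is the kind of analysis the paper performs.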


Related Research

Attention-Based Models for Speech Recognition (06/24/2015)
Recurrent sequence generators conditioned on input data through an atten...

Pre-training Attention Mechanisms (12/15/2017)
Recurrent neural networks with differentiable attention mechanisms have ...

Dialogue Act Classification with Context-Aware Self-Attention (04/04/2019)
Recent work in Dialogue Act classification has treated the task as a seq...

Local Monotonic Attention Mechanism for End-to-End Speech and Language Processing (05/23/2017)
Recently, encoder-decoder neural networks have shown impressive performa...

Focused Hierarchical RNNs for Conditional Sequence Processing (06/12/2018)
Recurrent Neural Networks (RNNs) with attention mechanisms have obtained...

Let's do it "again": A First Computational Approach to Detecting Adverbial Presupposition Triggers (06/11/2018)
We introduce the task of predicting adverbial presupposition triggers su...

Exploring attention mechanism for acoustic-based classification of speech utterances into system-directed and non-system-directed (02/01/2019)
Voice controlled virtual assistants (VAs) are now available in smartphon...
