Temporal Attention-Gated Model for Robust Sequence Classification

12/01/2016
by Wenjie Pei et al.

Typical techniques for sequence classification are designed for well-segmented sequences that have been edited to remove noisy or irrelevant parts. Such methods therefore cannot be easily applied to the noisy sequences expected in real-world applications. In this paper, we present the Temporal Attention-Gated Model (TAGM), which integrates ideas from attention models and gated recurrent networks to better handle noisy or unsegmented sequences. Specifically, we extend the concept of the attention model to measure the relevance of each observation (time step) of a sequence. We then use a novel gated recurrent network to learn the hidden representation for the final prediction. An important advantage of our approach is interpretability, since the temporal attention weights provide a meaningful measure of the salience of each time step in the sequence. We demonstrate the merits of our TAGM approach, in both prediction accuracy and interpretability, on three different tasks: spoken digit recognition, text-based sentiment analysis, and visual event recognition.
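The core idea described above, a scalar attention weight per time step gating a recurrent hidden-state update, can be sketched in a few lines. In this minimal NumPy illustration (not the paper's implementation), the attention weight `a_t` is produced by a simple linear scorer `Wa`, whereas the paper computes it from a richer context; the names `Wa`, `W`, and `U` and the toy dimensions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tagm_forward(x, params):
    """Attention-gated recurrence over a sequence.

    x: array of shape (T, d_in), one observation per time step.
    Returns the final hidden state and the per-step attention weights.
    """
    Wa, W, U = params["Wa"], params["W"], params["U"]
    h = np.zeros(U.shape[0])
    attention = []
    for x_t in x:
        # Scalar attention weight in (0, 1) scoring the relevance of this step
        # (a simplified stand-in for the model's attention module).
        a_t = sigmoid(Wa @ x_t)
        # Candidate hidden state from a plain recurrent update.
        h_cand = np.tanh(W @ x_t + U @ h)
        # The attention weight gates how much this step updates the
        # running representation: irrelevant steps (a_t near 0) are skipped.
        h = (1.0 - a_t) * h + a_t * h_cand
        attention.append(float(a_t))
    return h, attention

# Toy usage: 5 time steps of 3-dimensional observations, hidden size 4.
params = {
    "Wa": rng.normal(size=3),
    "W": rng.normal(size=(4, 3)),
    "U": rng.normal(size=(4, 4)),
}
x = rng.normal(size=(5, 3))
h_final, weights = tagm_forward(x, params)
```

Because each update is a convex combination of the previous hidden state and the candidate, the attention weights are directly readable as per-step salience scores, which is the interpretability property highlighted in the abstract.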


Related research

- Not All Attention Is Needed: Gated Attention Network for Sequence Data (12/01/2019)
- Interacting Attention-gated Recurrent Networks for Recommendation (09/05/2017)
- Semi-Structured Object Sequence Encoders (01/03/2023)
- A Baselined Gated Attention Recurrent Network for Request Prediction in Ridesharing (07/11/2022)
- System Identification with Time-Aware Neural Sequence Models (11/21/2019)
- Focusing on What is Relevant: Time-Series Learning and Understanding using Attention (06/22/2018)
- Gated Convolutional Bidirectional Attention-based Model for Off-topic Spoken Response Detection (04/20/2020)
