
Decay-Function-Free Time-Aware Attention to Context and Speaker Indicator for Spoken Language Understanding

by Jonggu Kim, et al.

To capture salient contextual information for spoken language understanding (SLU) of a dialogue, we propose time-aware models that automatically learn a latent time-decay function over the dialogue history, removing the need for a manually designed decay function. We also propose a method that identifies and labels the current speaker to improve SLU accuracy. In experiments on the benchmark dataset used in the Dialog State Tracking Challenge 4, the proposed models achieved significantly higher F1 scores than state-of-the-art contextual models. Finally, we analyze the effectiveness of the introduced models in detail; the analysis demonstrates that each of the proposed methods individually improves SLU accuracy.
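To illustrate the idea, the following is a minimal sketch of time-aware attention over dialogue history in which the time-decay term is a learned bias rather than a hand-crafted function (e.g. 1/d), combined with a learned speaker-indicator embedding. This is not the paper's exact architecture: the linear parameterization of the time bias, the function names, and all shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def time_aware_attention(history, current, time_w, speaker_ids, speaker_emb):
    """Attend over history utterances with a learned time bias.

    history     : (T, d) embeddings of previous utterances, oldest first
    current     : (d,)   embedding of the current utterance
    time_w      : (2,)   learned parameters mapping time distance to an
                         attention bias (replaces a manual decay function;
                         a linear form is an illustrative assumption)
    speaker_ids : (T,)   0/1 indicator of whether each history utterance
                         was spoken by the current speaker
    speaker_emb : (2, d) learned speaker-indicator embeddings
    """
    T, d = history.shape
    keys = history + speaker_emb[speaker_ids]      # inject speaker indicator
    content = keys @ current / np.sqrt(d)          # content relevance score
    dist = np.arange(T - 1, -1, -1, dtype=float)   # time distance to "now"
    time_bias = time_w[0] * dist + time_w[1]       # learned, not fixed, decay
    attn = softmax(content + time_bias)            # combined attention weights
    context = attn @ keys                          # time-aware context vector
    return context, attn
```

With a negative slope in `time_w`, recent utterances receive higher attention, but the slope itself would be learned jointly with the rest of the model instead of being fixed in advance.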




Modeling Inter-Speaker Relationship in XLNet for Contextual Spoken Language Understanding

We propose two methods to capture relevant history information in a mult...

Learning Context-Sensitive Time-Decay Attention for Role-Based Dialogue Modeling

Spoken language understanding (SLU) is an essential component in convers...

Dynamic Time-Aware Attention to Speaker Roles and Contexts for Spoken Language Understanding

Spoken language understanding (SLU) is an essential component in convers...

Speaker Role Contextual Modeling for Language Understanding and Dialogue Policy Learning

Language understanding (LU) and dialogue policy learning are two essenti...

Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference

Dialogue contexts are proven helpful in the spoken language understandin...

Speaker-Sensitive Dual Memory Networks for Multi-Turn Slot Tagging

In multi-turn dialogs, natural language understanding models can introdu...

Just ASK: Building an Architecture for Extensible Self-Service Spoken Language Understanding

This paper presents the design of the machine learning architecture that...