CASA-NLU: Context-Aware Self-Attentive Natural Language Understanding for Task-Oriented Chatbots

09/18/2019
by   Arshit Gupta, et al.

Natural Language Understanding (NLU) is a core component of dialog systems. It typically involves two tasks - intent classification (IC) and slot labeling (SL), which are then followed by a dialogue management (DM) component. Such NLU systems cater to utterances in isolation, thus pushing the problem of context management to DM. However, contextual information is critical to the correct prediction of intents and slots in a conversation. Prior work on contextual NLU has been limited in terms of the types of contextual signals used and the understanding of their impact on the model. In this work, we propose a context-aware self-attentive NLU (CASA-NLU) model that uses multiple signals, such as previous intents, slots, dialog acts and utterances over a variable context window, in addition to the current user utterance. CASA-NLU outperforms a recurrent contextual NLU baseline on two conversational datasets, yielding a gain of up to 7% on the IC task. Moreover, a non-contextual variant of CASA-NLU achieves state-of-the-art performance on the IC task for standard public datasets - Snips and ATIS.
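The abstract only describes the architecture at a high level. As a rough illustration (not the authors' implementation), the PyTorch sketch below shows one way such a model could combine previous-turn signals (intents, dialog acts, slots) with the current utterance through self-attention and produce joint IC and SL predictions. All class names, layer sizes, and the one-slot-label-per-previous-turn simplification are assumptions made for brevity.

```python
# Minimal sketch of a context-aware self-attentive NLU model (illustrative only).
# Contextual signals from previous turns are embedded, concatenated with the
# current utterance tokens, and mixed by a self-attention encoder; separate
# heads predict the turn-level intent (IC) and per-token slots (SL).
import torch
import torch.nn as nn


class ContextAwareNLU(nn.Module):
    def __init__(self, vocab_size, num_intents, num_slots, num_acts,
                 d_model=128, num_heads=4, num_layers=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # One embedding table per type of contextual signal from previous turns.
        self.intent_emb = nn.Embedding(num_intents + 1, d_model, padding_idx=0)
        self.act_emb = nn.Embedding(num_acts + 1, d_model, padding_idx=0)
        self.slot_emb = nn.Embedding(num_slots + 1, d_model, padding_idx=0)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.intent_head = nn.Linear(d_model, num_intents)  # turn-level intent (IC)
        self.slot_head = nn.Linear(d_model, num_slots)      # per-token slots (SL)

    def forward(self, utterance_ids, prev_intent_ids, prev_act_ids, prev_slot_ids):
        # utterance_ids: (batch, seq_len) token ids of the current utterance
        # prev_*_ids:    (batch, context_window) one id per previous turn (simplified)
        tokens = self.word_emb(utterance_ids)                        # (B, T, d)
        context = torch.cat([self.intent_emb(prev_intent_ids),
                             self.act_emb(prev_act_ids),
                             self.slot_emb(prev_slot_ids)], dim=1)   # (B, 3*W, d)
        # Self-attention jointly attends over context signals and utterance tokens.
        hidden = self.encoder(torch.cat([context, tokens], dim=1))
        ctx_len = context.size(1)
        utt_hidden = hidden[:, ctx_len:]                             # (B, T, d)
        intent_logits = self.intent_head(utt_hidden.mean(dim=1))     # (B, num_intents)
        slot_logits = self.slot_head(utt_hidden)                     # (B, T, num_slots)
        return intent_logits, slot_logits
```

Dropping the context embeddings (passing only the current utterance) yields a non-contextual variant analogous to the one the abstract reports for Snips and ATIS; how the paper actually encodes each signal and sizes the context window is described in the full text.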

