An Efficient Approach to Encoding Context for Spoken Language Understanding

07/01/2018
by Raghav Gupta, et al.

In task-oriented dialogue systems, spoken language understanding, or SLU, refers to the task of parsing natural language user utterances into semantic frames. Making use of context from prior dialogue history holds the key to more effective SLU. State-of-the-art approaches to SLU use memory networks to encode context by processing multiple utterances from the dialogue at each turn, resulting in significant trade-offs between accuracy and computational efficiency. On the other hand, downstream components like the dialogue state tracker (DST) already keep track of the dialogue state, which can serve as a summary of the dialogue history. In this work, we propose an efficient approach to encoding context from prior utterances for SLU. More specifically, our architecture includes a separate recurrent neural network (RNN) based encoding module that accumulates dialogue context to guide the frame parsing sub-tasks and can be shared between SLU and DST. In our experiments, we demonstrate the effectiveness of our approach on dialogues from two domains.
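The sketch below illustrates the general idea described in the abstract: instead of re-encoding multiple past utterances with a memory network at every turn, a recurrent dialogue-context module keeps a single running summary vector that is updated once per turn and fed to the slot tagger. This is a minimal PyTorch sketch under assumed module and dimension names (DialogueContextEncoder, SLUTagger, ctx_dim, etc. are illustrative), not the authors' implementation.

import torch
import torch.nn as nn


class DialogueContextEncoder(nn.Module):
    """Accumulates a summary of the dialogue, one utterance encoding per turn."""

    def __init__(self, utt_dim: int, ctx_dim: int):
        super().__init__()
        self.cell = nn.GRUCell(utt_dim, ctx_dim)

    def forward(self, utt_encoding: torch.Tensor, prev_context: torch.Tensor) -> torch.Tensor:
        # One cheap recurrent update per turn: cost is constant in dialogue length.
        return self.cell(utt_encoding, prev_context)


class SLUTagger(nn.Module):
    """BiGRU slot tagger conditioned on the accumulated dialogue context."""

    def __init__(self, vocab_size: int, emb_dim: int, ctx_dim: int, hidden: int, num_slots: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.utt_enc = nn.GRU(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.context = DialogueContextEncoder(2 * hidden, ctx_dim)
        self.tagger = nn.GRU(emb_dim + ctx_dim, hidden, bidirectional=True, batch_first=True)
        self.slot_out = nn.Linear(2 * hidden, num_slots)

    def forward(self, tokens: torch.Tensor, prev_context: torch.Tensor):
        emb = self.embed(tokens)                       # (batch, seq, emb_dim)
        _, h = self.utt_enc(emb)                       # h: (2, batch, hidden)
        utt_vec = torch.cat([h[0], h[1]], dim=-1)      # utterance summary, (batch, 2*hidden)
        context = self.context(utt_vec, prev_context)  # updated dialogue context
        ctx_tiled = context.unsqueeze(1).expand(-1, tokens.size(1), -1)
        out, _ = self.tagger(torch.cat([emb, ctx_tiled], dim=-1))
        slot_logits = self.slot_out(out)               # per-token slot scores
        return slot_logits, context                    # context is reused at the next turn

At inference time, the context vector returned for turn t is passed back in at turn t+1, so each turn costs one utterance encoding plus a single recurrent update rather than a pass over the full dialogue history; the same accumulated context could, in principle, be shared with a downstream state tracker.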

Related research

05/08/2017 · Sequential Dialogue Context Modeling for Spoken Language Understanding
Spoken Language Understanding (SLU) is a key component of goal oriented ...

12/13/2021 · Attentive Contextual Carryover for Multi-Turn End-to-End Spoken Language Understanding
Recent years have seen significant advances in end-to-end (E2E) spoken l...

07/22/2016 · CFGs-2-NLU: Sequence-to-Sequence Learning for Mapping Utterances to Semantics and Pragmatics
In this paper, we present a novel approach to natural language understan...

11/22/2015 · Non-Sentential Utterances in Dialogue: Experiments in Classification and Interpretation
Non-sentential utterances (NSUs) are utterances that lack a complete sen...

04/30/2020 · Hierarchical Encoders for Modeling and Interpreting Screenplays
While natural language understanding of long-form documents is still an ...

05/26/2020 · History-Aware Question Answering in a Blocks World Dialogue System
It is essential for dialogue-based spatial reasoning systems to maintain...

03/03/2020 · Transfer Learning for Context-Aware Spoken Language Understanding
Spoken language understanding (SLU) is a key component of task-oriented ...
