Capture Salient Historical Information: A Fast and Accurate Non-Autoregressive Model for Multi-turn Spoken Language Understanding

06/24/2022
by Lizhi Cheng, et al.

Spoken Language Understanding (SLU), a core component of task-oriented dialogue systems, calls for fast inference given the impatience of human users. Existing work increases inference speed by designing non-autoregressive models for single-turn SLU tasks, but these models fail to transfer to multi-turn SLU, where the dialogue history must be taken into account. The intuitive workaround is to concatenate all historical utterances and apply a non-autoregressive model directly. However, this approach loses salient historical information and suffers from the uncoordinated-slot problem. To overcome these shortcomings, we propose a novel model for multi-turn SLU named Salient History Attention with Layer-Refined Transformer (SHA-LRT), which comprises an SHA module, a Layer-Refined Mechanism (LRM), and a Slot Label Generation (SLG) task. SHA captures salient historical information for the current dialogue from both historical utterances and historical results via a well-designed history-attention mechanism. LRM predicts preliminary SLU results from the Transformer's middle states and utilizes them to guide the final prediction, and SLG supplies sequential dependency information to the non-autoregressive encoder. Experiments on public datasets indicate that our model significantly improves multi-turn SLU performance while accelerating inference by 17.5 times over the state-of-the-art baseline, and it remains effective on single-turn SLU tasks.
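The abstract only names the components, so the following is a minimal PyTorch-style sketch of the two central ideas as described above: the history attention of SHA and the intermediate-label feedback of LRM. This is not the authors' code; the module names, dimensions, fusion by concatenation, and additive label feedback are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hedged sketch of the SHA and LRM ideas from the abstract; all design
# details beyond "attend over history" and "feed preliminary labels back"
# are assumptions, not the paper's actual architecture.

class SalientHistoryAttention(nn.Module):
    """Attend from the current-utterance encoding over historical turn
    representations to extract salient history (the SHA idea)."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, current: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
        # current: (batch, cur_len, d_model) -- encoding of the current utterance
        # history: (batch, hist_len, d_model) -- encodings of past utterances
        #          and/or embedded past SLU results, concatenated along time
        salient, _ = self.attn(query=current, key=history, value=history)
        # Fuse the retrieved salient history with the current encoding per token.
        return self.fuse(torch.cat([current, salient], dim=-1))


class LayerRefinedBlock(nn.Module):
    """Predict preliminary slot labels from an intermediate Transformer state
    and feed their embeddings back to guide later layers (the LRM idea)."""

    def __init__(self, d_model: int, n_slot_labels: int):
        super().__init__()
        self.pre_classifier = nn.Linear(d_model, n_slot_labels)
        self.label_embed = nn.Embedding(n_slot_labels, d_model)

    def forward(self, hidden: torch.Tensor):
        # hidden: (batch, seq_len, d_model) -- output of a middle layer
        prelim_logits = self.pre_classifier(hidden)         # preliminary SLU result
        prelim_labels = prelim_logits.argmax(dim=-1)        # (batch, seq_len)
        refined = hidden + self.label_embed(prelim_labels)  # guide later layers
        return refined, prelim_logits


if __name__ == "__main__":
    batch, cur_len, hist_len, d_model = 2, 12, 48, 128
    sha = SalientHistoryAttention(d_model)
    lrm = LayerRefinedBlock(d_model, n_slot_labels=72)
    current = torch.randn(batch, cur_len, d_model)
    history = torch.randn(batch, hist_len, d_model)
    fused = sha(current, history)
    refined, prelim = lrm(fused)
    print(fused.shape, refined.shape, prelim.shape)
```

The SLG task mentioned in the abstract would sit on top of such blocks as an auxiliary training objective that injects sequential dependency information; it is omitted here because the abstract gives no further detail about it.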

