Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling

09/06/2016
by Bing Liu, et al.

Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition. In this work, we propose an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding and dialog systems. Unlike in machine translation and speech recognition, alignment is explicit in slot filling. We explore different strategies for incorporating this alignment information into the encoder-decoder framework. Learning from the attention mechanism in the encoder-decoder model, we further propose introducing attention to the alignment-based RNN models. Such attention provides additional information for intent classification and slot label prediction. Our independent task models achieve state-of-the-art intent detection error rate and slot filling F1 score on the benchmark ATIS task. Our joint training model further obtains a 0.56% absolute error reduction on intent detection and a 0.23% absolute gain on slot filling over the independent task models.
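The abstract describes the joint architecture only at a high level. As a rough illustration of that setup, here is a minimal PyTorch sketch (not the authors' implementation): a bidirectional RNN encoder with a single additive attention head, whose weighted summary feeds an intent classifier, while each aligned encoder state, concatenated with that attention context, feeds a per-token slot classifier. Layer sizes, label counts, and the use of one shared attention vector are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of an attention-based bidirectional RNN that jointly predicts
# an utterance-level intent and per-token slot labels. All hyperparameters and
# the single shared attention head are illustrative assumptions.
import torch
import torch.nn as nn


class JointIntentSlotRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_intents, num_slots):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encoder; each input position i yields an aligned state h_i.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden_dim
        # Simple additive attention over encoder states.
        self.attn_score = nn.Linear(enc_dim, 1)
        # Slot classifier sees the aligned state h_i plus the attention context.
        self.slot_out = nn.Linear(2 * enc_dim, num_slots)
        # Intent classifier uses the attention-weighted utterance summary.
        self.intent_out = nn.Linear(enc_dim, num_intents)

    def forward(self, tokens):
        emb = self.embedding(tokens)                    # (batch, seq_len, embed_dim)
        enc, _ = self.encoder(emb)                      # (batch, seq_len, 2*hidden_dim)
        scores = self.attn_score(enc)                   # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)          # attention over positions
        context = (weights * enc).sum(dim=1)            # (batch, 2*hidden_dim)
        # Broadcast the utterance-level context to every position for slot prediction.
        ctx_rep = context.unsqueeze(1).expand_as(enc)
        slot_logits = self.slot_out(torch.cat([enc, ctx_rep], dim=-1))
        intent_logits = self.intent_out(context)
        return slot_logits, intent_logits


# Example joint training step: sum the slot and intent cross-entropy losses.
if __name__ == "__main__":
    model = JointIntentSlotRNN(vocab_size=1000, embed_dim=64, hidden_dim=64,
                               num_intents=21, num_slots=120)
    tokens = torch.randint(0, 1000, (2, 10))
    slot_labels = torch.randint(0, 120, (2, 10))
    intent_labels = torch.randint(0, 21, (2,))
    slot_logits, intent_logits = model(tokens)
    loss = nn.functional.cross_entropy(slot_logits.reshape(-1, 120), slot_labels.reshape(-1)) \
         + nn.functional.cross_entropy(intent_logits, intent_labels)
    loss.backward()
```

Summing the two cross-entropy losses is one simple way to realize the joint training the abstract refers to; the paper's specific strategies for exploiting the explicit slot alignment differ from this sketch.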


