PIN: A Novel Parallel Interactive Network for Spoken Language Understanding

09/28/2020
by Peilin Zhou, et al.

Spoken Language Understanding (SLU) is an essential part of a spoken dialogue system and typically consists of intent detection (ID) and slot filling (SF) tasks. Recently, methods based on recurrent neural networks (RNNs) have achieved state-of-the-art results for SLU. In existing RNN-based approaches, the ID and SF tasks are often modeled jointly to exploit the correlation between them. However, supporting bidirectional and explicit information exchange between ID and SF to obtain better performance remains under-explored. In addition, few studies attempt to capture local context information to enhance SF. Motivated by these observations, this paper proposes the Parallel Interactive Network (PIN) to model the mutual guidance between ID and SF. Specifically, given an utterance, a Gaussian self-attentive encoder generates a context-aware feature embedding of the utterance that captures local context information. Taking this feature embedding as input, a Slot2Intent module and an Intent2Slot module capture the bidirectional information flow between the ID and SF tasks. Finally, a cooperation mechanism fuses the information obtained from the Slot2Intent and Intent2Slot modules to further reduce prediction bias. Experiments on two benchmark datasets, SNIPS and ATIS, demonstrate the effectiveness of our approach, which achieves results competitive with state-of-the-art models. More encouragingly, by using utterance embeddings generated by the pre-trained language model BERT, our method achieves the best results among all compared approaches.
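To make the local-context idea concrete, below is a minimal sketch (not the authors' released code) of a Gaussian self-attention layer: standard scaled dot-product self-attention whose scores are biased by a Gaussian penalty on the distance between positions, so each token attends more heavily to its neighbours. The class name, the fixed sigma hyperparameter, and the single-head setup are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of Gaussian self-attention for local context modeling.
# Assumed names: GaussianSelfAttention, sigma (not from the paper).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianSelfAttention(nn.Module):
    def __init__(self, hidden_dim: int, sigma: float = 2.0):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.scale = math.sqrt(hidden_dim)
        self.sigma = sigma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim)
        seq_len = x.size(1)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.scale  # (batch, L, L)

        # Gaussian bias: penalise attention to distant positions so each
        # token focuses on its local context.
        pos = torch.arange(seq_len, device=x.device, dtype=x.dtype)
        dist = (pos.unsqueeze(0) - pos.unsqueeze(1)) ** 2            # (L, L)
        scores = scores + (-dist / (2.0 * self.sigma ** 2)).unsqueeze(0)

        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)  # context-aware token representations

# Usage: encode a batch of 4 utterances of length 10 with 128-dim features.
encoder = GaussianSelfAttention(hidden_dim=128)
out = encoder(torch.randn(4, 10, 128))
print(out.shape)  # torch.Size([4, 10, 128])
```

In the paper's pipeline, such context-aware token representations would then feed the Slot2Intent and Intent2Slot modules; the sketch only illustrates the encoder's locality bias.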

Related research

09/16/2019 · CM-Net: A Novel Collaborative Memory Network for Spoken Language Understanding
Spoken Language Understanding (SLU) mainly involves two tasks, intent de...

12/12/2018 · Recurrent Neural Networks with Pre-trained Language Model Embedding for Slot Filling Task
In recent years, Recurrent Neural Networks (RNNs) based models have been...

10/08/2020 · A Co-Interactive Transformer for Joint Slot Filling and Intent Detection
Intent detection and slot filling are two main tasks for building a spok...

03/20/2020 · Parallel Intent and Slot Prediction using MLB Fusion
Intent and Slot Identification are two important tasks in Spoken Languag...

10/07/2022 · A Unified Framework for Multi-intent Spoken Language Understanding with prompting
Multi-intent Spoken Language Understanding has great potential for wides...

05/27/2019 · A Self-Attention Joint Model for Spoken Language Understanding in Situational Dialog Applications
Spoken language understanding (SLU) acts as a critical component in goal...

06/20/2017 · Effective Spoken Language Labeling with Deep Recurrent Neural Networks
Understanding spoken language is a highly complex problem, which can be ...
