Multi-grained Label Refinement Network with Dependency Structures for Joint Intent Detection and Slot Filling

09/09/2022
by Baohang Zhou, et al.

Slot filling and intent detection are two fundamental tasks in the field of natural language understanding. Due to the strong correlation between these two tasks, previous studies have made efforts to model them with multi-task learning or to design feature interaction modules that improve the performance of each task. However, none of the existing approaches consider the relevance between the structural information of sentences and the label semantics of the two tasks. The intent and semantic components of an utterance depend on the syntactic elements of the sentence. In this paper, we investigate a multi-grained label refinement network that utilizes dependency structures and label semantic embeddings. To enhance syntactic representations, we introduce the dependency structures of sentences into our model via a graph attention layer. To capture the semantic dependency between the syntactic information and task labels, we combine the task-specific features with the corresponding label embeddings via an attention mechanism. The experimental results demonstrate that our model achieves competitive performance on two public datasets.
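The abstract describes two mechanisms: graph attention over dependency arcs and attention-based fusion of task features with label embeddings. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of those two generic building blocks, with hypothetical module names, dimensions, and a single attention head.

```python
# Illustrative sketch only (hypothetical names/dimensions), not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DependencyGAT(nn.Module):
    """Single-head graph attention over a dependency adjacency matrix."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)
        self.attn = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (batch, seq, dim); adj: (batch, seq, seq) dependency arcs (0/1),
        # assumed to include self-loops so every row has at least one neighbor.
        z = self.proj(h)
        n = z.size(1)
        pairs = torch.cat(
            [z.unsqueeze(2).expand(-1, -1, n, -1),
             z.unsqueeze(1).expand(-1, n, -1, -1)], dim=-1)
        scores = F.leaky_relu(self.attn(pairs)).squeeze(-1)   # (batch, seq, seq)
        scores = scores.masked_fill(adj == 0, float("-inf"))  # keep only arcs
        alpha = torch.softmax(scores, dim=-1)
        return torch.relu(torch.matmul(alpha, z))              # (batch, seq, dim)


class LabelAttentionFusion(nn.Module):
    """Attend token features over label embeddings and fuse the label context back."""

    def __init__(self, dim, num_labels):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, dim)
        self.out = nn.Linear(2 * dim, dim)

    def forward(self, h):
        # h: (batch, seq, dim); label_emb.weight: (num_labels, dim)
        scores = torch.matmul(h, self.label_emb.weight.t())          # (b, s, L)
        ctx = torch.matmul(torch.softmax(scores, -1), self.label_emb.weight)
        fused = self.out(torch.cat([h, ctx], dim=-1))                # (b, s, dim)
        return fused, scores
```

In a joint setup of this kind, encoder token states and a parser-derived adjacency would feed `DependencyGAT`, and a `LabelAttentionFusion` instance per task (slot labels, intent labels) would refine the task-specific features; the exact wiring in the paper may differ.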
