Prior Knowledge Driven Label Embedding for Slot Filling in Natural Language Understanding

03/22/2020
by   Su Zhu, et al.

Traditional slot filling in natural language understanding (NLU) predicts a one-hot vector for each word. This form of label representation lacks semantic correlation modelling, which leads to a severe data sparsity problem, especially when adapting an NLU model to a new domain. To address this issue, a novel label-embedding-based slot filling framework is proposed in this paper. Here, a distributed label embedding is constructed for each slot using prior knowledge. Three encoding methods are investigated to incorporate different kinds of prior knowledge about slots: atomic concepts, slot descriptions, and slot exemplars. The proposed label embeddings share text patterns among slots and reuse data across different slot labels, which makes them useful for adaptive NLU with limited data. Moreover, since the label embedding is independent of the NLU model, it is compatible with almost all deep-learning-based slot filling models. The proposed approaches are evaluated on three datasets. Experiments on single-domain and domain adaptation tasks show that label embedding achieves significant performance improvements over the traditional one-hot label representation as well as advanced zero-shot approaches.
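The core idea can be sketched as follows: instead of a softmax layer that treats each slot as an independent one-hot class, each word's encoding is scored against a matrix of slot label embeddings built from prior knowledge (e.g. averaged word vectors of a slot's description or exemplars). This is a minimal illustrative sketch, not the authors' implementation; all names and dimensions below are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_scores(hidden, label_emb):
    """Score each word against every slot label embedding.

    hidden:    (seq_len, d) word encodings from any encoder (e.g. a BiLSTM);
    label_emb: (num_slots, d) slot label embeddings derived from prior
               knowledge (atomic concepts, descriptions, or exemplars).
    Returns a (seq_len, num_slots) per-word slot distribution.
    """
    logits = hidden @ label_emb.T   # similarity of each word to every slot
    return softmax(logits, axis=-1)

# Illustrative shapes only: 5 words, 7 slot labels, dimension 16.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(5, 16))
label_emb = rng.normal(size=(7, 16))
probs = slot_scores(hidden, label_emb)
print(probs.shape)                          # (5, 7)
print(np.allclose(probs.sum(axis=1), 1.0))  # True: rows sum to 1
```

Because the label embedding matrix is an input rather than trained output weights, new slots can be added at adaptation time by encoding their prior knowledge, without changing the encoder.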


