BERT for Joint Intent Classification and Slot Filling

02/28/2019
by Qian Chen, et al.

Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently, a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), was introduced; it facilitates pre-training deep bidirectional representations on large-scale unlabeled corpora and, after simple fine-tuning, has produced state-of-the-art models for a wide variety of natural language processing tasks. However, there has not been much effort in exploring BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvement in intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models.
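A common way to realize such a joint model is to fine-tune a single BERT encoder with two lightweight heads: the hidden state of the [CLS] token feeds an intent classifier, while the per-token hidden states feed a slot-label classifier, and the two cross-entropy losses are summed for joint training. The code below is a minimal sketch of that setup, not the authors' released implementation; it assumes the Hugging Face transformers library, and the class name JointBert as well as the label counts num_intents and num_slot_labels are illustrative placeholders.

import torch
import torch.nn as nn
from transformers import BertModel

class JointBert(nn.Module):
    """Sketch of joint intent classification and slot filling on top of BERT."""

    def __init__(self, num_intents, num_slot_labels, dropout=0.1):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.dropout = nn.Dropout(dropout)
        # Intent head reads the pooled [CLS] representation; slot head reads every token.
        self.intent_classifier = nn.Linear(hidden, num_intents)
        self.slot_classifier = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask, intent_labels=None, slot_labels=None):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        sequence_output = outputs.last_hidden_state   # (batch, seq_len, hidden)
        pooled_output = outputs.pooler_output         # (batch, hidden), from [CLS]

        intent_logits = self.intent_classifier(self.dropout(pooled_output))
        slot_logits = self.slot_classifier(self.dropout(sequence_output))

        loss = None
        if intent_labels is not None and slot_labels is not None:
            # ignore_index=-100 masks padding and sub-word positions in the slot labels.
            ce = nn.CrossEntropyLoss(ignore_index=-100)
            intent_loss = ce(intent_logits, intent_labels)
            slot_loss = ce(slot_logits.view(-1, slot_logits.size(-1)), slot_labels.view(-1))
            loss = intent_loss + slot_loss  # joint objective: sum of the two losses
        return loss, intent_logits, slot_logits

Under this formulation, a single fine-tuning run optimizes both tasks at once, which is what lets the intent and slot predictions share the encoder's contextual representations.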


