Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling

12/21/2020
by Jixuan Wang et al.

We propose a novel Transformer encoder-based architecture that encodes syntactic knowledge for intent detection and slot filling. Specifically, we inject syntactic knowledge into the Transformer encoder by jointly training it to predict the syntactic parse ancestors and the part-of-speech tag of each token via multi-task learning. Our model is built from self-attention and feed-forward layers and does not require external syntactic information to be available at inference time. Experiments on two benchmark datasets show that our models achieve state-of-the-art results with only two Transformer encoder layers. Compared to the previously best-performing model without pre-training, our models improve slot filling F1 score by an absolute 1.59 on the SNIPS dataset, along with a gain in intent detection accuracy. Our models also achieve absolute F1 score and accuracy improvements of 0.1 and 0.34 for slot filling and intent detection, respectively, on the ATIS dataset, over the previously best-performing model. Furthermore, visualization of the self-attention weights illustrates the benefits of incorporating syntactic information during training.
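To make the multi-task setup concrete, here is a minimal PyTorch sketch of the idea the abstract describes: a small Transformer encoder with main heads for intent detection and slot filling, plus auxiliary heads that predict each token's part-of-speech tag and the position of its syntactic parse ancestor. All names, sizes, and the auxiliary loss weight below are illustrative assumptions, not the authors' exact architecture or hyperparameters.

```python
import torch
import torch.nn as nn

class SyntaxAwareNLU(nn.Module):
    def __init__(self, vocab_size, n_intents, n_slots, n_pos,
                 max_len=64, d_model=128, n_heads=8, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # Only two encoder layers, matching the abstract's claim.
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.intent_head = nn.Linear(d_model, n_intents)  # sentence-level task
        self.slot_head = nn.Linear(d_model, n_slots)      # token-level task
        # Auxiliary heads used only during training to inject syntax;
        # they (and the parser outputs they need) are dropped at inference.
        self.pos_tag_head = nn.Linear(d_model, n_pos)
        self.ancestor_head = nn.Linear(d_model, max_len)  # ancestor position

    def forward(self, token_ids):
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.encoder(self.embed(token_ids) + self.pos_embed(positions))
        return {
            "intent": self.intent_head(h[:, 0]),  # assumes a [CLS]-style first token
            "slots": self.slot_head(h),
            "pos": self.pos_tag_head(h),
            "ancestor": self.ancestor_head(h),
        }

# Joint training sums cross-entropy losses over all four heads; the 0.5
# auxiliary weight is a tunable assumption, not a value from the paper.
B, T = 2, 16
model = SyntaxAwareNLU(vocab_size=1000, n_intents=7, n_slots=40, n_pos=18)
tokens = torch.randint(0, 1000, (B, T))
intent_y = torch.randint(0, 7, (B,))
slot_y = torch.randint(0, 40, (B, T))
pos_y = torch.randint(0, 18, (B, T))
anc_y = torch.randint(0, T, (B, T))  # parse-ancestor position per token

out = model(tokens)
ce = nn.CrossEntropyLoss()
loss = (ce(out["intent"], intent_y)
        + ce(out["slots"].flatten(0, 1), slot_y.flatten())
        + 0.5 * (ce(out["pos"].flatten(0, 1), pos_y.flatten())
                 + ce(out["ancestor"].flatten(0, 1), anc_y.flatten())))
loss.backward()
```

Because the syntactic labels enter only through the auxiliary losses, a parser is needed only to produce training targets; at inference time the model runs on raw token ids alone, which is consistent with the abstract's claim that no external syntactic information is required at inference.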

Related research

09/06/2016 · Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling
Attention-based encoder-decoder neural network models have recently show...

03/19/2023 · CTRAN: CNN-Transformer-based Network for Natural Language Understanding
Intent-detection and slot-filling are the two main tasks in natural lang...

02/28/2019 · BERT for Joint Intent Classification and Slot Filling
Intent classification and slot filling are two essential tasks for natur...

10/08/2020 · A Co-Interactive Transformer for Joint Slot Filling and Intent Detection
Intent detection and slot filling are two main tasks for building a spok...

04/23/2018 · Linguistically-Informed Self-Attention for Semantic Role Labeling
The current state-of-the-art end-to-end semantic role labeling (SRL) mod...

08/11/2023 · Task Conditioned BERT for Joint Intent Detection and Slot-filling
Dialogue systems need to deal with the unpredictability of user intents ...

07/04/2017 · Improving Slot Filling Performance with Attentive Neural Networks on Dependency Structures
Slot Filling (SF) aims to extract the values of certain types of attribu...
