Task Conditioned BERT for Joint Intent Detection and Slot-filling

08/11/2023
by Diogo Tavares, et al.

Dialogue systems must cope with the unpredictability of user intents when tracking dialogue state, and with the heterogeneity of slots when understanding user preferences. In this paper we investigate the hypothesis that solving these challenges with one unified model allows parameters and supporting data to be shared across the different tasks. The proposed principled model is based on a Transformer encoder, trained on multiple tasks, and driven by a rich input that conditions the model on the target inferences. Conditioning the Transformer encoder on multiple target inferences over the same corpus, i.e., intent and multiple slot types, allows it to learn richer language interactions than a single-task model could. Indeed, experimental results demonstrate that conditioning the model on an increasing number of dialogue inference tasks leads to improved results: on the MultiWOZ dataset, joint intent and slot detection improves by 3.2% when conditioning on intent, 10.8% when conditioning on slots, and 14.4% when conditioning on both intent and slots. Moreover, on real conversations with Farfetch customers, the proposed conditioned BERT achieves high joint-goal and intent detection performance throughout a dialogue.
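The core idea of conditioning a shared encoder on the target inference can be illustrated with a minimal sketch. The token names and the conditioning scheme below are illustrative assumptions, not the authors' exact implementation: the point is only that the same utterance is paired with different condition tokens so one model serves all tasks.

```python
# Hypothetical sketch: building task-conditioned inputs for a BERT-style
# encoder. A condition prefix (intent vs. a specific slot type) tells the
# shared encoder which inference to perform over the same utterance.

def build_conditioned_input(utterance: str, tasks: list[str]) -> str:
    """Prepend task-condition tokens to the utterance so a single shared
    encoder is told which target inference (intent, slot type) to make."""
    condition = " ".join(f"[{t.upper()}]" for t in tasks)
    return f"[CLS] {condition} [SEP] {utterance} [SEP]"

# One shared model, conditioned on different target inferences:
intent_input = build_conditioned_input("ship the sneakers to Lisbon", ["intent"])
slot_input = build_conditioned_input("ship the sneakers to Lisbon", ["slot:destination"])
```

Because every task sees the same underlying parameters, supervision for one inference (e.g. slot filling) can benefit the others (e.g. intent detection), which is the transfer effect the paper measures.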


research
05/16/2022

A Fast Attention Network for Joint Intent Detection and Slot Filling on Edge Devices

Intent detection and slot filling are two main tasks in natural language...
research
11/01/2020

Recent Neural Methods on Slot Filling and Intent Classification for Task-Oriented Dialogue Systems: A Survey

In recent years, fostered by deep learning technologies and by the high ...
research
05/18/2023

Generalized Multiple Intent Conditioned Slot Filling

Natural language understanding includes the tasks of intent detection (i...
research
04/10/2022

UniDU: Towards A Unified Generative Dialogue Understanding Framework

With the development of pre-trained language models, remarkable success ...
research
06/10/2016

Conditional Generation and Snapshot Learning in Neural Dialogue Systems

Recently a variety of LSTM-based conditional language models (LM) have b...
research
03/19/2023

CTRAN: CNN-Transformer-based Network for Natural Language Understanding

Intent-detection and slot-filling are the two main tasks in natural lang...
research
12/21/2020

Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling

We propose a novel Transformer encoder-based architecture with syntactic...
