From Disfluency Detection to Intent Detection and Slot Filling

09/17/2022
by Mai Hoang Dao, et al.

We present the first empirical study investigating the influence of disfluency detection on the downstream tasks of intent detection and slot filling. We perform this study for Vietnamese, a low-resource language for which there is neither a previous study nor a public dataset for disfluency detection. First, we extend the fluent Vietnamese intent detection and slot filling dataset PhoATIS by manually adding and annotating contextual disfluencies. Then, we conduct experiments using strong baselines for disfluency detection and joint intent detection and slot filling, which are based on pre-trained language models. We find that: (i) disfluencies negatively affect the performance of the downstream intent detection and slot filling tasks, and (ii) in the disfluent context, the pre-trained multilingual language model XLM-R yields better intent detection and slot filling performance than the pre-trained monolingual language model PhoBERT, which is the opposite of what is generally found in the fluent context.
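As a rough illustration of the kind of baseline the abstract refers to, below is a minimal sketch of a joint intent detection and slot filling model built on a pre-trained encoder such as XLM-R, assuming the Hugging Face transformers library. This is not the authors' code: the class name, label counts, and example utterance are hypothetical, and the sketch only shows the common architecture of one sentence-level intent head plus one token-level slot-tagging head on a shared encoder.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class JointIntentSlotModel(nn.Module):
    """Shared pre-trained encoder with two heads: intent classification and slot tagging."""

    def __init__(self, encoder_name="xlm-roberta-base",
                 num_intents=28, num_slot_tags=120):  # label counts are hypothetical
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Sentence-level head: intent predicted from the first (<s>) token representation.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Token-level head: BIO slot tag predicted for every subword position.
        self.slot_head = nn.Linear(hidden, num_slot_tags)

    def forward(self, input_ids, attention_mask):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        hidden_states = outputs.last_hidden_state          # (batch, seq_len, hidden)
        intent_logits = self.intent_head(hidden_states[:, 0])  # sentence-level logits
        slot_logits = self.slot_head(hidden_states)             # per-token logits
        return intent_logits, slot_logits

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = JointIntentSlotModel()
# Hypothetical Vietnamese utterance ("show me flights from Hanoi to Da Nang").
batch = tokenizer(["cho tôi xem các chuyến bay từ Hà Nội đến Đà Nẵng"],
                  return_tensors="pt")
intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])
print(intent_logits.shape, slot_logits.shape)

In such a setup the two heads are typically trained jointly by summing a cross-entropy loss over intents and a token-level cross-entropy loss over slot tags; swapping the encoder name between a multilingual model (XLM-R) and a monolingual one (PhoBERT, with its own tokenizer) is what the comparison in the abstract refers to.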

