Recurrent Neural Networks with Pre-trained Language Model Embedding for Slot Filling Task

12/12/2018
by Liang Qiu, et al.

In recent years, models based on Recurrent Neural Networks (RNNs) have been applied to the Slot Filling problem of Spoken Language Understanding and have achieved state-of-the-art performance. In this paper, we investigate the effect of incorporating pre-trained language models into RNN-based Slot Filling models. Our evaluation on the Airline Travel Information System (ATIS) corpus shows that, by adding word embedding and language model embedding layers pre-trained on unlabeled corpora, we can significantly reduce the amount of labeled training data while achieving the same level of Slot Filling performance.
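The architecture the abstract describes can be sketched as follows: each token is looked up in two tables, a trainable word-embedding table and a frozen table standing in for language-model embeddings pre-trained on unlabeled text; the two vectors are concatenated and fed to an RNN, whose hidden states are projected to per-token slot-tag logits. All sizes and names below are illustrative assumptions, not the paper's hyperparameters, and the frozen table is filled with random values in place of real pre-trained weights.

```python
import numpy as np

# Hypothetical sizes; the paper's actual hyperparameters are not given here.
VOCAB, WORD_DIM, LM_DIM, HIDDEN, NUM_TAGS = 1000, 64, 32, 128, 10
rng = np.random.default_rng(0)

# Trainable word embeddings, plus a frozen table standing in for
# language-model embeddings pre-trained on unlabeled corpora.
word_emb = rng.normal(scale=0.1, size=(VOCAB, WORD_DIM))
lm_emb = rng.normal(scale=0.1, size=(VOCAB, LM_DIM))  # kept frozen

# Simple Elman RNN parameters and a linear projection to slot tags.
IN_DIM = WORD_DIM + LM_DIM
W_xh = rng.normal(scale=0.1, size=(IN_DIM, HIDDEN))
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_hy = rng.normal(scale=0.1, size=(HIDDEN, NUM_TAGS))

def slot_logits(token_ids):
    """Return one slot-tag logit vector per input token."""
    h = np.zeros(HIDDEN)
    outputs = []
    for t in token_ids:
        # Fuse the trainable and the pre-trained embedding for this token.
        x = np.concatenate([word_emb[t], lm_emb[t]])
        h = np.tanh(x @ W_xh + h @ W_hh)
        outputs.append(h @ W_hy)
    return np.stack(outputs)

tokens = [5, 17, 42, 7]       # a toy 4-token utterance
logits = slot_logits(tokens)
print(logits.shape)           # (4, 10): one tag distribution per token
```

In a real system the RNN would typically be a bidirectional LSTM and the tags would follow an IOB scheme over ATIS slots (e.g. departure city, arrival city); the point of the sketch is only the concatenation step, which lets the labeled-data-hungry part of the model lean on representations learned without slot annotations.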


Related research

09/17/2022 · From Disfluency Detection to Intent Detection and Slot Filling
10/16/2020 · Modeling Token-level Uncertainty to Learn Unknown Concepts in SLU via Calibrated Dirichlet Prior RNN
06/13/2021 · GenSF: Simultaneous Adaptation of Generative Pre-trained Models and Slot Filling
09/28/2020 · PIN: A Novel Parallel Interactive Network for Spoken Language Understanding
04/15/2021 · Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding
08/24/2022 · PSSAT: A Perturbed Semantic Structure Awareness Transferring Method for Perturbation-Robust Slot Filling
08/23/2020 · Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding
