Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding

08/23/2020
by Jun Qi, et al.

This paper proposes to generalize the variational recurrent neural network (RNN), in which variational inference (VI)-based dropout regularization is applied to long short-term memory (LSTM) cells, to more advanced RNN architectures such as the gated recurrent unit (GRU) and bi-directional LSTM/GRU. The new variational RNNs are employed for slot filling, an intriguing but challenging task in spoken language understanding. Experiments on the ATIS dataset suggest that variational RNNs with VI-based dropout regularization significantly outperform baseline RNN systems that use naive dropout regularization in terms of F-measure. In particular, the variational bi-directional LSTM/GRU obtains the best F-measure score.
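The core idea of VI-based (variational) dropout, as introduced by Gal and Ghahramani, is to sample one dropout mask per sequence and reuse it at every time step, for both the inputs and the recurrent connections, rather than resampling the mask at each step as naive dropout does. The sketch below illustrates this masking scheme for an LSTM in PyTorch; it is not the authors' code, and the class name, dropout rate, and tensor layout are assumptions for illustration.

import torch
import torch.nn as nn

class VariationalLSTM(nn.Module):
    """LSTM layer with VI-based (variational) dropout: a single dropout
    mask is sampled per sequence and reused at every time step, for both
    the input and the recurrent hidden state (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, dropout=0.25):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.dropout = dropout  # assumed rate; the paper's setting may differ

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)
        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            # Sample the masks once per sequence; naive dropout would
            # resample them at every time step instead.
            mask_x = x.new_empty(batch, x.size(-1)).bernoulli_(keep) / keep
            mask_h = x.new_empty(batch, self.hidden_size).bernoulli_(keep) / keep
        else:
            mask_x = mask_h = None
        outputs = []
        for t in range(seq_len):
            x_t = x[t] if mask_x is None else x[t] * mask_x
            h_in = h if mask_h is None else h * mask_h
            h, c = self.cell(x_t, (h_in, c))
            outputs.append(h)
        return torch.stack(outputs)  # (seq_len, batch, hidden_size)

For slot filling, such a layer would typically be stacked (possibly run in both directions, as in the bi-directional LSTM/GRU variants the paper evaluates) and followed by a per-token linear classifier over the slot labels.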

