Leveraging Sentence-level Information with Encoder LSTM for Semantic Slot Filling

01/07/2016
by Gakuto Kurata, et al.

Recurrent Neural Networks (RNNs), and one of their specific architectures, Long Short-Term Memory (LSTM), have been widely used for sequence labeling. In this paper, we first enhance LSTM-based sequence labeling to explicitly model label dependencies. We then propose a further enhancement that incorporates global information spanning the whole input sequence. This latter method, the encoder-labeler LSTM, first encodes the whole input sequence into a fixed-length vector with an encoder LSTM, and then uses this encoded vector as the initial state of another LSTM for sequence labeling. Combining these methods, we can predict the label sequence while considering both label dependencies and information from the whole input sequence. In experiments on a slot filling task, an essential component of natural language understanding, using the standard ATIS corpus, we achieved a state-of-the-art F1-score of 95.66%.
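The abstract's architecture is easy to sketch in code. Below is a minimal PyTorch sketch, not the authors' implementation: the module name EncoderLabelerLSTM, all hyperparameters, and the use of the previous label's embedding as an extra labeler input are illustrative assumptions, chosen to reflect the two enhancements the abstract describes (label dependencies, plus a whole-sentence encoding used as the labeler's initial state).

```python
# Minimal sketch of an encoder-labeler LSTM for slot filling (PyTorch).
# Module/variable names and hyperparameters are hypothetical, not those
# of Kurata et al.
import torch
import torch.nn as nn

class EncoderLabelerLSTM(nn.Module):
    def __init__(self, vocab_size, num_labels, embed_dim=100, hidden_dim=100):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.label_embed = nn.Embedding(num_labels, embed_dim)
        # Encoder LSTM: compresses the whole input sequence into its
        # final (hidden, cell) state, i.e., a fixed-length vector.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Labeler LSTM: input is [word embedding; previous-label embedding],
        # one way to explicitly model label dependencies.
        self.labeler = nn.LSTM(2 * embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_labels)

    def forward(self, words, prev_labels):
        # words, prev_labels: (batch, seq_len) integer tensors.
        w = self.word_embed(words)
        # Encode the full sentence; (h_n, c_n) summarize the whole input.
        _, (h_n, c_n) = self.encoder(w)
        # Use the encoded sentence as the labeler's initial state.
        x = torch.cat([w, self.label_embed(prev_labels)], dim=-1)
        h, _ = self.labeler(x, (h_n, c_n))
        return self.out(h)  # (batch, seq_len, num_labels) label scores

# Toy usage: a batch of 2 sentences of length 5.
model = EncoderLabelerLSTM(vocab_size=1000, num_labels=20)
words = torch.randint(0, 1000, (2, 5))
prev_labels = torch.randint(0, 20, (2, 5))  # teacher-forced previous labels
print(model(words, prev_labels).shape)  # torch.Size([2, 5, 20])
```

The sketch uses teacher forcing (gold previous labels) for training; at inference time the previous label would instead come from the model's own prediction at the preceding step.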


Related research:

Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks (09/28/2017)
Sentence-level classification and sequential labeling are two fundamenta...

Neural Models for Sequence Chunking (01/15/2017)
Many natural language understanding (NLU) tasks, such as shallow parsing...

A deep learning approach for understanding natural language commands for mobile service robots (07/09/2018)
Using natural language to give instructions to robots is challenging, si...

Listen, Attend, and Walk: Neural Mapping of Navigational Instructions to Action Sequences (06/12/2015)
We propose a neural sequence-to-sequence model for direction following, ...

Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding (08/23/2020)
This paper proposes to generalize the variational recurrent neural netwo...

Sequence to Sequence Learning with Neural Networks (09/10/2014)
Deep Neural Networks (DNNs) are powerful models that have achieved excel...

Syntax Aware LSTM Model for Chinese Semantic Role Labeling (04/03/2017)
As for semantic role labeling (SRL) task, when it comes to utilizing par...
