Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention

05/30/2016
by Yang Liu, et al.

In this paper, we propose a sentence encoding-based model for recognizing text entailment. In our approach, sentence encoding is a two-stage process. First, average pooling over word-level bidirectional LSTM (biLSTM) hidden states generates a first-stage sentence representation. Second, an attention mechanism replaces average pooling on the same sentence to produce a better representation. Instead of using the target sentence to attend over words in the source sentence, we use the sentence's own first-stage representation to attend over the words that appear in it, a mechanism we call "Inner-Attention". Experiments on the Stanford Natural Language Inference (SNLI) Corpus demonstrate the effectiveness of the Inner-Attention mechanism. With fewer parameters, our model outperforms the existing best sentence encoding-based approach by a large margin.
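The two-stage encoding described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the biLSTM states `H`, the projection matrices `W_y`, `W_h`, and the scoring vector `w` are hypothetical placeholders, and the scoring form (a tanh projection followed by a softmax) is one common way to realize the attention step.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def inner_attention(H, W_y, W_h, w):
    """Two-stage sentence encoding with Inner-Attention (illustrative sketch).

    H   : (T, d) word-level biLSTM hidden states, one row per word
    W_y : (d, d) projection applied to each hidden state
    W_h : (d, d) projection applied to the first-stage representation
    w   : (d,)   scoring vector producing one attention logit per word

    Returns the attended sentence vector and the attention weights.
    """
    # Stage 1: average pooling over the biLSTM states.
    r_avg = H.mean(axis=0)                      # (d,)

    # Stage 2: the sentence attends to its own words, conditioned on
    # the first-stage representation (the "Inner-Attention" step).
    M = np.tanh(H @ W_y + r_avg @ W_h)          # (T, d), row-wise broadcast
    alpha = softmax(M @ w)                      # (T,) attention weights
    return alpha @ H, alpha                     # (d,) attended representation
```

Note that, unlike cross-sentence attention, no second sentence is involved: the query for the attention step is derived from the same sentence being encoded.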


Related research

03/20/2022 · DEIM: An effective deep encoding and interaction model for sentence matching
Natural language sentence matching is the task of comparing two sentence...

08/22/2018 · Dynamic Self-Attention: Computing Attention over Words Dynamically for Sentence Embedding
In this paper, we propose Dynamic Self-Attention (DSA), a new self-atten...

07/11/2017 · Refining Raw Sentence Representations for Textual Entailment Recognition via Attention
In this paper we present the model used by the team Rivercorners for the...

08/27/2018 · Natural Language Inference with Hierarchical BiLSTM Max Pooling Architecture
Recurrent neural networks have proven to be very effective for natural l...

10/11/2022 · Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling
Transformer-based models have achieved great success on sentence pair mo...

02/15/2018 · DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference
We present a novel deep learning architecture to address the natural lan...

06/26/2018 · Enhancing Sentence Embedding with Generalized Pooling
Pooling is an essential component of a wide variety of sentence represen...
