Refining Raw Sentence Representations for Textual Entailment Recognition via Attention

07/11/2017
by   Jorge A. Balazs, et al.

In this paper we present the model used by the team Rivercorners for the 2017 RepEval shared task. First, our model separately encodes a pair of sentences into variable-length representations using a bidirectional LSTM. It then creates fixed-length raw representations by means of simple aggregation functions, which are subsequently refined with an attention mechanism. Finally, it combines the refined representations of both sentences into a single vector used for classification. With this model we obtained a test accuracy of 72.057, outperforming the LSTM baseline and performing comparably to a model that relies on shared information between sentences (ESIM). With an ensemble, the accuracy increased to 72.247.
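The pipeline described in the abstract (encode, aggregate, refine with attention, combine) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the BiLSTM outputs are stood in for by random matrices, the aggregation is mean pooling, and the [u; v; |u−v|; u∘v] combination is a common NLI feature layout assumed here for concreteness.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def refine(hidden_states, raw):
    # hidden_states: (T, d) per-token BiLSTM outputs; raw: (d,) pooled vector.
    scores = hidden_states @ raw      # one attention score per timestep
    weights = softmax(scores)         # normalized attention distribution
    return weights @ hidden_states    # refined fixed-length representation

def combine(u, v):
    # concatenate both sentence vectors with difference/product features
    return np.concatenate([u, v, np.abs(u - v), u * v])

rng = np.random.default_rng(0)
d, Tp, Th = 8, 5, 7
premise_h = rng.normal(size=(Tp, d))      # stand-in for BiLSTM outputs
hypothesis_h = rng.normal(size=(Th, d))
u = refine(premise_h, premise_h.mean(axis=0))       # mean-pool, then refine
v = refine(hypothesis_h, hypothesis_h.mean(axis=0))
features = combine(u, v)                  # vector fed to the classifier
print(features.shape)                     # (32,)
```

The refined vector is a convex combination of the hidden states, so each sentence attends over its own tokens independently; only the final `combine` step mixes information from the two sentences, which is what distinguishes this family of models from cross-attention approaches such as ESIM.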


