
Attentive Tree-structured Network for Monotonicity Reasoning

by   Zeming Chen, et al.

Many state-of-the-art neural models designed for monotonicity reasoning perform poorly on downward inference. To address this shortcoming, we developed an attentive tree-structured neural network: a tree-based long short-term memory network (Tree-LSTM) with soft attention, designed to model the syntactic parse-tree information of the sentence pair in a reasoning task. A self-attentive aggregator aligns the representations of the premise and the hypothesis. We present our model, evaluate it on the Monotonicity Entailment Dataset (MED), and show, and attempt to explain why, it outperforms existing models on MED.
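The core component can be illustrated with a Child-Sum Tree-LSTM cell in which the plain sum over child hidden states is replaced by a soft-attention weighted combination. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation; the weight shapes, the scoring vector `w_att`, and the class name are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class AttentiveTreeLSTMCell:
    """Child-Sum Tree-LSTM cell whose child hidden states are combined
    with soft attention instead of a plain sum (illustrative sketch)."""

    def __init__(self, in_dim, mem_dim):
        self.mem_dim = mem_dim
        # one weight set per gate: input (i), forget (f), output (o), update (u)
        self.W = {g: rng.normal(0, 0.1, (mem_dim, in_dim)) for g in "ifou"}
        self.U = {g: rng.normal(0, 0.1, (mem_dim, mem_dim)) for g in "ifou"}
        self.b = {g: np.zeros(mem_dim) for g in "ifou"}
        self.w_att = rng.normal(0, 0.1, mem_dim)  # attention scoring vector (assumed form)

    def __call__(self, x, child_h, child_c):
        # x: (in_dim,) word embedding at this node
        # child_h, child_c: (num_children, mem_dim); leaves pass empty arrays
        if len(child_h) == 0:
            child_h = np.zeros((1, self.mem_dim))
            child_c = np.zeros((1, self.mem_dim))
        # soft attention over children replaces the plain child-sum
        alpha = softmax(child_h @ self.w_att)           # (num_children,)
        h_tilde = alpha @ child_h                       # attended child summary
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_tilde + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_tilde + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_tilde + self.b["u"])
        # one forget gate per child, as in the Child-Sum Tree-LSTM
        f = sigmoid(child_h @ self.U["f"].T + self.W["f"] @ x + self.b["f"])
        c = i * u + (f * child_c).sum(axis=0)
        h = o * np.tanh(c)
        return h, c, alpha

# combine two child subtrees under one parent node
cell = AttentiveTreeLSTMCell(in_dim=4, mem_dim=8)
x = rng.normal(size=4)
child_h = rng.normal(size=(2, 8))
child_c = rng.normal(size=(2, 8))
h, c, alpha = cell(x, child_h, child_c)
```

Applying such a cell bottom-up over the parse trees of the premise and the hypothesis yields one vector per sentence; the self-attentive aggregator described in the abstract then aligns the two vectors before classification.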

