Textual Entailment with Structured Attentions and Composition

01/04/2017
by Kai Zhao, et al.

Deep learning techniques are increasingly popular in the textual entailment task, overcoming the fragility of traditional discrete models with hard alignments and logics. In particular, the recently proposed attention models (Rocktäschel et al., 2015; Wang and Jiang, 2015) achieve state-of-the-art accuracy by computing soft word alignments between the premise and hypothesis sentences. However, a major limitation remains: this line of work completely ignores syntax and recursion, which are helpful in many traditional approaches. We show that it is beneficial to extend the attention model to tree nodes between premise and hypothesis. More importantly, this subtree-level attention reveals information about the entailment relation. We study the recursive composition of this subtree-level entailment relation, which can be viewed as a soft version of the Natural Logic framework (MacCartney and Manning, 2009). Experiments show that our structured attention and entailment composition model can correctly identify and infer entailment relations from the bottom up, and brings significant improvements in accuracy.
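The subtree-level attention the abstract describes can be sketched as a soft alignment between vector representations of premise and hypothesis tree nodes. The sketch below is a minimal illustration, not the paper's model: it assumes node vectors are already computed (e.g. by a tree-structured encoder) and uses plain dot-product scoring, whereas the paper's scoring function and composition machinery may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def node_attention(premise_nodes, hypothesis_nodes):
    """Soft alignment between tree nodes.

    premise_nodes:    (n_prem, d) vectors, one per premise subtree
    hypothesis_nodes: (n_hyp, d) vectors, one per hypothesis subtree
    Returns an (n_hyp, n_prem) matrix whose rows are attention
    distributions over premise nodes (dot-product scoring is an
    illustrative choice, not the paper's).
    """
    scores = hypothesis_nodes @ premise_nodes.T
    return softmax(scores, axis=-1)

# Toy example: 3 premise subtrees, 2 hypothesis subtrees, dim 4.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 4))
H = rng.normal(size=(2, 4))

A = node_attention(P, H)          # (2, 3), each row sums to 1
context = A @ P                   # attended premise summary per hypothesis node
```

Each hypothesis node thus receives a soft summary of the premise subtrees it aligns to; in the paper this alignment feeds a recursive composition of entailment relations up the hypothesis tree.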


