
Towards logical negation for compositional distributional semantics

05/11/2020
by Martha Lewis, et al.

The categorical compositional distributional model of meaning gives the composition of words into phrases and sentences pride of place. However, it has so far lacked a model of logical negation. This paper takes steps towards providing such an operator, modelling negation as a version of projection onto the subspace orthogonal to a word. We give a small demonstration of the operator's performance in a sentence entailment task.
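The core idea of negation-as-projection can be sketched in a few lines of NumPy. The snippet below is a minimal illustration, not the paper's exact operator: it projects a target vector onto the subspace orthogonal to a (single) word vector, so the component of the target along the negated word is removed. The function name `negate` and the toy vectors are assumptions for illustration.

```python
import numpy as np

def negate(word_vec, target_vec):
    """Project target_vec onto the subspace orthogonal to word_vec.

    A hypothetical sketch of negation-as-orthogonal-projection:
    subtract from target_vec its component along word_vec, leaving
    only the part of the meaning orthogonal to the negated word.
    """
    w = word_vec / np.linalg.norm(word_vec)  # unit vector for the word
    return target_vec - np.dot(target_vec, w) * w

# Toy example: remove the direction of "word" from "target".
word = np.array([1.0, 0.0])
target = np.array([1.0, 1.0])
result = negate(word, target)
# The result has no component along `word` (their dot product is zero).
```

In the paper's setting the projection would be applied within the categorical compositional pipeline (e.g. to tensors for verbs and adjectives), but the single-vector case above conveys the geometric intuition.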

