
From Logical to Distributional Models

12/30/2014
by Anne Preller, et al.

The paper relates two variants of semantic models for natural language, logical functional models and compositional distributional vector space models, by transferring the logic and reasoning from the logical models to the distributional ones. The geometrical operations of quantum logic are reformulated as algebraic operations on vectors. A map from functional models to vector space models makes it possible to compare the meaning of sentences word by word.
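To illustrate the kind of reformulation the abstract refers to, here is a minimal sketch based on standard Birkhoff-von Neumann quantum logic, not on the paper's own definitions: propositions correspond to projectors on a vector space, and the lattice operations of quantum logic become algebraic operations on those projectors. For commuting projectors $P_A$ and $P_B$,

\[
P_{A \wedge B} = P_A P_B, \qquad
P_{\neg A} = I - P_A, \qquad
P_{A \vee B} = P_A + P_B - P_A P_B,
\]

so that checking whether a vector $v$ satisfies a proposition $A$ reduces to the algebraic test $P_A v = v$. The paper's actual construction may use different operations; the equations above only show how geometric lattice operations can be rewritten algebraically.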


Related research

12/31/2010

Concrete Sentence Spaces for Compositional Distributional Models of Meaning

Coecke, Sadrzadeh, and Clark (arXiv:1003.4394v1 [cs.CL]) developed a com...
08/29/2019

Feature2Vec: Distributional semantic modelling of human property knowledge

Feature norm datasets of human conceptual knowledge, collected in survey...
07/25/2017

Analogs of Linguistic Structure in Deep Representations

We investigate the compositional structure of message vectors computed b...
09/01/2017

Variational Inference for Logical Inference

Functional Distributional Semantics is a framework that aims to learn, f...
07/20/2017

High-risk learning: acquiring new word vectors from tiny data

Distributional semantics models are known to struggle with small data. I...
05/12/2021

Conversational Negation using Worldly Context in Compositional Distributional Semantics

We propose a framework to model an operational conversational negation b...
08/15/2019

Vector spaces as Kripke frames

In recent years, the compositional distributional approach in computatio...