Variational Inference for Logical Inference

09/01/2017
by Guy Emerson et al.

Functional Distributional Semantics is a framework that aims to learn, from text, semantic representations which can be interpreted in terms of truth. Here we make two contributions to this framework. The first is to show how a type of logical inference can be performed by evaluating conditional probabilities. The second is to make these calculations tractable by means of a variational approximation. This approximation also enables faster convergence during training, allowing us to close the gap with state-of-the-art vector space models when evaluating on semantic similarity. We demonstrate promising performance on two tasks.
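As a rough illustration of the first contribution, the sketch below (in Python, using an invented discrete latent-entity space rather than the continuous latent entities of the actual model) shows how a hyponymy-style inference such as "dog implies animal" can be read off a conditional probability: the probability that one predicate is true of an entity given that another predicate is true of it. All names and numbers here (p_entity, p_true, prob_b_given_a) are illustrative assumptions, not the paper's implementation.

    # Toy sketch: logical inference as a conditional probability,
    # assuming a small discrete latent-entity space with invented values.
    import numpy as np

    # Prior over three hypothetical latent entities.
    p_entity = np.array([0.5, 0.3, 0.2])

    # Invented truth probabilities: P(predicate is true | entity).
    p_true = {
        "dog":    np.array([0.90, 0.10, 0.20]),
        "animal": np.array([0.95, 0.80, 0.30]),
    }

    def prob_b_given_a(a: str, b: str) -> float:
        """P(b is true of an entity | a is true of it), marginalising over entities."""
        joint = np.sum(p_entity * p_true[a] * p_true[b])   # P(a true, b true)
        marginal = np.sum(p_entity * p_true[a])             # P(a true)
        return joint / marginal

    # A value close to 1 suggests "a implies b" in this probabilistic sense.
    print(prob_b_given_a("dog", "animal"))   # about 0.89 with these toy numbers

In the full model this marginalisation is intractable, which is where the paper's second contribution, the variational approximation, comes in.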


Related research

06/26/2016
Functional Distributional Semantics
Vector space models have become popular in distributional semantics, des...

05/06/2020
Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics
Functional Distributional Semantics provides a linguistically interpreta...

12/30/2014
From Logical to Distributional Models
The paper relates two variants of semantic models for natural language, ...

09/01/2017
Semantic Composition via Probabilistic Model Theory
Semantic composition remains an open problem for vector space models of ...

05/09/2012
Complexity Analysis and Variational Inference for Interpretation-based Probabilistic Description Logic
This paper presents complexity analysis and variational methods for infe...

04/21/2017
Improving Semantic Composition with Offset Inference
Count-based distributional semantic models suffer from sparsity due to u...

06/04/2020
Linguists Who Use Probabilistic Models Love Them: Quantification in Functional Distributional Semantics
Functional Distributional Semantics provides a computationally tractable...
