Context-theoretic Semantics for Natural Language: an Algebraic Framework

09/22/2020
by Daoud Clarke, et al.

Techniques in which words are represented as vectors have proved useful in many applications in computational linguistics; however, there is currently no general semantic formalism for representing meaning in terms of vectors. We present a framework for natural language semantics in which words, phrases, and sentences are all represented as vectors, based on a theoretical analysis that assumes meaning is determined by context. In this analysis, we define a corpus model as a mathematical abstraction of a text corpus. The meaning of a string of words is taken to be a vector representing the contexts in which it occurs in the corpus model. From this assumption we show that the vector representations of words can be considered elements of an algebra over a field. We note that applications of vector spaces to representing word meanings carry an underlying lattice structure; we interpret the partial ordering of the lattice as describing entailment between meanings. We also define the context-theoretic probability of a string and, based on this probability and the lattice structure, a degree of entailment between strings. Together these properties serve as guidelines for constructing semantic representations within the framework. A context theory is an implementation of the framework: strings are represented as vectors with the properties deduced from the theoretical analysis. We show how to incorporate logical semantics into context theories; this allows statistical information about uncertainty to be represented as weighted sums of individual representations. We also use the framework to analyse approaches to recognising textual entailment, ontological representations of meaning, and the representation of syntactic structure. For the latter, we give new algebraic descriptions of link grammar.
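The abstract ties together several notions: context vectors derived from a corpus, a lattice structure on those vectors, a context-theoretic probability, and a degree of entailment. The sketch below shows one minimal way these pieces could fit together, assuming context vectors are raw co-occurrence counts, the lattice meet is the componentwise minimum, and the degree of entailment is the ratio of the meet's weight to the premise's weight. The toy corpus, window size, and function names are illustrative assumptions, not the paper's construction.

from collections import Counter, defaultdict

def context_vectors(corpus, window=2):
    """Build co-occurrence context vectors: for each word, count the
    words appearing within `window` positions of it in the corpus."""
    vectors = defaultdict(Counter)
    for sentence in corpus:
        for i, w in enumerate(sentence):
            for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
                if j != i:
                    vectors[w][sentence[j]] += 1
    return vectors

def meet(u, v):
    """Lattice meet: componentwise minimum of two context vectors
    (the contexts the two expressions share)."""
    return Counter({k: min(u[k], v[k]) for k in u if k in v})

def weight(u):
    """Total weight of a vector; normalising by the corpus size would
    give a context-theoretic probability for the corresponding string."""
    return sum(u.values())

def degree_of_entailment(u, v):
    """Degree to which u entails v: the proportion of u's contexts that
    are shared with v (weight of the meet over the weight of u)."""
    return weight(meet(u, v)) / weight(u) if weight(u) else 0.0

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat is an animal".split(),
    "a dog is an animal".split(),
]
vecs = context_vectors(corpus)
# The measure is asymmetric, as entailment should be:
print(degree_of_entailment(vecs["cat"], vecs["animal"]))  # ~0.33
print(degree_of_entailment(vecs["animal"], vecs["cat"]))  # 0.5

On this toy corpus the degree of entailment is a graded, asymmetric score in [0, 1], which is the shape of the quantity the framework derives from its lattice and probability structure; the full framework additionally equips the vectors with a multiplication, making them an algebra over a field.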
