Aligning Packed Dependency Trees: a theory of composition for distributional semantics

08/25/2016
by David Weir, et al.

We present a new framework for compositional distributional semantics in which the distributional contexts of lexemes are expressed in terms of anchored packed dependency trees. We show that these structures have the potential to capture the full sentential contexts of a lexeme and provide a uniform basis for the composition of distributional knowledge in a way that captures both mutual disambiguation and generalization.
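To make the idea concrete, here is a minimal toy sketch (not the paper's implementation) of the two ingredients the abstract describes: a lexeme's distributional context keyed by dependency paths anchored at that lexeme, and a composition step that re-anchors a dependent's context at its head before merging. All lexemes, arc labels, and counts below are invented for illustration; the paper's actual APT composition is considerably richer (e.g. it also supports intersective merging for mutual disambiguation).

```python
from collections import Counter

# Toy anchored context: dependency paths (tuples of arc labels, read
# outward from the anchor word) mapped to counts of lemmas observed at
# the end of each path. All data is invented.
bank_ctx = {
    ("nsubj-of",): Counter({"lend": 3, "erode": 2}),
    ("amod",): Counter({"central": 4, "muddy": 1}),
}
river_ctx = {
    ("nmod-of",): Counter({"bank": 2, "bed": 1}),
}

def shift(ctx, arc):
    """Re-anchor a dependent's context at the head by prefixing every
    path with the arc that attaches the dependent to the head."""
    return {(arc,) + path: counts for path, counts in ctx.items()}

def compose(head_ctx, dep_ctx, arc):
    """Merge a head's context with the shifted context of its dependent.
    Shared paths have their counts combined, and paths unique to either
    word are kept, so the phrase generalises over both contexts."""
    merged = {path: Counter(c) for path, c in head_ctx.items()}
    for path, counts in shift(dep_ctx, arc).items():
        merged.setdefault(path, Counter()).update(counts)
    return merged

# Compose "bank" with the dependent "river" attached via an nmod arc,
# as in the phrase "bank of the river".
phrase = compose(bank_ctx, river_ctx, "nmod")
```

The shifted paths keep the dependent's knowledge reachable from the head's anchor point, which is what lets the composed structure retain full sentential context rather than collapsing to a single vector.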
