Improving Semantic Composition with Offset Inference

04/21/2017
by Thomas Kober, et al.

Count-based distributional semantic models suffer from sparsity due to unobserved but plausible co-occurrences in any text collection. This problem is amplified for models such as Anchored Packed Trees (APTs), which take the grammatical type of a co-occurrence into account. We therefore introduce a novel form of distributional inference that exploits the rich type structure in APTs and infers missing data by the same mechanism that is used for semantic composition.
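To make the idea concrete, here is a minimal Python sketch of typed co-occurrence vectors, the offset operation that composition relies on, and neighbour-based inference of unobserved counts. The feature encoding (path-prefixed lemmas with a leading "_" for inverse edges), the helper names offset, compose, and infer, and the 0.5 damping weight are all assumptions of this sketch, not the authors' implementation.

```python
from collections import Counter

# Feature keys pair a dependency path with a context lemma, e.g.
# "amod:strict" = "has the adjectival modifier 'strict'"; a leading
# "_" marks an inverse edge. This encoding is an illustrative
# assumption, not the paper's notation.

def inverse(relation):
    """Invert a single dependency edge label."""
    return relation[1:] if relation.startswith("_") else "_" + relation

def offset(vector, relation):
    """View a typed vector from the node reached by following 'relation':
    matching paths lose their first edge, all other paths are
    re-anchored behind the inverse edge."""
    shifted = Counter()
    prefix = relation + ":"
    for feature, count in vector.items():
        if feature.startswith(prefix):
            shifted[feature[len(prefix):]] += count
        else:
            shifted[inverse(relation) + ":" + feature] += count
    return shifted

def compose(dependent, head, relation):
    """APT-style composition: offset the dependent into the head's
    position, then merge the aligned vectors by adding counts."""
    return offset(dependent, inverse(relation)) + head

def infer(vector, neighbours, weight=0.5):
    """Offset inference: enrich a sparse vector with down-weighted counts
    for features that only its distributional neighbours exhibit."""
    inferred = Counter(vector)
    for neighbour in neighbours:
        for feature, count in neighbour.items():
            if feature not in vector:   # fill unobserved entries only
                inferred[feature] += weight * count
    return inferred

# 'law' was never seen with the modifier 'strict', but its neighbour
# 'rule' was, so a plausible count is inferred before composing
# "strict law".
law    = Counter({"amod:new": 3.0, "_dobj:obey": 2.0})
rule   = Counter({"amod:new": 2.0, "amod:strict": 1.0})
strict = Counter({"_amod:rule": 1.0})

law = infer(law, [rule])
print(compose(strict, law, "amod")["amod:strict"])  # 0.5, an inferred count
```

The key point the sketch tries to capture is that inference and composition share one mechanism: both operate over offset views of the same typed vectors, so counts inferred from neighbours are immediately usable when vectors are aligned and merged.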

Related research

Improving Sparse Word Representations with Distributional Inference for Semantic Composition (08/24/2016)
Distributional models are derived from co-occurrences in a corpus, where...

Aligning Packed Dependency Trees: a theory of composition for distributional semantics (08/25/2016)
We present a new framework for compositional distributional semantics in...

Modeling Semantic Plausibility by Injecting World Knowledge (04/02/2018)
Distributional data tells us that a man can swallow candy, but not that ...

Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics (05/06/2020)
Functional Distributional Semantics provides a linguistically interpreta...

Composing and Embedding the Words-as-Classifiers Model of Grounded Semantics (11/08/2019)
The words-as-classifiers model of grounded lexical semantics learns a se...

A comprehensive comparative evaluation and analysis of Distributional Semantic Models (05/20/2021)
Distributional semantics has deeply changed in the last decades. First, ...

Variational Inference for Logical Inference (09/01/2017)
Functional Distributional Semantics is a framework that aims to learn, f...
