Tensors over Semirings for Latent-Variable Weighted Logic Programs

06/07/2020
by Esma Balkir, et al.

Semiring parsing is an elegant framework for describing parsers using semiring-weighted logic programs. In this paper we present a generalization of this concept: latent-variable semiring parsing. In our framework, any semiring-weighted logic program can be latentified by transforming its weights from scalar values of a semiring to rank-n arrays, or tensors, of semiring values, allowing latent variables to be modelled within the semiring parsing framework. A semiring proves too strong a notion when dealing with tensors, and we must resort to a weaker structure: a partial semiring. We prove that this generalization preserves all the desired properties of the original semiring framework while strictly increasing its expressiveness.
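The core idea of the abstract can be sketched in a few lines: scalar semiring weights on deduction items are replaced by tensors of semiring values, and combining two items becomes a semiring matrix product that marginalizes over a shared latent index. The sketch below is a hypothetical illustration using the probability semiring, not the paper's actual construction; all function and variable names are invented for exposition.

```python
import numpy as np

# Probability semiring (R>=0, +, *, 0, 1), used here for illustration only.
def semiring_plus(a, b):
    return a + b

def semiring_times(a, b):
    return a * b

# Ordinary semiring parsing: combining two deduction items multiplies
# their scalar weights.
w1, w2 = 0.5, 0.4
scalar_combined = semiring_times(w1, w2)  # 0.5 * 0.4 = 0.2

# "Latentified" version: each weight becomes a rank-2 tensor (matrix) over
# the semiring, indexed by latent states. Combination is the semiring
# matrix product, which sums out the shared latent index k.
W1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])
W2 = np.array([[0.4, 0.2],
               [0.1, 0.6]])
tensor_combined = W1 @ W2  # entry (i, j) = sum_k W1[i, k] * W2[k, j]

# Note: this product is only defined when the inner dimensions agree,
# which is one intuition for why tensors over a semiring form a weaker,
# *partial* semiring rather than a full semiring.
```

Setting both latent dimensions to 1 recovers the original scalar semiring computation, which matches the abstract's claim that the generalization strictly extends the classical framework.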


