Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs

05/01/2022
by   Songlin Yang, et al.

Hidden Markov Models (HMMs) and Probabilistic Context-Free Grammars (PCFGs) are widely used structured models, both of which can be represented as factor graph grammars (FGGs), a powerful formalism capable of describing a wide range of models. Recent research has found it beneficial to use large state spaces for HMMs and PCFGs. However, inference with large state spaces is computationally demanding, especially for PCFGs. To tackle this challenge, we leverage tensor rank decomposition (a.k.a. canonical polyadic decomposition, CPD) to reduce the computational complexity of inference for a subset of FGGs that subsumes HMMs and PCFGs. We apply CPD to the factors of an FGG and then construct a new FGG defined in the rank space. Inference with the new FGG produces the same result but has a lower time complexity when the rank size is smaller than the state size. Experiments on HMM language modeling and unsupervised PCFG parsing show better performance than previous work. Our code is publicly available at <https://github.com/VPeterV/RankSpace-Models>.
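To illustrate the core idea in the simplest setting, the toy sketch below (not the paper's implementation; all sizes and variable names are illustrative) builds an HMM whose transition matrix has an explicit rank-r factorization A = U V, then runs the forward algorithm two ways: the standard pass contracts through the n-dimensional state space at O(T n^2) per sequence, while the rank-space pass contracts through the r-dimensional rank space at O(T n r), yielding the same likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, T = 8, 3, 6                      # states, rank, sequence length (toy sizes)

# Hypothetical rank-r transition matrix A = U @ V, with U (n x r), V (r x n).
U = rng.random((n, r))
V = rng.random((r, n))
U = U / (U @ V).sum(axis=1, keepdims=True)   # rescale rows so A is row-stochastic
A = U @ V                                    # still rank r after rescaling

pi = np.full(n, 1.0 / n)               # uniform start distribution
B = rng.random((n, 4))                 # emission probabilities, 4-symbol alphabet
B /= B.sum(axis=1, keepdims=True)
obs = rng.integers(0, 4, size=T)       # a random observation sequence

# Standard forward pass: each step is alpha @ A, an O(n^2) contraction.
alpha = pi * B[:, obs[0]]
for t in range(1, T):
    alpha = (alpha @ A) * B[:, obs[t]]
lik_standard = alpha.sum()

# Rank-space forward pass: route the same contraction through the rank
# dimension, (alpha @ U) @ V, two O(n * r) steps instead of one O(n^2) step.
alpha = pi * B[:, obs[0]]
for t in range(1, T):
    alpha = ((alpha @ U) @ V) * B[:, obs[t]]
lik_rank = alpha.sum()

print(np.isclose(lik_standard, lik_rank))   # prints True
```

The saving comes purely from the order of tensor contraction: since A never has to be materialized during inference, the same trick scales to the PCFG inside algorithm, where the state-space contraction is cubic and the gain is correspondingly larger.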


Related research

01/08/2022
Low-Rank Constraints for Fast Inference in Structured Models
Structured distributions, i.e. distributions over combinatorial spaces, ...

04/07/2022
Modeling Label Correlations for Second-Order Semantic Dependency Parsing with Mean-Field Inference
Second-order semantic parsing with end-to-end mean-field inference has b...

12/18/2022
Unsupervised Discontinuous Constituency Parsing with Mildly Context-Sensitive Grammars
We study grammar induction with mildly context-sensitive grammars for un...

11/09/2020
Scaling Hidden Markov Language Models
The hidden Markov model (HMM) is a fundamental tool for sequence modelin...

06/10/2023
Pusℍ: Concurrent Probabilistic Programming with Function Spaces
We introduce a prototype probabilistic programming language (PPL) called...

05/11/2018
TensOrMachine: Probabilistic Boolean Tensor Decomposition
Boolean tensor decomposition approximates data of multi-way binary relat...

05/29/2021
Cherry-Picking Gradients: Learning Low-Rank Embeddings of Visual Data via Differentiable Cross-Approximation
We propose an end-to-end trainable framework that processes large-scale ...
