Recovering Joint Probability of Discrete Random Variables from Pairwise Marginals

by Shahana Ibrahim et al.

Learning the joint probability of random variables (RVs) lies at the heart of statistical signal processing and machine learning. However, direct nonparametric estimation of a high-dimensional joint probability is in general impossible, due to the curse of dimensionality. Recent work has proposed to recover the joint probability mass function (PMF) of an arbitrary number of RVs from three-dimensional marginals, leveraging the algebraic properties of low-rank tensor decomposition and the (unknown) dependence among the RVs. Nonetheless, accurately estimating three-dimensional marginals can still be costly in terms of sample complexity, which limits the performance of this line of work in sample-starved regimes. Using three-dimensional marginals also entails challenging tensor decomposition problems whose tractability is unclear. This work puts forth a new framework for learning the joint PMF using only pairwise marginals, which naturally enjoy a lower sample complexity than their third-order counterparts. A coupled nonnegative matrix factorization (CNMF) framework is developed, and its joint PMF recovery guarantees under various conditions are analyzed. The method also features a Gram-Schmidt (GS)-like algorithm that exhibits competitive runtime performance and is shown to provably recover the joint PMF up to bounded error in finitely many iterations, under reasonable conditions. It is further shown that a recently proposed economical expectation maximization (EM) algorithm provably improves upon the GS-like algorithm's output, further boosting accuracy and efficiency. Real-data experiments showcase the effectiveness of the proposed approach.
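The algebraic fact underlying this line of work is that, under a low-rank (naive Bayes) model with latent prior λ and per-variable conditional matrices A_n, every pairwise marginal takes the factored form X_jk = A_j diag(λ) A_k^T, so the pairwise marginals jointly encode the full PMF. The sketch below illustrates this identity numerically; it is a toy model check under assumed dimensions (N variables, alphabet size I, rank F), not the paper's CNMF algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
N, I, F = 4, 3, 2  # N discrete RVs, alphabet size I, latent rank F (toy sizes)

# Latent prior lambda (length F) and conditionals A_n (I x F, columns sum to 1)
lam = rng.dirichlet(np.ones(F))
A = [rng.dirichlet(np.ones(I), size=F).T for _ in range(N)]

# Full joint PMF tensor: P(z_1,...,z_N) = sum_f lam_f * prod_n A_n[z_n, f]
joint = np.zeros((I,) * N)
for f in range(F):
    term = lam[f]
    for n in range(N):
        term = np.multiply.outer(term, A[n][:, f])  # rank-1 component
    joint += term

# Pairwise marginal of RVs j and k in factored form: X_jk = A_j diag(lam) A_k^T
j, k = 0, 2
X_jk = A[j] @ np.diag(lam) @ A[k].T

# Marginalizing the full joint over the remaining axes gives the same matrix
other_axes = tuple(n for n in range(N) if n not in (j, k))
marginal = joint.sum(axis=other_axes)
print(np.allclose(marginal, X_jk))  # True
```

Recovering the shared factors {A_n} and λ from all such X_jk simultaneously is exactly the coupled factorization problem the CNMF framework addresses.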




