Learning Multivariate CDFs and Copulas using Tensor Factorization

10/13/2022
by Magda Amiridi, et al.

Learning the multivariate distribution of data is a core challenge in statistics and machine learning. Traditional methods aim for the probability density function (PDF) and are limited by the curse of dimensionality. Modern neural methods are mostly based on black-box models and lack identifiability guarantees. In this work, we aim to learn multivariate cumulative distribution functions (CDFs), as they can handle mixed random variables, allow efficient box probability evaluation, and have the potential to overcome local sample scarcity owing to their cumulative nature. We show that any grid-sampled version of a joint CDF of mixed random variables admits a universal representation as a naive Bayes model via the Canonical Polyadic (tensor-rank) decomposition. By introducing a low-rank model, either directly in the raw data domain or indirectly in a transformed (Copula) domain, the resulting model affords efficient sampling, closed-form inference and uncertainty quantification, and comes with uniqueness guarantees under relatively mild conditions. We demonstrate the superior performance of the proposed model on several synthetic and real datasets and applications, including regression, sampling, and data imputation. Interestingly, our experiments with real data show that it is possible to obtain better density/mass estimates indirectly via a low-rank CDF model than via a low-rank PDF/PMF model.
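To make the key structural idea concrete, here is a minimal sketch (not code from the paper) of what a rank-R Canonical Polyadic model of a grid-sampled joint CDF looks like, and why box probabilities become cheap under it: the CDF is a naive Bayes mixture F(x) = Σ_r w_r Π_d F_{d,r}(x_d), so a box probability factors across variables within each rank-one component. All variable names, dimensions, and the random factors below are illustrative assumptions, not the authors' implementation.

```
import numpy as np

# Illustrative rank-R CP / naive Bayes model of a grid-sampled joint CDF
# over D variables with G grid points each (all sizes are assumptions).
rng = np.random.default_rng(0)
D, R, G = 3, 4, 50

# Mixture weights on the probability simplex, and per-variable factor
# matrices: column r of factors[d] is a monotone, normalized CDF on the grid.
weights = rng.dirichlet(np.ones(R))
factors = [np.sort(rng.uniform(size=(G, R)), axis=0) for _ in range(D)]
factors = [f / f[-1] for f in factors]  # force each marginal CDF to end at 1

def joint_cdf(idx):
    """Evaluate F at grid indices idx = (i_1, ..., i_D) via the CP model."""
    prod = np.ones(R)
    for d, i in enumerate(idx):
        prod *= factors[d][i]           # row i of factor d, across all R components
    return float(weights @ prod)

def box_probability(lo, hi):
    """P(lo < X <= hi): O(D*R) work under the rank-R structure,
    versus 2^D signed corner evaluations for a generic joint CDF."""
    prod = np.ones(R)
    for d in range(D):
        prod *= factors[d][hi[d]] - factors[d][lo[d]]
    return float(weights @ prod)

print(joint_cdf((G - 1,) * D))                      # ~1.0 at the top grid corner
print(box_probability((10, 5, 20), (40, 30, 45)))   # probability of an axis-aligned box
```

The sketch only shows evaluation; fitting the weights and factor CDFs from data (and the Copula-domain variant) is the subject of the paper itself.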
