Adaptive joint distribution learning

10/10/2021
by Damir Filipović et al.

We develop a new framework for embedding (joint) probability distributions in tensor product reproducing kernel Hilbert spaces (RKHS). This framework accommodates a low-dimensional, positive, and normalized model of a Radon-Nikodym derivative, which can be estimated from sample sizes of up to several million data points, alleviating the inherent limitations of RKHS modeling. Well-defined, normalized, and positive conditional distributions are natural by-products of our approach. The embedding is fast to compute and naturally accommodates learning problems ranging from prediction to classification. The theoretical findings are supplemented by favorable numerical results.
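To make the embedding idea concrete, the following is a minimal sketch of the *standard* empirical kernel mean embedding of a joint distribution in a tensor product RKHS — not the paper's adaptive low-dimensional model. It assumes Gaussian kernels k and l on the two marginals; the tensor product kernel then factorises, so the squared RKHS norm of the empirical embedding mu_hat = (1/n) sum_i k(x_i, .) ⊗ l(y_i, .) reduces to a double sum over the two Gram matrices.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).

    x: (n, d) array, y: (m, d) array; returns the (n, m) Gram matrix.
    """
    diff = x[:, None, :] - y[None, :, :]
    return np.exp(-np.sum(diff**2, axis=-1) / (2.0 * sigma**2))

def joint_embedding_norm_sq(X, Y, sigma=1.0):
    """Squared RKHS norm of the empirical joint mean embedding.

    For mu_hat = (1/n) sum_i k(x_i, .) ⊗ l(y_i, .), the tensor product
    structure gives ||mu_hat||^2 = (1/n^2) sum_{i,j} k(x_i, x_j) l(y_i, y_j),
    i.e. the mean of the entrywise product of the two Gram matrices.
    """
    n = len(X)
    Kx = gaussian_kernel(X, X, sigma)  # Gram matrix of the X-sample
    Ky = gaussian_kernel(Y, Y, sigma)  # Gram matrix of the Y-sample
    return float(np.sum(Kx * Ky) / n**2)

# Toy sample from a strongly dependent joint law (names here are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y = X + 0.1 * rng.normal(size=(200, 1))
norm_sq = joint_embedding_norm_sq(X, Y)
print(norm_sq)  # squared norm of the empirical joint embedding, in (0, 1]
```

The n-by-n Gram matrices make this quadratic in the sample size, which is one of the limitations the paper's low-dimensional model is designed to alleviate.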

