Kernel Conditional Density Operators

by Ingmar Schuster et al.

We introduce a conditional density estimation model termed the conditional density operator. It naturally captures multivariate, multimodal output densities and is competitive with recent neural conditional density models and Gaussian processes. To derive the model, we propose a novel approach to reconstructing probability densities from their kernel mean embeddings, drawing connections to the estimation of Radon-Nikodym derivatives in a reproducing kernel Hilbert space (RKHS). We prove finite-sample error bounds that are independent of the problem dimensionality. Finally, we apply the resulting conditional density model to real-world data and demonstrate its versatility and competitive performance.
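The abstract describes recovering a conditional density p(y | x) from a kernel mean embedding. As an illustration only (not the authors' conditional density operator), the following sketch estimates a regularized conditional mean embedding via kernel ridge regression over the inputs and then decodes it as a weighted, renormalized sum of kernels centered at the training outputs. All function names, the bandwidth `sigma`, and the regularizer `reg` are hypothetical choices for this toy example.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def conditional_density_estimate(X, Y, x_query, y_grid, sigma=0.5, reg=1e-3):
    """Toy kernel-based estimate of p(y | x = x_query) on a grid.

    Weights come from a regularized conditional mean embedding
    (kernel ridge regression in the input space); the embedding is
    then evaluated as a weighted sum of kernels centered at the
    training outputs and renormalized to integrate to ~1 on y_grid.
    """
    n = X.shape[0]
    Kx = gaussian_kernel(X, X, sigma)
    kx = gaussian_kernel(X, x_query[None, :], sigma)        # (n, 1)
    w = np.linalg.solve(Kx + reg * n * np.eye(n), kx)       # embedding weights
    Ky = gaussian_kernel(y_grid, Y, sigma)                  # (m, n)
    p = np.clip((Ky @ w).ravel(), 0.0, None)                # unnormalized density
    dz = y_grid[1, 0] - y_grid[0, 0]
    return p / (p.sum() * dz)                               # normalize on the grid

# Toy bimodal data: Y clusters around -1 and +1 regardless of X.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
Y = np.sign(rng.standard_normal((200, 1))) + 0.1 * rng.standard_normal((200, 1))
grid = np.linspace(-3, 3, 301)[:, None]
p = conditional_density_estimate(X, Y, np.array([0.0]), grid)
```

Because the estimate is a mixture of kernels at the observed outputs, it captures the two modes at -1 and +1 directly, which a single Gaussian predictive density would not.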




