Kernel Mean Embedding of Distributions: A Review and Beyond

05/31/2016
by   Krikamol Muandet, et al.
A Hilbert space embedding of a distribution---in short, a kernel mean embedding---has recently emerged as a powerful tool for machine learning and statistical inference. The basic idea behind this framework is to map distributions into a reproducing kernel Hilbert space (RKHS) in which the whole arsenal of kernel methods can be extended to probability measures. It can be viewed as a generalization of the original "feature map" common to support vector machines (SVMs) and other kernel methods. While initially closely associated with the latter, it has since found application in fields ranging from kernel machines and probabilistic modeling to statistical inference, causal discovery, and deep learning. The goal of this survey is to give a comprehensive review of existing work and recent advances in this research area, and to discuss the most challenging issues and open problems that could lead to new research directions. The survey begins with a brief introduction to the RKHS and positive definite kernels, which form the backbone of this survey, followed by a thorough discussion of the Hilbert space embedding of marginal distributions, its theoretical guarantees, and a review of its applications. Embedding distributions enables us to apply RKHS methods to probability measures, which prompts a wide range of applications such as kernel two-sample testing, independence testing, and learning on distributional data. Next, we discuss the Hilbert space embedding of conditional distributions, give theoretical insights, and review some applications. The conditional mean embedding enables us to perform the sum, product, and Bayes' rules---which are ubiquitous in graphical models, probabilistic inference, and reinforcement learning---in a nonparametric way. We then discuss relationships between this framework and other related areas. Lastly, we give some suggestions on future research directions.
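To make the marginal embedding concrete: the RKHS distance between two embedded distributions is the maximum mean discrepancy (MMD), the statistic behind the kernel two-sample testing mentioned above. The sketch below estimates the squared MMD from samples with a Gaussian RBF kernel; the bandwidth `gamma`, sample sizes, and the use of the simple biased estimator are illustrative assumptions, not choices prescribed by the survey.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix of the Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq_dists)

def mmd2(X, Y, gamma=1.0):
    """Biased estimate of the squared MMD, i.e. ||mu_P - mu_Q||^2 in the RKHS:
    mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
# Two samples from the same distribution vs. samples from shifted distributions.
same = mmd2(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2(rng.normal(size=(200, 2)), rng.normal(loc=3.0, size=(200, 2)))
print(same < diff)  # the mismatched pair yields a larger discrepancy
```

In a two-sample test, the observed statistic would be compared against a null distribution (e.g. obtained by permuting the pooled samples) rather than read off directly.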

