Two-stage Sampled Learning Theory on Distributions

02/07/2014
by Zoltán Szabó, et al.

We focus on the distribution regression problem: regressing to a real-valued response from a probability distribution. Although there exist a large number of similarity measures between distributions, very little is known about their generalization performance in specific learning tasks. Learning problems formulated on distributions have an inherent two-stage sampled difficulty: in practice only samples from sampled distributions are observable, and one has to build an estimate on similarities computed between sets of points. To the best of our knowledge, the only existing method with consistency guarantees for distribution regression requires kernel density estimation as an intermediate step (which suffers from slow convergence issues in high dimensions), and requires the domain of the distributions to be compact Euclidean. In this paper, we provide theoretical guarantees for a remarkably simple algorithmic alternative to solve the distribution regression problem: embed the distributions to a reproducing kernel Hilbert space, and learn a ridge regressor from the embeddings to the outputs. Our main contribution is to prove the consistency of this technique in the two-stage sampled setting under mild conditions (on separable, topological domains endowed with kernels). For a given total number of observations, we derive convergence rates as an explicit function of the problem difficulty. As a special case, we answer a 15-year-old open question: we establish the consistency of the classical set kernel [Haussler, 1999; Gärtner et al., 2002] in regression, and cover more recent kernels on distributions, including those due to [Christmann and Steinwart, 2010].
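The pipeline the abstract describes — represent each observed sample set by its kernel mean embedding and learn a ridge regressor on top — reduces in practice to kernel ridge regression with the set kernel, i.e. the Gram matrix of averaged pairwise base-kernel evaluations between bags. The following is a minimal sketch of that two-stage estimator; the Gaussian base kernel, the function names, and the parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) base kernel between rows of A and B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def set_kernel(bags_a, bags_b, gamma=1.0):
    # Set kernel = inner product of empirical mean embeddings:
    # K[i, j] = mean_{x in bag_a_i, x' in bag_b_j} k(x, x').
    return np.array([[gaussian_kernel(a, b, gamma).mean() for b in bags_b]
                     for a in bags_a])

def fit_distribution_ridge(bags, y, lam=1e-4, gamma=1.0):
    # Stage 2: kernel ridge regression on the bag-level Gram matrix.
    n = len(bags)
    K = set_kernel(bags, bags, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return alpha

def predict_distribution_ridge(alpha, train_bags, test_bags, gamma=1.0):
    # Predict by evaluating the set kernel between test and training bags.
    return set_kernel(test_bags, train_bags, gamma) @ alpha
```

As a sanity check, one can regress the mean of each sampled distribution: with bags drawn from N(m, 0.3²) and response y = m, the fitted regressor recovers the training responses closely. Note that only samples from the distributions ever enter the computation, which is exactly the two-stage sampled setting analyzed in the paper.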


