Consistent Interpolating Ensembles via the Manifold-Hilbert Kernel

05/19/2022
by Yutong Wang, et al.

Recent research in the theory of overparametrized learning has sought to establish generalization guarantees in the interpolating regime. Such results have been established for a few common classes of methods, but so far not for ensemble methods. We devise an ensemble classification method that simultaneously interpolates the training data and is consistent for a broad class of data distributions. To this end, we define the manifold-Hilbert kernel for data distributed on a Riemannian manifold. We prove that kernel smoothing regression using the manifold-Hilbert kernel is weakly consistent in the setting of Devroye et al. (1998). For the sphere, we show that the manifold-Hilbert kernel can be realized as a weighted random partition kernel, which arises as an infinite ensemble of partition-based classifiers.
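To make the interpolation property concrete: in the Euclidean setting of Devroye et al. (1998), the Hilbert kernel is K(u) = 1/||u||^d, whose weight on a training point diverges as the query approaches it, so the Nadaraya-Watson estimate reproduces the training labels exactly. The sketch below is a minimal illustration of that Euclidean case, not the paper's manifold-Hilbert construction; the function name and interface are our own.

```python
import numpy as np

def hilbert_kernel_smoother(X_train, y_train, x):
    """Nadaraya-Watson regression with the Hilbert kernel K(u) = 1/||u||^d.

    Because the kernel weight on a training point diverges as x
    approaches it, the estimate interpolates the training labels.
    """
    d = X_train.shape[1]
    dists = np.linalg.norm(X_train - x, axis=1)
    # Exact hit on a training point: return its label (interpolation).
    hit = np.flatnonzero(dists == 0)
    if hit.size:
        return float(y_train[hit[0]])
    w = dists ** (-d)  # Hilbert kernel weights
    return float(np.dot(w, y_train) / w.sum())

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = np.array([0.0, 1.0, 1.0])
print(hilbert_kernel_smoother(X, y, np.array([0.0, 0.0])))  # interpolates: 0.0
print(hilbert_kernel_smoother(X, y, np.array([0.5, 0.5])))  # weighted average in (0, 1)
```

Away from the training set, the estimate is a convex combination of the labels, which is why plugging in a sign threshold yields an interpolating yet (under the paper's conditions) consistent classifier.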


