Consistency of Interpolation with Laplace Kernels is a High-Dimensional Phenomenon

12/28/2018
by Alexander Rakhlin, et al.

We show that minimum-norm interpolation in the Reproducing Kernel Hilbert Space (RKHS) corresponding to the Laplace kernel is not consistent if the input dimension is constant. The lower bound holds for any choice of kernel bandwidth, even one selected based on the data. The result supports the empirical observation that minimum-norm interpolation (that is, an exact fit to the training data) in an RKHS generalizes well for some high-dimensional datasets, but not for low-dimensional ones.
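To make the setup concrete, here is a minimal sketch of the minimum-norm Laplace-kernel interpolant discussed in the abstract. The helper names, bandwidth, and toy data are illustrative assumptions, not taken from the paper; only the closed form f(x) = Σ_i α_i k(x, x_i) with α = K⁻¹ y, the standard minimum-RKHS-norm exact fit, is standard kernel-methods material.

```python
# Minimal sketch of minimum-norm interpolation with a Laplace kernel.
# Function names, bandwidth, and the toy data below are illustrative.
import numpy as np

def laplace_kernel(X, Z, bandwidth=1.0):
    """Laplace kernel: k(x, z) = exp(-||x - z|| / bandwidth)."""
    dists = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
    return np.exp(-dists / bandwidth)

def min_norm_interpolant(X_train, y_train, bandwidth=1.0):
    """Return the minimum-RKHS-norm function that exactly fits the data.

    The interpolant is f(x) = sum_i alpha_i * k(x, x_i) with
    alpha = K^{-1} y, where K is the training kernel matrix.
    """
    K = laplace_kernel(X_train, X_train, bandwidth)
    alpha = np.linalg.solve(K, y_train)  # exact fit: f(x_i) = y_i
    return lambda X: laplace_kernel(X, X_train, bandwidth) @ alpha

# Toy usage in low dimension (d = 1), the regime where the paper's
# lower bound rules out consistency for any bandwidth choice.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.3 * rng.standard_normal(200)
f = min_norm_interpolant(X, y, bandwidth=0.5)
print(f(X[:5]) - y[:5])  # ~0: the interpolant fits training data exactly
```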


Related research

11/10/2021 - Tight bounds for minimum l1-norm interpolation of noisy data
We provide matching upper and lower bounds of order σ^2/log(d/n) for the...

08/07/2020 - Generalization error of minimum weighted norm and kernel interpolation
We study the generalization error of functions that interpolate prescrib...

11/09/2021 - Harmless interpolation in regression and classification with structured features
Overparametrized neural networks tend to perfectly fit noisy training da...

08/01/2018 - Just Interpolate: Kernel "Ridgeless" Regression Can Generalize
In the absence of explicit regularization, Kernel "Ridgeless" Regression...

09/03/2020 - Kernel Interpolation of High Dimensional Scattered Data
Data sites selected from modeling high-dimensional problems often appear...

10/22/2020 - Principled Interpolation in Normalizing Flows
Generative models based on normalizing flows are very successful in mode...

01/28/2021 - Interpolating Classifiers Make Few Mistakes
This paper provides elementary analyses of the regret and generalization...
