
On the Risk of Minimum-Norm Interpolants and Restricted Lower Isometry of Kernels

08/27/2019
by Tengyuan Liang, et al.

We study the risk of minimum-norm interpolants of data in a Reproducing Kernel Hilbert Space whose kernel is defined as a function of the inner product. Our upper bounds on the risk exhibit a multiple-descent shape across the scalings d = n^α, α∈(0,1), of the input dimension d with the sample size n. At the heart of our analysis is a study of spectral properties of the random kernel matrix restricted to a filtration of eigen-spaces of the population covariance operator. Since gradient flow on appropriately initialized wide neural networks converges to a minimum-norm interpolant, the analysis also yields estimation guarantees for these models.
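As a concrete illustration (not the paper's experiment), the minimum-norm interpolant of data (X, y) in the RKHS of a kernel k is f(x) = k(x, X) K⁻¹ y, where K is the n × n kernel matrix K_ij = k(x_i, x_j). The sketch below uses an inner-product kernel k(x, x') = exp(⟨x, x'⟩ / d), an illustrative choice of the kernel class the abstract describes; the data and dimensions are arbitrary.

```python
import numpy as np

# Inner-product kernel: a function of <a, b> only (illustrative choice).
def kernel(A, B, d):
    return np.exp(A @ B.T / d)

rng = np.random.default_rng(0)
n, d = 50, 10                      # sample size and input dimension
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

K = kernel(X, X, d)                # n x n kernel matrix
alpha = np.linalg.solve(K, y)      # coefficients of the interpolant

def f(x):
    # Evaluate the minimum-norm interpolant at new points x (m x d).
    return kernel(x, X, d) @ alpha

# The interpolant fits the training data exactly (up to numerical error).
print(np.max(np.abs(f(X) - y)))
```

The spectrum of K as d and n scale jointly is precisely what the paper's analysis studies; for a well-conditioned K the solve above recovers an exact interpolant of the training set.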


Related research

- Minimum complexity interpolation in random features models (03/30/2021)
- Excess risk bounds for multitask learning with trace norm regularization (12/06/2012)
- Deep Linear Networks can Benignly Overfit when Shallow Ones Do (09/19/2022)
- The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning (03/03/2011)
- On Reproducing Kernel Banach Spaces: Generic Definitions and Unified Framework of Constructions (01/04/2019)