The Predictive Normalized Maximum Likelihood for Over-parameterized Linear Regression with Norm Constraint: Regret and Double Descent

02/14/2021
by   Koby Bibas, et al.

A fundamental tenet of learning theory is that a trade-off exists between the complexity of a prediction rule and its ability to generalize. The double descent phenomenon shows that modern machine learning models do not obey this paradigm: beyond the interpolation limit, the test error declines as model complexity increases. We investigate over-parameterization in linear regression using the recently proposed predictive normalized maximum likelihood (pNML) learner, which is the min-max regret solution for individual data. We derive an upper bound on its regret and show that if the test sample lies mostly in a subspace spanned by the eigenvectors associated with the large eigenvalues of the empirical correlation matrix of the training data, the model generalizes despite its over-parameterized nature. We demonstrate the use of the pNML regret as a point-wise learnability measure on synthetic data and show that it successfully predicts the double descent phenomenon using UCI datasets.
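The subspace condition in the abstract can be illustrated numerically. The sketch below (an illustration under our own assumptions, not the paper's exact derivation) builds an over-parameterized design matrix and compares the quadratic form x^T (X^T X)^+ x, which typically governs regret-style bounds in linear-regression pNML analyses, for a test direction aligned with a large eigenvalue of the empirical correlation matrix versus one aligned with the smallest nonzero eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50                          # over-parameterized: fewer samples than features
X = rng.normal(size=(n, p))
X *= np.linspace(1.0, 10.0, p)         # spread the spectrum across features

C = X.T @ X / n                        # empirical correlation matrix (p x p)
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order

pinv = np.linalg.pinv(X.T @ X)         # pseudo-inverse handles the rank-deficient case

def quad(x):
    """Quadratic form x^T (X^T X)^+ x; small when x lies in large-eigenvalue directions."""
    return float(x @ pinv @ x)

x_top = eigvecs[:, -1]                 # direction of the largest eigenvalue
x_small = eigvecs[:, p - n]            # smallest *nonzero* eigenvalue (rank of X is n)

# A test point aligned with the top of the spectrum yields a much smaller
# quadratic form, i.e. a smaller regret-style quantity, than one aligned
# with a weak direction of the training data.
print(quad(x_top), quad(x_small))
```

For an eigenvector v_i of C with eigenvalue lambda_i > 0, the quadratic form evaluates to 1 / (n * lambda_i), so it shrinks exactly in the large-eigenvalue directions the abstract singles out; directions outside the training span are what the paper's norm constraint is meant to control.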


Related research

06/17/2022  Beyond Ridge Regression for Distribution-Free Data
In supervised batch learning, the predictive normalized maximum likeliho...

05/12/2019  A New Look at an Old Problem: A Universal Learning Approach to Linear Regression
Linear regression is a classical paradigm in statistics. A new look at i...

10/18/2021  Single Layer Predictive Normalized Maximum Likelihood for Out-of-Distribution Detection
Detecting out-of-distribution (OOD) samples is vital for developing mach...

11/20/2020  Efficient Data-Dependent Learnability
The predictive normalized maximum likelihood (pNML) approach has recentl...

04/28/2019  Deep pNML: Predictive Normalized Maximum Likelihood for Deep Neural Networks
The Predictive Normalized Maximum Likelihood (pNML) scheme has been rece...

06/29/2021  Predictive Model Degrees of Freedom in Linear Regression
Overparametrized interpolating models have drawn increasing attention fr...

02/28/2021  Asymptotic Risk of Overparameterized Likelihood Models: Double Descent Theory for Deep Neural Networks
We investigate the asymptotic risk of a general class of overparameteriz...
