Predictive Model Degrees of Freedom in Linear Regression

06/29/2021
by Bo Luan et al.

Overparametrized interpolating models have drawn increasing attention in machine learning. Some recent studies suggest that regularized interpolating models can generalize well. This phenomenon seemingly contradicts the conventional wisdom that interpolation tends to overfit the data and perform poorly on test data. Further, it appears to defy the bias-variance trade-off. One shortcoming of the existing theory is that the classical notion of model degrees of freedom fails to explain the intrinsic differences among interpolating models, since it focuses on estimating in-sample prediction error. This motivates an alternative measure of model complexity that can differentiate between interpolating models and take different test points into account. In particular, we propose a measure with a proper adjustment based on the squared covariance between the predictions and observations. Our analysis of the least squares method reveals some interesting properties of the measure, which can reconcile the "double descent" phenomenon with the classical theory. This opens the door to an extended definition of model degrees of freedom in modern predictive settings.
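For a concrete reference point: the classical degrees of freedom of a least squares fit equals the trace of the hat matrix H = X(X'X)^{-1}X', which is also the sum over observations of Cov(yhat_i, y_i)/sigma^2; this is exactly the in-sample notion the abstract argues cannot distinguish interpolating models. The sketch below is not the authors' code and does not implement their proposed squared-covariance adjustment; it only computes the classical quantity and simulates a toy "double descent" test-error curve for minimum-norm least squares. Function names, sample sizes, and the noise level are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def hat_matrix_df(X):
        """Classical df for least squares: trace of H = X (X'X)^{-1} X', i.e. rank(X)."""
        H = X @ np.linalg.pinv(X)          # pinv also handles rank deficiency
        return np.trace(H)

    def min_norm_test_mse(n_train=40, n_test=500, p_total=120, sigma=0.5, reps=30):
        """Average test MSE of the minimum-norm least squares fit as the number of
        features used grows past n_train (the interpolation threshold)."""
        beta = rng.normal(size=p_total) / np.sqrt(p_total)   # dense true signal
        errors = []
        for p in range(1, p_total + 1):
            mse = 0.0
            for _ in range(reps):
                Xtr = rng.normal(size=(n_train, p_total))
                Xte = rng.normal(size=(n_test, p_total))
                ytr = Xtr @ beta + sigma * rng.normal(size=n_train)
                yte = Xte @ beta + sigma * rng.normal(size=n_test)
                # fit on the first p features only; pinv returns the
                # minimum-norm solution once p > n_train (interpolation)
                bhat = np.linalg.pinv(Xtr[:, :p]) @ ytr
                mse += np.mean((Xte[:, :p] @ bhat - yte) ** 2)
            errors.append(mse / reps)
        return errors

    if __name__ == "__main__":
        X = rng.normal(size=(40, 10))
        print("classical df (trace of hat matrix):", hat_matrix_df(X))   # ~10
        errs = min_norm_test_mse()
        # test error typically peaks near p = n_train and falls again beyond it
        print("test MSE at p = 20, 40, 120:", errs[19], errs[39], errs[-1])

Plotting the returned errors against p typically shows test error rising as p approaches n_train and falling again past the interpolation threshold, while the classical df is capped at n_train and assigns the same complexity to every interpolating fit; that gap is what the measure proposed in the paper is meant to address.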


