Interpolation Learning With Minimum Description Length

02/14/2023
by   Naren Sarayu Manoj, et al.

We prove that the Minimum Description Length learning rule exhibits tempered overfitting. We obtain tempered, agnostic finite-sample learning guarantees and characterize the asymptotic behavior in the presence of random label noise.
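To make the learning rule concrete: the MDL principle selects the hypothesis minimizing a two-part code length, bits to describe the hypothesis plus bits to describe the labels the hypothesis gets wrong. The toy sketch below (my own illustration, not the paper's setting: a threshold class on integer points and ad-hoc code lengths) shows the tempered behavior qualitatively, in that the rule absorbs a flipped label as a cheap exception rather than distorting the model to fit it.

```python
import math

def mdl_threshold(xs, ys):
    """Pick the threshold classifier h_t(x) = [x >= t] minimizing a
    two-part code length: bits to name the threshold plus bits to
    list the indices of the misclassified points."""
    n = len(xs)
    best = None
    for t in range(n + 1):
        errors = sum(1 for x, y in zip(xs, ys) if (x >= t) != y)
        # log2(n+1) bits encode the threshold; each exception costs
        # roughly log2(n) bits to name its index.
        length = math.log2(n + 1) + errors * math.log2(max(n, 2))
        if best is None or length < best[0]:
            best = (length, t, errors)
    return best  # (total bits, chosen threshold, training errors)

xs = list(range(10))
ys_noisy = [0, 0, 1, 0, 0, 1, 1, 1, 1, 1]  # true threshold 5, one flipped label at x=2
print(mdl_threshold(xs, ys_noisy))  # keeps t=5 and pays for one exception
```

Under this encoding, moving the threshold to cover the flipped point would introduce more exceptions than it removes, so the minimum-length hypothesis stays at the true threshold; the noise is absorbed into the data part of the code rather than the model.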


Related research

07/28/2023 · Noisy Interpolation Learning with Shallow Univariate ReLU Networks
We study the asymptotic overfitting behavior of interpolation with minim...

04/16/2023 · Regression and Algorithmic Information Theory
In this paper we prove a theorem about regression, in that the shortest ...

07/11/2016 · Minimum Description Length Principle in Supervised Learning with Application to Lasso
The minimum description length (MDL) principle in supervised learning is...

10/31/2021 · Minimum Description Length Recurrent Neural Networks
We train neural networks to optimize a Minimum Description Length score,...

03/05/2021 · Rissanen Data Analysis: Examining Dataset Characteristics via Description Length
We introduce a method to determine if a certain capability helps to achi...

06/12/2020 · On the Impact of Finite-Length Probabilistic Shaping on Fiber Nonlinear Interference
The interplay of shaped signaling and fiber nonlinearities is reviewed i...

01/12/2020 · Finite-Sample Analysis of Image Registration
We study the problem of image registration in the finite-resolution regi...
