
Undecidability of Underfitting in Learning Algorithms

02/04/2021
by Sonia Sehra, et al.

Using recent machine learning results that present an information-theoretic perspective on underfitting and overfitting, we prove that deciding whether an encodable learning algorithm will always underfit a dataset, even if given unlimited training time, is undecidable. We discuss the importance of this result and potential topics for further research, including information-theoretic and probabilistic strategies for bounding learning algorithm fit.
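To make the flavor of such an undecidability argument concrete, here is a minimal, hedged sketch of the standard reduction style these results typically rely on (the paper's actual construction is not reproduced here, and the names `make_learner` and `program_halts_within` are hypothetical). The idea: encode an arbitrary program into a learner that memorizes its dataset if and only if the program halts, and otherwise emits a constant predictor. A decision procedure for "does this learner always underfit, given unlimited training time?" would then decide the halting problem, which is impossible.

```python
def make_learner(program_halts_within):
    """Build a toy learner parameterized by a step-indexed halting
    predicate: program_halts_within(t) is True iff the encoded program
    halts within t simulation steps. Hypothetical construction, for
    illustration only."""
    def learner(dataset, train_steps):
        # Simulate the encoded program for `train_steps` steps.
        if program_halts_within(train_steps):
            # Program halted: memorize the dataset exactly (no underfitting).
            return dict(dataset)
        # Program still running: return a constant predictor, which
        # underfits any dataset with more than one distinct label.
        return {x: 0 for x, _ in dataset}
    return learner
```

With unlimited training time, the learner built from a halting program eventually stops underfitting, while the one built from a non-halting program underfits forever; deciding between the two cases is exactly deciding halting.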
