Fitting Elephants

03/31/2021
by Partha P. Mitra, et al.

Textbook wisdom advocates smooth function fits and implies that interpolating noisy data should lead to poor generalization. A related heuristic is that the number of fitting parameters should be smaller than the number of measurements (Occam's Razor). Surprisingly, contemporary machine learning (ML) approaches, e.g. deep neural networks (DNNs), generalize well despite interpolating noisy data. This may be understood via Statistically Consistent Interpolation (SCI), i.e. data-interpolation techniques that generalize optimally for big data. In this article we elucidate SCI using the weighted interpolating nearest neighbors (wiNN) algorithm, which adds singular weight functions to kNN (k-nearest neighbors). This shows that data interpolation can be a valid ML strategy for big data. SCI clarifies the relation between two ways of modeling natural phenomena: the rationalist approach (strong priors) of theoretical physics, with few parameters, and the empiricist approach (weak priors) of modern ML, with more parameters than data. SCI shows that the purely empirical approach can predict successfully. However, data interpolation does not provide theoretical insights, and its training-data requirements may be prohibitive. Complex animal brains lie between these extremes, with many parameters but modest training data, and with prior structure encoded in species-specific mesoscale circuitry. Thus, modern ML provides a distinct epistemological approach, different both from physical theories and from animal brains.
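The wiNN idea mentioned above can be sketched compactly: like kNN regression, the prediction averages the labels of the k nearest training points, but with weights that diverge as the query approaches a training point, so the fit passes exactly through the (possibly noisy) data while still smoothing away from it. The following is a minimal illustrative sketch, not the authors' implementation; the function name, the inverse-power weight `d^(-alpha)`, and the parameter defaults are assumptions chosen for clarity.

```python
import numpy as np

def winn_predict(X_train, y_train, x_query, k=5, alpha=2.0, eps=1e-12):
    """Weighted interpolating nearest-neighbor regression (sketch).

    The weight ||x - x_i||^(-alpha) is singular at each training point,
    so the estimator interpolates the training labels exactly while
    averaging over the k nearest neighbors elsewhere.
    """
    # Distances from the query to every training point
    d = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(d)[:k]            # indices of the k nearest neighbors
    d_k = d[idx]
    if d_k[0] < eps:                   # query coincides with a training point:
        return y_train[idx[0]]         # return its label exactly (interpolation)
    w = d_k ** (-alpha)                # singular weight function
    return np.dot(w, y_train[idx]) / w.sum()
```

Away from the data the estimate behaves like a locally weighted average, which is why, for large samples, such interpolating rules can still be statistically consistent.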

