Fitting Elephants

03/31/2021
by Partha P. Mitra, et al.

Textbook wisdom advocates for smooth function fits and implies that interpolating noisy data should lead to poor generalization. A related heuristic is that the number of fitting parameters should be smaller than the number of measurements (Occam's Razor). Surprisingly, contemporary machine learning (ML) approaches, such as deep neural networks (DNNs), generalize well despite interpolating noisy data. This may be understood via Statistically Consistent Interpolation (SCI), i.e., data interpolation techniques that generalize optimally for big data. In this article we elucidate SCI using the weighted interpolating nearest neighbors (wiNN) algorithm, which adds singular weight functions to k-nearest neighbors (kNN). This shows that data interpolation can be a valid ML strategy for big data. SCI clarifies the relation between two ways of modeling natural phenomena: the rationalist approach of theoretical physics (strong priors, few parameters) and the empiricist approach of modern ML (weak priors, more parameters than data). SCI shows that the purely empirical approach can predict successfully. However, data interpolation does not provide theoretical insights, and its training data requirements may be prohibitive. Complex animal brains lie between these extremes, with many parameters but modest training data, and with prior structure encoded in species-specific mesoscale circuitry. Thus, modern ML provides a distinct epistemological approach, different both from physical theories and from animal brains.
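The wiNN idea can be sketched concretely: for a query point, take its k nearest training points and average their labels with weights that diverge as the distance to a neighbor goes to zero, so the fit passes exactly through every (noisy) training label while still averaging over several neighbors away from the data. The sketch below is illustrative only; the function name winn_predict, the inverse-power weight d^(-alpha), and the parameter defaults are assumptions for illustration, not the exact construction or exponents analyzed in the paper.

```python
import numpy as np

def winn_predict(X_train, y_train, X_query, k=10, alpha=2.0, eps=1e-12):
    """Sketch of weighted interpolating nearest-neighbor (wiNN) regression.

    For each query point, average the labels of its k nearest neighbors
    with singular weights w_i = dist_i ** (-alpha).  Because the weight
    diverges as the query approaches a training point, the fit interpolates
    the training labels exactly, yet still averages over neighbors
    elsewhere.  Names and defaults here are illustrative assumptions.
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.asarray(X_query, dtype=float)

    preds = np.empty(len(X_query))
    for j, x in enumerate(X_query):
        dists = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(dists)[:k]          # indices of the k nearest neighbors
        d = dists[nn]
        if d[0] < eps:                      # query coincides with a training point:
            preds[j] = y_train[nn[0]]       # return its label exactly (interpolation)
            continue
        w = d ** (-alpha)                   # singular weights
        preds[j] = np.dot(w, y_train[nn]) / w.sum()
    return preds


# Tiny demo: noisy samples of a smooth function are interpolated exactly,
# yet predictions at new points remain sensible local averages.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

print(np.allclose(winn_predict(X, y, X, k=10), y))   # True: the fit passes through the data
print(winn_predict(X, y, np.array([[0.0], [0.5]]), k=10))
```

Note that statistical consistency in the SCI setting depends on how the singularity exponent and k scale with the dimension and sample size; the demo above only illustrates the interpolation property, not those convergence conditions.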

