The Relativity of Induction

09/22/2020
by Larry Muhlstein, et al.

Lately there has been a lot of discussion about why deep learning algorithms perform better than we would theoretically expect. To gain insight into this question, it helps to improve our understanding of how learning works. We explore the core problem of generalization and show that long-accepted Occam's razor and parsimony principles are insufficient to ground learning. Instead, we derive and demonstrate a set of relativistic principles that yield clearer insight into the nature and dynamics of learning. We show that concepts of simplicity are fundamentally contingent, that all learning operates relative to an initial guess, and that generalization cannot be measured or strongly inferred, but that it can be expected given enough observation. Using these principles, we reconstruct our understanding in terms of distributed learning systems whose components inherit beliefs and update them. We then apply this perspective to elucidate the nature of some real-world inductive processes, including deep learning.
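The claim that "all learning operates relative to an initial guess" can be pictured with a standard Beta-Bernoulli update, where the prior plays the role of the initial guess. This is only an illustrative sketch, not a construction from the paper: the priors, data, and helper function below are hypothetical.

```python
def beta_update(alpha, beta, observations):
    """Update a Beta(alpha, beta) prior with a sequence of 0/1 observations."""
    for x in observations:
        alpha += x
        beta += 1 - x
    return alpha, beta

# 800 observations with a 75% rate of ones (hypothetical data).
data = [1, 1, 0, 1] * 200

# Two different initial guesses (priors) over the same data.
a1, b1 = beta_update(1.0, 1.0, data)  # uniform prior
a2, b2 = beta_update(2.0, 8.0, data)  # pessimistic prior

mean1 = a1 / (a1 + b1)
mean2 = a2 / (a2 + b2)

# With few observations the priors dominate and the learners disagree;
# given enough observation, both posterior means approach the empirical
# rate (~0.75), mirroring the abstract's point that generalization can
# be expected given enough data, relative to any reasonable initial guess.
```

The same mechanism extends to the paper's distributed-systems framing: a component that "inherits" the posterior `(a1, b1)` as its own prior and keeps updating is just continuing this chain of belief revision.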


