'Almost Sure' Chaotic Properties of Machine Learning Methods

07/28/2014
by Nabarun Mondal, et al.

It has been demonstrated earlier that universal computation is 'almost surely' chaotic. Machine learning is a form of computational fixed point iteration over the computable function space. We showcase some properties of this iteration and establish that, in general, it is 'almost surely' chaotic in nature. This theory explains the observed counter-intuitive properties of deep learning methods, and the paper demonstrates that these properties are universal to any learning method.
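To make the flavour of the claim concrete, here is a minimal sketch (not the paper's construction) of how a simple fixed point iteration can be 'almost surely' chaotic: iterating the logistic map at parameter r = 4, a textbook chaotic system, amplifies an arbitrarily small perturbation of the initial condition until nearby trajectories decorrelate.

```python
# Illustrative sketch only: a fixed point iteration x -> f(x) showing
# sensitive dependence on initial conditions. The logistic map at r = 4
# is a standard chaotic example, not the iteration studied in the paper.

def logistic(x, r=4.0):
    """One step of the logistic map on [0, 1]."""
    return r * x * (1.0 - x)

def iterate(f, x0, n):
    """Apply f to x0 n times and return the final value."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

x_a = iterate(logistic, 0.2, 50)
x_b = iterate(logistic, 0.2 + 1e-12, 50)  # tiny perturbation of the start

# After 50 iterations the 1e-12 gap has been stretched exponentially,
# so the two trajectories are no longer close.
print(abs(x_a - x_b))
```

The analogy to learning is only suggestive: if training is viewed as such an iteration over function space, tiny changes in initialization or data order can likewise steer it to very different fixed points.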
