Algorithmic Probability-guided Supervised Machine Learning on Non-differentiable Spaces

We show how algorithmic complexity theory can be introduced into machine learning to help bring together apparently disparate areas of current research. We show that this approach requires less training data and generalizes better, exhibiting greater resilience to random attacks. We investigate the shape of the discrete algorithmic space when performing regression or classification using a loss function parametrized by algorithmic complexity, demonstrating that differentiability is not necessary to achieve results comparable to those obtained with differentiable programming approaches such as deep learning. In doing so we use examples that allow the two approaches to be compared (small ones, given the computational power required to estimate algorithmic complexity). We find and report that (i) machine learning can successfully be performed on a non-smooth surface using algorithmic complexity; (ii) parameter solutions can be found using an algorithmic-probability classifier, establishing a bridge between a fundamentally discrete theory of computability and a fundamentally continuous mathematical theory of optimization; (iii) an algorithmically directed search technique over non-smooth manifolds can be defined and conducted; and (iv) exploitation techniques and numerical methods for algorithmic search can be used to navigate these discrete non-differentiable spaces. We apply these results to (a) the identification of generative rules from data observations; (b) image classification problems that are more resilient to pixel attacks than neural networks; (c) the identification of equation parameters from a small data set in the presence of noise in a continuous ODE system; and (d) the classification of Boolean NK networks by (1) network topology, (2) underlying Boolean function, and (3) number of incoming edges.
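The gradient-free idea behind points (iii)–(iv) and application (a) can be sketched in a few lines. The paper estimates algorithmic complexity via algorithmic-probability methods (Coding Theorem / Block Decomposition), but a general-purpose lossless compressor serves as a crude, computable stand-in for illustration. The toy generative model below (a linear congruential bit stream with unknown multiplier `a`) and the conditional-complexity proxy are assumptions of this sketch, not the paper's actual models:

```python
import zlib

def C(s: bytes) -> int:
    # Compressed length: a crude, computable upper bound on the
    # algorithmic (Kolmogorov) complexity of s.
    return len(zlib.compress(s, 9))

def generate(seed: int, a: int, n: int = 64) -> bytes:
    # Hypothetical discrete generative model: the low bits of a
    # linear congruential sequence with multiplier `a`.
    out, x = bytearray(), seed
    for _ in range(n):
        x = (a * x + 1) % 251
        out.append(x & 1)
    return bytes(out)

observed = generate(seed=7, a=17)  # stand-in for observed data

def loss(a: int) -> int:
    # Information in the data left unexplained by the candidate model,
    # approximated by the conditional-complexity proxy
    # C(data | model) ~ C(data + model) - C(model).
    candidate = generate(seed=7, a=a)
    return C(observed + candidate) - C(candidate)

# Gradient-free search over the discrete, non-differentiable parameter
# space: rank every candidate multiplier by its algorithmic loss.
best = min(range(1, 251), key=loss)
```

The loss is integer-valued and defined on a discrete space, so no derivative exists to follow; the search is purely combinatorial, which is precisely why differentiability is not required.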

