On the impossibility of discovering a formula for primes using AI

07/27/2023
by Alexander Kolpakov et al.

The present work explores the theoretical limits of Machine Learning (ML) within the framework of Kolmogorov's theory of Algorithmic Probability, which clarifies the notion of entropy as Expected Kolmogorov Complexity and formalizes other fundamental concepts such as Occam's razor via Levin's Universal Distribution. As a fundamental application, we develop Maximum Entropy methods that allow us to derive the Erdős–Kac Law in Probabilistic Number Theory, and establish the impossibility of discovering a formula for primes using Machine Learning via the Prime Coding Theorem.
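The Erdős–Kac law mentioned in the abstract states that for a random integer n ≤ N, the number of distinct prime factors ω(n) is approximately normally distributed with mean and variance log log N. As a quick empirical illustration (not taken from the paper, and using a sieve rather than the authors' Maximum Entropy derivation), one can tabulate ω(n) and compare its sample mean to log log N:

```python
import math

# Sieve that counts distinct prime factors omega(n) for all n < N.
# If omega[p] is still 0 when we reach p, then no smaller prime
# divides p, so p is prime; mark every multiple of p.
N = 200_000
omega = [0] * N
for p in range(2, N):
    if omega[p] == 0:  # p is prime
        for m in range(p, N, p):
            omega[m] += 1

values = omega[2:]  # omega(n) for n = 2 .. N-1
mean = sum(values) / len(values)
var = sum((v - mean) ** 2 for v in values) / len(values)
loglogN = math.log(math.log(N))

print(f"sample mean of omega(n): {mean:.3f}")
print(f"sample variance:         {var:.3f}")
print(f"log log N:               {loglogN:.3f}")
```

The sample mean lands close to log log N (the known lower-order correction, the Mertens-type constant ≈ 0.26, accounts for most of the gap), which is the normalization under which Erdős–Kac gives a Gaussian limit.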
