The algorithmic second law of thermodynamics

08/14/2023
by Aram Ebtekar et al.

Gács' coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given physical state. Unlike the Boltzmann and Shannon-Gibbs entropies, it requires no prior commitment to macrovariables or probabilistic ensembles. Whereas earlier work made only loose connections between the entropies of thermodynamic and information-processing systems, the algorithmic entropy formally unifies the two. After adapting Gács' definition to Markov processes, we prove a very general second law of thermodynamics and discuss its advantages over previous formulations. Finally, taking inspiration from Maxwell's demon, we model an information engine powered by compressible data.
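
The abstract's closing idea, an engine powered by compressible data, can be illustrated with a standard back-of-the-envelope argument: the output length of any lossless compressor is a computable upper bound on a string's algorithmic entropy (up to an additive constant for the decompressor), and by Landauer's principle a demon-like engine can extract at most kT ln 2 joules per bit of redundancy on its tape. The Python sketch below illustrates only this generic bound, not the paper's actual construction; the function names, the choice of zlib as the compressor, and the 300 K operating temperature are assumptions made here for illustration.

```python
import math
import zlib

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_upper_bound_bits(data: bytes) -> int:
    """Upper-bound the algorithmic entropy of `data` in bits via lossless
    compression: the compressed string (plus a fixed decompressor, ignored
    here) is a program that reproduces `data` exactly. zlib stands in for
    a universal compressor; this is illustrative, not the paper's method."""
    return 8 * len(zlib.compress(data, level=9))

def extractable_work_joules(data: bytes, temperature: float = 300.0) -> float:
    """Landauer-style estimate of the work a demon-like engine could extract
    by randomizing a compressible tape: at most kT ln 2 joules per bit of
    redundancy, i.e. raw length minus the entropy upper bound."""
    redundancy = 8 * len(data) - entropy_upper_bound_bits(data)
    return max(redundancy, 0) * K_B * temperature * math.log(2)

if __name__ == "__main__":
    tape = b"0101" * 2500  # 10 kB of highly compressible data
    print(f"raw length: {8 * len(tape)} bits")
    print(f"entropy upper bound: {entropy_upper_bound_bits(tape)} bits")
    print(f"work estimate at 300 K: {extractable_work_joules(tape):.3e} J")
```

On this tape the compressed length is a tiny fraction of the raw 80,000 bits, so nearly the whole tape counts as fuel; on incompressible (already random) data the redundancy, and with it the work estimate, drops to zero.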

Related research

11/17/2020
Change the coefficients of conditional entropies in extensivity
The Boltzmann–Gibbs entropy is a functional on the space of probability ...

12/29/2011
Multi-q Analysis of Image Patterns
This paper studies the use of the Tsallis Entropy versus the classic Bol...

10/17/2011
Information, learning and falsification
There are (at least) three approaches to quantifying information. The fi...

08/01/2019
General Information Theory: Time and Information
This paper introduces time into information theory, gives a more accurat...

07/05/2022
Entropy of Sharp Restart
Restart has the potential of expediting or impeding the completion times...

08/02/2021
Fundamental Advantage of Feedback Control Based on a Generalized Second Law of Thermodynamics
Based on a novel generalized second law of thermodynamics, we demonstrat...
