On Macroscopic Complexity and Perceptual Coding

05/10/2010
by John Scoville, et al.

The theoretical limits of 'lossy' data compression algorithms are considered. The complexity of an object as seen by a macroscopic observer is the size of the perceptual code which discards all information that can be lost without altering the perception of the specified observer. The complexity of this macroscopically observed state is the simplest description of any microstate comprising that macrostate. Inference and pattern recognition based on macrostate rather than microstate complexities will take advantage of the complexity of the macroscopic observer to ignore irrelevant noise.
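The definition above can be illustrated with a toy sketch: take compressed length as a computable stand-in for Kolmogorov complexity, model the macroscopic observer as a coarse-graining map that discards imperceptible low-order detail, and define the macrostate's complexity as the simplest description among the microstates that yield the same perception. The observer model, the byte-level quantization, and the use of `zlib` are illustrative assumptions, not the paper's construction.

```python
import random
import zlib

def complexity(data: bytes) -> int:
    # Compressed length: a crude computable proxy for Kolmogorov complexity.
    return len(zlib.compress(data, 9))

def percept(data: bytes) -> bytes:
    # Toy macroscopic observer: keep only the high nibble of each byte,
    # discarding low-order detail the observer cannot distinguish.
    return bytes(b & 0xF0 for b in data)

def macrostate_complexity(microstates) -> int:
    # Complexity of the macrostate: the simplest description of any
    # microstate comprising that macrostate.
    return min(complexity(m) for m in microstates)

# Two noisy microstates that the observer cannot tell apart:
# identical high nibbles (a highly regular pattern), random low nibbles.
random.seed(0)
base = bytes(range(0, 256, 4)) * 8
noisy1 = bytes((b & 0xF0) | random.randrange(16) for b in base)
noisy2 = bytes((b & 0xF0) | random.randrange(16) for b in base)

assert percept(noisy1) == percept(noisy2)       # same macrostate
assert complexity(percept(noisy1)) < complexity(noisy1)
print(macrostate_complexity([noisy1, noisy2]))  # ignores the irrelevant noise
```

The perceptual code of the coarse-grained state is far shorter than that of either noisy microstate, which is the sense in which inference on macrostate complexities can ignore noise the observer never sees.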


