Neural Networks Optimally Compress the Sawbridge

11/10/2020
by Aaron B. Wagner, et al.

Neural-network-based compressors have proven to be remarkably effective at compressing sources, such as images, that are nominally high-dimensional but presumed to be concentrated on a low-dimensional manifold. We consider a continuous-time random process that models an extreme version of such a source, wherein the realizations fall along a one-dimensional "curve" in function space that has infinite-dimensional linear span. We precisely characterize the optimal entropy-distortion tradeoff for this source and show numerically that it is achieved by neural-network-based compressors trained via stochastic gradient descent. In contrast, we show both analytically and experimentally that compressors based on the classical Karhunen-Loève transform are highly suboptimal at high rates.
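To make the source concrete, here is a minimal NumPy sketch, assuming the standard definition of the sawbridge, X(t) = t − 1{U ≤ t} on [0, 1] with U ~ Uniform[0, 1]; the grid size and sample count below are illustrative choices, not values from the paper. The sketch samples realizations and computes an empirical Karhunen-Loève (KL) analysis of them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_samples = 256, 10_000   # illustrative discretization choices
t = np.linspace(0.0, 1.0, n_grid)

# Each sample path is a ramp of slope 1 that drops by 1 at the random
# jump time U, so the realizations trace out a one-dimensional "curve"
# in function space, parameterized by the single scalar U.
U = rng.uniform(size=(n_samples, 1))
X = t[None, :] - (U <= t[None, :]).astype(float)   # shape (n_samples, n_grid)

# Empirical Karhunen-Loeve analysis: eigenvalues of the sample covariance.
# The process is zero-mean, and a short calculation gives covariance
# min(s, t) - s*t (the same as the Brownian bridge), whose eigenvalues
# decay only like 1/k^2: the linear span is infinite-dimensional.
C = np.cov(X, rowvar=False)
eigvals = np.linalg.eigvalsh(C)[::-1]          # sort descending

energy = np.cumsum(eigvals) / eigvals.sum()
print("top 5 eigenvalues:", np.round(eigvals[:5], 5))
print("components for 99% of the energy:", int(np.searchsorted(energy, 0.99) + 1))
```

Running this shows that dozens of KL components are needed to capture 99% of the energy even though every realization is determined by one scalar, which is consistent with the abstract's claim that transform coding based on second-order statistics is a poor fit for this source.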
