An Empirical Analysis of Recurrent Learning Algorithms In Neural Lossy Image Compression Systems

01/27/2022
by Ankur Mali et al.

Recent advances in deep learning have produced image compression algorithms that outperform JPEG and JPEG 2000 on the standard Kodak benchmark. However, these models are slow to train (due to backpropagation through time, BPTT) and, to the best of our knowledge, have not been systematically evaluated on a wide variety of datasets. In this paper, we perform the first large-scale comparison of recent state-of-the-art hybrid neural compression algorithms, while exploring the effects of alternative training strategies (when applicable). The hybrid recurrent neural decoder is a former state-of-the-art model (recently overtaken by a Google model) that can be trained with BPTT or with alternative algorithms such as sparse attentive backtracking (SAB), unbiased online recurrent optimization (UORO), and real-time recurrent learning (RTRL). We compare these training alternatives, along with the Google models (GOOG and E2E), on six benchmark datasets. Surprisingly, we find that the model trained with SAB performs best, outperforming even BPTT: it converges faster and achieves a higher peak signal-to-noise ratio.
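As a rough illustration of the recurrent setup being compared, the sketch below implements a minimal residual-refinement codec trained with plain BPTT and reports reconstruction quality as PSNR. The architecture (a single shared convolutional encoder/decoder with a tanh bottleneck standing in for a binarizer), the layer sizes, and the toy training loop are illustrative assumptions, not the authors' model; SAB, UORO, and RTRL would change only how gradients are propagated through the recurrent steps.

```python
# A minimal sketch (assumed PyTorch model, not the authors' architecture) of a
# recurrent residual image codec trained with plain BPTT, plus the PSNR metric
# used to compare the trained models.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentCodec(nn.Module):
    """One shared encoder/decoder applied for several refinement steps."""
    def __init__(self, hidden=32, code=8):
        super().__init__()
        self.hidden = hidden
        self.enc  = nn.Conv2d(3 + hidden, code, 3, padding=1)    # encode residual + state
        self.dec  = nn.Conv2d(code, 3, 3, padding=1)              # decode an additive update
        self.gate = nn.Conv2d(3 + hidden, hidden, 3, padding=1)   # recurrent state update

    def forward(self, x, steps=4):
        b, _, h, w = x.shape
        state = torch.zeros(b, self.hidden, h, w, device=x.device)
        recon = torch.zeros_like(x)
        losses = []
        for _ in range(steps):
            residual = x - recon
            inp = torch.cat([residual, state], dim=1)
            bits = torch.tanh(self.enc(inp))     # soft stand-in for a binarized code
            recon = recon + self.dec(bits)       # additive refinement of the reconstruction
            state = torch.tanh(self.gate(inp))   # carried across steps, so gradients need BPTT
            losses.append(F.mse_loss(recon, x))
        return recon, torch.stack(losses).mean()

def psnr(x, y, max_val=1.0):
    """Peak signal-to-noise ratio in dB for images scaled to [0, max_val]."""
    mse = F.mse_loss(x, y)
    return 10.0 * torch.log10(max_val ** 2 / mse)

model = RecurrentCodec()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
imgs = torch.rand(4, 3, 32, 32)   # random stand-in for a Kodak-style batch
for step in range(20):
    recon, loss = model(imgs)
    opt.zero_grad()
    loss.backward()               # unrolls all refinement steps: backprop through time
    opt.step()
print(f"PSNR: {psnr(recon.detach(), imgs).item():.2f} dB")
```

Swapping the full-unroll `loss.backward()` for SAB, UORO, or RTRL would alter only how credit is assigned across the recurrent steps, which is the axis of comparison in the paper.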

