Learned Greedy Method (LGM): A Novel Neural Architecture for Sparse Coding and Beyond

10/14/2020
by Rajaei Khatib, et al.

The fields of signal and image processing have been deeply influenced by the introduction of deep neural networks. These are successfully deployed in a wide range of real-world applications, obtaining state-of-the-art results and surpassing well-known and well-established classical methods. Despite their impressive success, the architectures used in many of these neural networks come with no clear justification. As such, they are usually treated as "black box" machines that lack interpretability. A constructive remedy to this drawback is the systematic design of such networks by unfolding well-understood iterative algorithms. A popular representative of this approach is the Iterative Shrinkage-Thresholding Algorithm (ISTA) and its learned version, LISTA, which aim for the sparse representations of the processed signals. In this paper, we revisit this sparse coding task and propose an unfolded version of a greedy pursuit algorithm for the same goal. More specifically, we concentrate on the well-known Orthogonal Matching Pursuit (OMP) algorithm and introduce its unfolded and learned version. Key features of our Learned Greedy Method (LGM) are the ability to accommodate a dynamic number of unfolded layers and a stopping mechanism based on representation error, both adapted to the input. We develop several variants of the proposed LGM architecture and test some of them in various experiments, demonstrating their flexibility and efficiency.
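Since LGM unfolds the greedy pursuit that OMP performs, a minimal NumPy sketch of the classical (non-learned) OMP loop may help fix ideas; the function name `omp` and its parameters here are illustrative, not the paper's API. Note the two ingredients the abstract highlights: the number of iterations (layers, once unfolded) is dynamic, and the loop stops when the representation error drops below a threshold.

```python
import numpy as np

def omp(D, y, max_atoms, tol=1e-6):
    """Classical Orthogonal Matching Pursuit (illustrative sketch).

    D         : dictionary with (assumed unit-norm) atoms as columns
    y         : signal to be sparsely represented
    max_atoms : cap on the number of greedy iterations
    tol       : residual-norm threshold (the error-based stopping rule)
    """
    n_atoms = D.shape[1]
    residual = y.copy()
    support = []                      # indices of selected atoms
    x = np.zeros(n_atoms)
    for _ in range(max_atoms):
        # error-based stopping: quit once the residual is small enough
        if np.linalg.norm(residual) < tol:
            break
        # greedy step: pick the atom most correlated with the residual
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k not in support:
            support.append(k)
        # orthogonal projection: least-squares fit on the current support
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    if support:
        x[support] = coeffs
    return x
```

In LGM, steps of this loop become learned layers, while the error-based test above makes the effective depth adapt to each input, in contrast to LISTA's fixed number of layers.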



Related research:

- Learning step sizes for unfolded sparse coding (05/27/2019): Sparse coding is typically solved by iterative optimization techniques, ...
- Thresholding Greedy Pursuit for Sparse Recovery Problems (03/17/2021): We study here sparse recovery problems in the presence of additive noise...
- Algorithm Unrolling: Interpretable, Efficient Deep Learning for Signal and Image Processing (12/22/2019): Deep neural networks provide unprecedented performance gains in many rea...
- Learned ISTA with Error-based Thresholding for Adaptive Sparse Coding (12/21/2021): The learned iterative shrinkage thresholding algorithm (LISTA) introduce...
- Adaptive Approach For Sparse Representations Using The Locally Competitive Algorithm For Audio (09/29/2021): Gammachirp filterbank has been used to approximate the cochlea in sparse...
- Architectural Adversarial Robustness: The Case for Deep Pursuit (11/29/2020): Despite their unmatched performance, deep neural networks remain suscept...
- Convolutional Neural Networks Analyzed via Convolutional Sparse Coding (07/27/2016): Convolutional neural networks (CNN) have led to many state-of-the-art re...