Convolutional Sparse Coding Fast Approximation with Application to Seismic Reflectivity Estimation

06/29/2021
by   Deborah Pereg, et al.

In sparse coding, we attempt to extract features of input vectors, assuming that the data is inherently structured as a sparse superposition of basic building blocks. Similarly, neural networks perform a given task by learning features of the training data set. Recently, both data-driven and model-driven feature-extraction methods have become extremely popular and have achieved remarkable results. Nevertheless, practical implementations are often too slow for real-life scenarios, especially real-time applications. We propose an accelerated version of the classic iterative thresholding algorithm that produces a good approximation of the convolutional sparse code within 2-5 iterations. The speed advantage stems mostly from the observation that most solvers are slowed down by inefficient global thresholding. The main idea is to normalize each data point by the energy of its local receptive field before applying a threshold. This suppresses the natural inclination towards strong feature expressions, so that one can rely on a single global threshold that is easily approximated, or learned during training. The proposed algorithm can be employed with a known predetermined dictionary or with a trained dictionary; the trained version is implemented as a neural network designed as the unfolding of the proposed solver. The performance of the proposed solution is demonstrated on the seismic inversion problem in both synthetic and real-data scenarios. We also provide theoretical guarantees for stable support recovery: namely, we prove that under certain conditions the true support is perfectly recovered within the first iteration.
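The normalize-then-threshold idea described in the abstract can be sketched as a few ISTA-style iterations on a 1-D signal. The sketch below is a minimal illustration assuming NumPy; the function name, filters, step size, threshold, and iteration count are placeholders chosen for the example, not the paper's actual formulation or settings.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft thresholding; tau may be a per-sample array."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def local_energy(y, k, eps=1e-8):
    """Energy of each length-k receptive field of y ('same' alignment)."""
    return np.sqrt(np.convolve(y ** 2, np.ones(k), mode='same')) + eps

def normalized_conv_ista(y, filters, tau=0.5, step=0.5, n_iters=3):
    """ISTA-style convolutional sparse coding sketch in which correlations
    are normalized by the local receptive-field energy before thresholding.
    Dividing by the energy e and thresholding at tau is equivalent to
    applying the spatially varying threshold tau * e, which is how it is
    written here. Illustrative sketch, not the paper's exact algorithm."""
    m, k, n = len(filters), len(filters[0]), len(y)
    x = np.zeros((m, n))               # sparse code, one channel per filter
    e = local_energy(y, k)             # fixed normalization from the input
    for _ in range(n_iters):
        # residual between the signal and the current reconstruction
        r = y - sum(np.convolve(x[i], filters[i], mode='same')
                    for i in range(m))
        for i in range(m):
            # gradient step: correlate the residual with the filter
            g = np.convolve(r, filters[i][::-1], mode='same')
            # energy-normalized thresholding via a spatially varying threshold
            x[i] = soft_threshold(x[i] + step * g, tau * e)
    return x
```

Because the threshold scales with the local energy, an isolated strong event no longer dominates the global threshold, which is what allows the solver to settle on a sparse support within a handful of iterations.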


