Distributed Convolutional Sparse Coding

05/29/2017
by Thomas Moreau, et al.

We consider the problem of building shift-invariant representations for long signals in the context of distributed processing. We propose an asynchronous algorithm based on coordinate descent called DICOD to efficiently solve the ℓ_1-minimization problems involved in convolutional sparse coding. This algorithm leverages the weak temporal dependency of the convolution to reduce the interprocess communication to a few local messages. We prove that this algorithm converges to the optimal solution and that it scales with superlinear speedup, up to a certain limit. These properties are illustrated with numerical experiments and our algorithm is compared to the state-of-the-art methods used for convolutional sparse coding.
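The abstract describes DICOD as coordinate descent applied to the ℓ_1-regularized convolutional problem, exploiting the fact that updating one activation only affects coordinates within one atom length. The single-process sketch below illustrates that greedy coordinate-descent update on the 1-D convolutional lasso (the step DICOD distributes asynchronously across workers); the function name, stopping rule, and defaults are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(b, lam):
    """Elementwise soft-thresholding, the prox of the l1 norm."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

def csc_coordinate_descent(x, D, lam, n_iter=500, tol=1e-10):
    """Greedy coordinate descent for the 1-D convolutional lasso
        min_z 0.5 * ||x - sum_k z_k * d_k||^2 + lam * ||z||_1
    A sequential sketch of the update DICOD distributes; names and
    defaults are illustrative.
    """
    K, L = D.shape                 # K atoms, each of length L
    T = len(x) - L + 1             # valid activation positions
    z = np.zeros((K, T))
    norms = (D ** 2).sum(axis=1)   # ||d_k||^2
    # beta[k, t]: correlation of the current residual with atom k at t
    beta = np.array([np.correlate(x, D[k], mode="valid") for k in range(K)])
    for _ in range(n_iter):
        # closed-form 1-D minimizer for every coordinate, others fixed
        z_new = soft_threshold(beta + norms[:, None] * z, lam) / norms[:, None]
        diff = z_new - z
        # greedy rule: update the coordinate with the largest change
        k, t = np.unravel_index(np.argmax(np.abs(diff)), diff.shape)
        if abs(diff[k, t]) < tol:
            break
        # an update at (k, t) only perturbs beta within one atom length:
        # this locality is what keeps interprocess messages local in DICOD
        for j in range(K):
            corr = np.correlate(np.pad(D[k], (L - 1, L - 1)),
                                D[j], mode="valid")      # length 2L - 1
            lo, hi = max(0, t - L + 1), min(T, t + L)
            beta[j, lo:hi] -= diff[k, t] * corr[lo - (t - L + 1):
                                                hi - (t - L + 1)]
        z[k, t] = z_new[k, t]
    return z
```

Because each update touches at most 2L - 1 positions of beta, workers handling disjoint segments of a long signal only need to exchange messages when an update lands near a segment boundary, which is the weak temporal dependency the abstract refers to.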


Related research

- 09/07/2021, Efficient ADMM-based Algorithms for Convolutional Sparse Coding: Convolutional sparse coding improves on the standard sparse approximatio...
- 05/24/2018, Multivariate Convolutional Sparse Coding for Electromagnetic Brain Signals: Frequency-specific patterns of neural activity are traditionally interpr...
- 09/27/2017, Fast Convolutional Sparse Coding in the Dual Domain: Convolutional sparse coding (CSC) is an important building block of many...
- 06/12/2018, Fast Rotational Sparse Coding: We propose an algorithm for rotational sparse coding along with an effic...
- 11/01/2018, A Local Block Coordinate Descent Algorithm for the Convolutional Sparse Coding Model: The Convolutional Sparse Coding (CSC) model has recently gained consider...
- 05/12/2017, Convolutional Sparse Representations with Gradient Penalties: While convolutional sparse representations enjoy a number of useful prop...
- 07/20/2017, Convolutional Sparse Coding: Boundary Handling Revisited: Two different approaches have recently been proposed for boundary handli...
