Online Convolutional Sparse Coding with Sample-Dependent Dictionary

04/27/2018
by   Yaqing Wang, et al.

Convolutional sparse coding (CSC) is widely used for learning shift-invariant dictionaries in image and signal processing. However, existing methods scale poorly. In this paper, instead of convolving with a dictionary shared by all samples, we propose a sample-dependent dictionary in which each filter is a linear combination of a small set of base filters learned from the data. This added flexibility allows a large number of sample-dependent patterns to be captured, while the resulting model can still be learned efficiently online. Extensive experiments show that the proposed method outperforms existing CSC algorithms with significantly lower time and space requirements.
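
As a rough illustration of the sample-dependent construction described in the abstract, the sketch below builds per-sample filters as linear combinations of a small set of shared base filters and reconstructs a 1-D signal as a sum of convolutions of sparse codes with those filters. This is a minimal sketch of the idea only, not the authors' implementation; the names (e.g. sample_dependent_filters) and toy dimensions are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: each per-sample filter is a linear combination of a small
# set of shared base filters, D = B @ W, where W is chosen per sample.
# All sizes below are toy values, not taken from the paper.

rng = np.random.default_rng(0)

signal_len, filter_size = 256, 11   # 1-D example for brevity
n_base, n_filters = 8, 32           # few base filters, many combined filters

B = rng.standard_normal((filter_size, n_base))   # shared base filters (columns)

def sample_dependent_filters(W):
    # W: (n_base, n_filters) mixing weights for one sample
    return B @ W                                  # (filter_size, n_filters)

def reconstruct(Z, D):
    # CSC model: the signal is approximated by a sum of convolutions of
    # sparse code maps Z[:, k] with filters D[:, k]
    return sum(np.convolve(Z[:, k], D[:, k], mode="same")
               for k in range(D.shape[1]))

# Toy usage for a single sample
W = rng.standard_normal((n_base, n_filters))      # sample-dependent weights
D = sample_dependent_filters(W)

Z = np.zeros((signal_len, n_filters))             # sparse codes: a few spikes
Z[rng.integers(0, signal_len, 20), rng.integers(0, n_filters, 20)] = 1.0

x_hat = reconstruct(Z, D)
print(x_hat.shape)                                # (256,)
```

In the online setting described above, only the small set of base filters (B here) and the per-sample mixing weights would need to be updated as samples arrive, which is what keeps the time and space costs low.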

Related research:

01/25/2023 · An Efficient Approximate Method for Online Convolutional Dictionary Learning
Most existing convolutional dictionary learning (CDL) algorithms are bas...

06/21/2017 · Scalable Online Convolutional Sparse Coding
Convolutional sparse coding (CSC) improves sparse coding by learning a s...

08/31/2019 · Stochastic Convolutional Sparse Coding
State-of-the-art methods for Convolutional Sparse Coding usually employ ...

09/21/2014 · Analyzing sparse dictionaries for online learning with kernels
Many signal processing and machine learning methods share essentially th...

09/29/2011 · The Statistical Inefficiency of Sparse Coding for Images (or, One Gabor to Rule them All)
Sparse coding is a proven principle for learning compact representations...

09/09/2017 · Convolutional Dictionary Learning
Convolutional sparse representations are a form of sparse representation...

03/08/2019 · General Convolutional Sparse Coding with Unknown Noise
Convolutional sparse coding (CSC) can learn representative shift-invaria...
