Compressed Dictionary Learning

05/02/2018
by Flavio Teixeira, et al.

In this paper we show that the computational complexity of the Iterative Thresholding and K-Residual-Means (ITKrM) algorithm for dictionary learning can be significantly reduced by using dimensionality-reduction techniques based on the Johnson-Lindenstrauss lemma. We introduce the Iterative Compressed-Thresholding and K-Means (IcTKM) algorithm for fast dictionary learning and study its convergence properties. We show that IcTKM can locally recover a generating dictionary with low computational complexity up to a target error ε̃ by compressing d-dimensional training data into m < d dimensions, where m is proportional to log(d) and inversely proportional to the distortion level δ incurred by compressing the data. Increasing the distortion level δ reduces the computational complexity of IcTKM at the cost of an increased recovery error and a reduced admissible sparsity level for the training data. For generating dictionaries composed of K atoms, we show that IcTKM can stably recover the dictionary with distortion levels up to the order δ ≤ O(1/√(log K)). The compression effectively shatters the data-dimension bottleneck in the computational cost of the ITKrM algorithm. For training data with sparsity levels S ≤ O(K^(2/3)), ITKrM can locally recover the dictionary with a computational cost that scales as O(d K log(ε̃^-1)) per training signal. We show that for these same sparsity levels the computational cost can be brought down to O(log^5(d) K log(ε̃^-1)) with IcTKM, a significant reduction when high-dimensional data is considered. Our theoretical results are complemented with numerical simulations, which demonstrate that IcTKM is a powerful, low-cost algorithm for learning dictionaries from high-dimensional data sets.
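The compression step the abstract describes, embedding d-dimensional training signals into m < d dimensions while approximately preserving norms, can be illustrated with a random Gaussian Johnson-Lindenstrauss embedding. The sketch below is not the paper's implementation; the dimensions, the Gaussian choice of embedding, and the distortion check are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: ambient dimension d, compressed dimension m, N signals.
d, m, N = 1024, 128, 50

# Unit-norm training signals as columns of Y.
Y = rng.standard_normal((d, N))
Y /= np.linalg.norm(Y, axis=0)

# Gaussian JL embedding: entries scaled by 1/sqrt(m) so that
# squared norms are preserved in expectation, ||Phi y||^2 ~ ||y||^2.
Phi = rng.standard_normal((m, d)) / np.sqrt(m)

# Compressed data: all subsequent per-signal work happens in m dimensions
# instead of d, which is where the cost reduction comes from.
Yc = Phi @ Y

# Empirical distortion of the squared norms, |''Phi y''^2 - 1|,
# playing the role of the distortion level delta in the abstract.
delta = np.max(np.abs(np.sum(Yc**2, axis=0) - 1.0))
print(f"max squared-norm distortion over {N} signals: {delta:.3f}")
```

Shrinking m makes each iteration cheaper but increases the observed distortion, mirroring the complexity-versus-recovery-error trade-off the abstract attributes to δ.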

