Identifying global optimality for dictionary learning

04/17/2016
by Lei Le et al.

Learning new representations of input observations in machine learning is often tackled using a factorization of the data. For many such problems, including sparse coding and matrix completion, learning these factorizations can be difficult, both in terms of efficiency and in guaranteeing that the solution is a global minimum. Recently, a general class of objectives has been introduced, which we term induced dictionary learning models (DLMs), that admits an induced convex form enabling global optimization. Though attractive theoretically, this induced form is impractical, particularly for large or growing datasets. In this work, we investigate the use of practical alternating minimization algorithms for induced DLMs that ensure convergence to global optima. We characterize the stationary points of these models and, using these insights, highlight practical choices for the objectives. We then provide theoretical and empirical evidence that alternating minimization, from a random initialization, converges to global minima for a large subclass of induced DLMs. In particular, we take advantage of the existence of the (potentially unknown) convex induced form to identify when stationary points are global minima for the dictionary learning objective. Finally, we provide an empirical investigation into practical optimization choices for using alternating minimization with induced DLMs, for both batch and stochastic gradient descent.
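To make the alternating scheme concrete, here is a minimal NumPy sketch for one simple member of the induced-DLM family, assuming the squared-loss objective min_{D,H} ||X - DH||_F^2 + lam(||D||_F^2 + ||H||_F^2); the function name, hyperparameters, and update order are illustrative assumptions, not the authors' implementation.

```python
# Sketch of alternating minimization for a regularized factorization
# objective, assuming squared loss with Frobenius-norm regularizers.
# Names (als_dictionary_learning, lam, rank, ...) are illustrative.
import numpy as np

def als_dictionary_learning(X, rank, lam=0.1, iters=100, seed=0):
    """Alternate exact ridge-regularized least-squares updates for D and H."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    # Random initialization, the regime studied in the paper.
    D = rng.standard_normal((n, rank))
    H = rng.standard_normal((rank, m))
    reg = lam * np.eye(rank)
    for _ in range(iters):
        # With D fixed, the optimal H is (D^T D + lam I)^{-1} D^T X.
        H = np.linalg.solve(D.T @ D + reg, D.T @ X)
        # With H fixed, the optimal D is X H^T (H H^T + lam I)^{-1}.
        D = np.linalg.solve(H @ H.T + reg, H @ X.T).T
    return D, H

# Usage: recover a low-rank signal buried in noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10)) @ rng.standard_normal((10, 200))
X += 0.01 * rng.standard_normal(X.shape)
D, H = als_dictionary_learning(X, rank=10)
print("relative error:", np.linalg.norm(X - D @ H) / np.linalg.norm(X))
```

Each inner solve exactly minimizes a convex subproblem, so the objective is monotonically non-increasing and the iterates converge to a stationary point; the paper's contribution is identifying when such stationary points are global minima. A stochastic variant would replace the exact solves with gradient steps on minibatches of columns of X.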

Related research

11/09/2017 · Alternating minimization for dictionary learning with random initialization
We present theoretical guarantees for an alternating minimization algori...

02/01/2018 · Analysis of Fast Alternating Minimization for Structured Dictionary Learning
Methods exploiting sparsity have been popular in imaging and signal proc...

05/31/2018 · Analysis of Fast Structured Dictionary Learning
Sparsity-based models and techniques have been exploited in many signal ...

04/28/2020 · Distributed Projected Subgradient Method for Weakly Convex Optimization
The stochastic subgradient method is a widely-used algorithm for solving...

07/24/2021 · A Tensor-Train Dictionary Learning algorithm based on Spectral Proximal Alternating Linearized Minimization
Dictionary Learning (DL) is one of the leading sparsity promoting techni...

02/22/2019 · Unique Sharp Local Minimum in ℓ_1-minimization Complete Dictionary Learning
We study the problem of globally recovering a dictionary from a set of s...

04/07/2019 · Every Local Minimum is a Global Minimum of an Induced Model
For non-convex optimization in machine learning, this paper proves that ...
