Convex Optimization without Projection Steps

08/04/2011
by Martin Jaggi, et al.

For the general problem of minimizing a convex function over a compact convex domain, we investigate a simple iterative approximation algorithm based on the method of Frank & Wolfe (1956), which requires no projection steps to stay inside the optimization domain. Instead of projecting, the algorithm solves the linearized problem defined by a current subgradient, which yields a step direction that naturally stays in the domain. Our framework generalizes the sparse greedy algorithm of Frank & Wolfe and its primal-dual analysis by Clarkson (2010) (and the low-rank SDP approach by Hazan (2008)) to arbitrary convex domains. We give a convergence proof guaranteeing an ϵ-small duality gap after O(1/ϵ) iterations. The method lets us characterize the sparsity of approximate solutions for any l1-regularized convex optimization problem (and for optimization over the simplex) as a function of the approximation quality. We obtain matching upper and lower bounds of Θ(1/ϵ) on the sparsity for l1-problems. The same bounds apply to low-rank semidefinite optimization with bounded trace, showing that rank O(1/ϵ) is also best possible there. As another application, we obtain sparse matrices with O(1/ϵ) non-zero entries as ϵ-approximate solutions when optimizing any convex function over a class of diagonally dominant symmetric matrices. We show that our proposed first-order method also applies to nuclear-norm and max-norm matrix optimization problems. For nuclear-norm-regularized optimization, such as matrix completion and low-rank recovery, we demonstrate the practical efficiency and scalability of our algorithm on large matrix problems, e.g., the Netflix dataset. For general convex optimization over a bounded matrix max-norm, our algorithm is, to the best of our knowledge, the first with a convergence guarantee.
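To make the projection-free idea concrete, here is a minimal sketch of a Frank-Wolfe iteration over the probability simplex (one of the domains discussed in the abstract). The function name, the quadratic objective, and the target vector `b` are illustrative choices, not taken from the paper; the step size 2/(k+2) is the standard choice yielding the O(1/ϵ) rate.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Projection-free Frank-Wolfe over the probability simplex.

    Each step solves the linearized problem min_{s in simplex} <grad f(x), s>
    exactly; its minimizer is a vertex e_i, so the next iterate (a convex
    combination of x and e_i) stays in the domain with no projection step.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        i = np.argmin(g)             # linear minimization oracle: best vertex e_i
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (k + 2.0)      # standard step size, gives O(1/k) gap decay
        x = (1.0 - gamma) * x + gamma * s
    return x

# Illustrative example (not from the paper): minimize f(x) = ||x - b||^2
# over the simplex, where b already lies in the simplex, so the optimum is b.
b = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2.0 * (x - b)
x = frank_wolfe_simplex(grad, np.array([1.0, 0.0, 0.0]))
```

Note that each iterate adds at most one new non-zero coordinate, which is exactly the mechanism behind the O(1/ϵ) sparsity bounds stated above.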

