Revisiting First-Order Convex Optimization Over Linear Spaces

03/26/2018
by Francesco Locatello, et al.

Two popular examples of first-order optimization methods over linear spaces are coordinate descent and matching pursuit algorithms, together with their randomized variants. While the former optimizes by moving along individual coordinates, the latter allows a generalized notion of search directions. Exploiting the connection between the two algorithms, we present a unified analysis of both, providing affine-invariant sublinear O(1/t) rates on smooth objectives and linear convergence on strongly convex objectives. As a byproduct of our affine-invariant analysis of matching pursuit, our rates for steepest coordinate descent are the tightest known. Furthermore, we show the first accelerated convergence rate O(1/t^2) for matching pursuit on convex objectives.
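The connection the abstract describes can be illustrated with a minimal sketch (not from the paper): running generalized matching pursuit with the coordinate basis as the atom dictionary recovers steepest (Gauss-Southwell) coordinate descent. The least-squares objective, dictionary, and iteration count below are illustrative assumptions, chosen so the objective is smooth and strongly convex.

```python
import numpy as np

# Illustrative least-squares objective f(x) = 0.5 * ||Ax - b||^2
# (smooth and, for full-column-rank A, strongly convex).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

# Atom dictionary: with the coordinate basis, generalized matching
# pursuit reduces to steepest (Gauss-Southwell) coordinate descent;
# any spanning set of directions would work in its place.
atoms = np.eye(10)

x = np.zeros(10)
for _ in range(5000):
    g = grad(x)
    # Greedy selection: the atom z maximizing |<grad, z>|.
    scores = atoms @ g
    i = np.argmax(np.abs(scores))
    z = atoms[i]
    # Exact line search for the quadratic: gamma = -<g, z> / ||Az||^2.
    Az = A @ z
    x = x - (scores[i] / (Az @ Az)) * z

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f(x) - f(x_star))  # optimality gap; should be near zero
```

On this strongly convex instance the greedy iterates converge linearly to the least-squares solution, matching the regime where the paper proves linear rates; on merely smooth objectives the analogous guarantee is the sublinear O(1/t) rate.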


research
02/21/2017

A Unified Optimization View on Generalized Matching Pursuit and Frank-Wolfe

Two of the most fundamental prototypes of greedy optimization are the ma...
research
04/28/2019

Blended Matching Pursuit

Matching pursuit algorithms are an important class of algorithms in sign...
research
05/31/2017

Greedy Algorithms for Cone Constrained Optimization with Convergence Guarantees

Greedy optimization methods such as Matching Pursuit (MP) and Frank-Wolf...
research
10/03/2010

Convolutional Matching Pursuit and Dictionary Training

Matching pursuit and K-SVD is demonstrated in the translation invariant ...
research
07/01/2016

Convergence Rate of Frank-Wolfe for Non-Convex Objectives

We give a simple proof that the Frank-Wolfe algorithm obtains a stationa...
research
03/30/2020

Explicit Regularization of Stochastic Gradient Methods through Duality

We consider stochastic gradient methods under the interpolation regime w...
