A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization

01/20/2013
by   Dan Garber, et al.

Linear optimization is often algorithmically simpler than non-linear convex optimization. Linear optimization over matroid polytopes, matching polytopes, and path polytopes are examples of problems for which we have simple and efficient combinatorial algorithms, but whose non-linear convex counterparts are harder and admit significantly less efficient algorithms. This motivates the computational model of convex optimization, including the offline, online, and stochastic settings, using a linear optimization oracle. In this computational model we give several new results that improve over the previous state-of-the-art. Our main result is a novel conditional gradient algorithm for smooth and strongly convex optimization over polyhedral sets that performs only a single linear optimization step over the domain on each iteration and enjoys a linear convergence rate. This gives an exponential improvement in convergence rate over previous results. Based on this new conditional gradient algorithm, we give the first algorithms for online convex optimization over polyhedral sets that perform only a single linear optimization step over the domain while having optimal regret guarantees, answering an open question of Kalai and Vempala, and Hazan and Kale. Our online algorithms also imply conditional gradient algorithms for non-smooth and stochastic convex optimization with the same convergence rates as projected (sub)gradient methods.
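For context, the classical conditional gradient (Frank-Wolfe) method the abstract builds on replaces projection with a single call to a linear optimization oracle per iteration. The sketch below is the standard textbook variant with step size 2/(t+2), not the paper's linearly convergent algorithm; the simplex domain, the quadratic objective, and all names are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Classical Frank-Wolfe (conditional gradient) over the probability
    simplex. Each iteration makes one linear-oracle call: over the simplex,
    argmin over feasible s of <g, s> is the vertex e_i with i = argmin_i g_i.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        i = int(np.argmin(g))           # linear oracle: cheapest vertex
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (t + 2.0)         # standard step size, O(1/t) rate
        x = (1.0 - gamma) * x + gamma * s
    return x

# Illustrative use: minimize ||x - c||^2 over the simplex (c is interior).
c = np.array([0.2, 0.3, 0.5])
x = frank_wolfe_simplex(lambda x: 2.0 * (x - c), np.array([1.0, 0.0, 0.0]))
```

Note the iterate stays feasible by construction (a convex combination of vertices), which is why no projection step is ever needed; the paper's contribution is sharpening this scheme's O(1/t) rate to a linear rate on polyhedral sets under strong convexity.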


