RSG: Beating Subgradient Method without Smoothness and Strong Convexity

12/09/2015
by Tianbao Yang, et al.

In this paper, we study the efficiency of a Restarted SubGradient (RSG) method that periodically restarts the standard subgradient method (SG). We show that, when applied to a broad class of convex optimization problems, the RSG method can find an ϵ-optimal solution with a lower complexity than the SG method. In particular, we first show that RSG can reduce the dependence of SG's iteration complexity on the distance between the initial solution and the optimal set to the distance between the ϵ-level set and the optimal set. In addition, we show the advantages of RSG over SG in solving three different families of convex optimization problems. (a) For problems whose epigraph is a polyhedron, RSG is shown to converge linearly. (b) For problems with a local quadratic growth property, RSG has an O((1/ϵ)log(1/ϵ)) iteration complexity. (c) For problems that admit a local Kurdyka-Łojasiewicz property with a power constant of β∈[0,1), RSG has an O((1/ϵ^{2β})log(1/ϵ)) iteration complexity. In contrast, with only the standard analysis, the iteration complexity of SG is known to be O(1/ϵ^2) for these three classes of problems. The novelty of our analysis lies in exploiting the lower bound of the first-order optimality residual at the ϵ-level set. It is this novelty that allows us to explore local properties of functions (e.g., the local quadratic growth property, the local Kurdyka-Łojasiewicz property, and more generally local error bounds) to develop the improved convergence of RSG. We demonstrate the effectiveness of the proposed algorithms on several machine learning tasks including regression and classification.
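The restart pattern described above is easy to sketch in code. The following is a minimal, illustrative Python sketch, not the paper's exact algorithm or parameter schedule: each stage runs plain SG with a constant step size and returns the averaged iterate, and the next stage restarts from that average with the step size halved. The function names (sg, rsg), the toy ℓ1 objective, and the parameters eta0, t, K are assumptions chosen only for illustration.

import numpy as np

def sg(x0, subgrad, eta, t):
    """Plain subgradient method: t steps with constant step size eta,
    starting from x0; returns the average of the iterates."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(t):
        x = x - eta * subgrad(x)
        avg += x / t
    return avg

def rsg(x0, subgrad, eta0, t, K):
    """Restarted subgradient sketch: K stages of SG, each restarted from
    the previous stage's averaged iterate with the step size halved."""
    x, eta = x0.copy(), eta0
    for _ in range(K):
        x = sg(x, subgrad, eta, t)
        eta /= 2.0
    return x

# Toy non-smooth problem: minimize f(x) = ||x - b||_1, whose minimizer is b.
b = np.array([1.0, -2.0, 3.0])
subgrad = lambda x: np.sign(x - b)   # a valid subgradient of f at x
x_hat = rsg(np.zeros(3), subgrad, eta0=1.0, t=200, K=10)
print(x_hat)   # should be close to b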


07/04/2016

Accelerated Stochastic Subgradient Methods under Local Error Bound Condition

In this paper, we propose two accelerated stochastic subgradient method...
08/11/2016

A Richer Theory of Convex Constrained Optimization with Reduced Projections and Improved Rates

This paper focuses on convex constrained optimization problems, where th...
08/29/2015

Generalized Uniformly Optimal Methods for Nonlinear Programming

In this paper, we present a generic framework to extend existing uniform...
01/02/2023

On Bilevel Optimization without Lower-level Strong Convexity

Theoretical properties of bilevel problems are well studied when the low...
06/17/2022

Generalized Frank-Wolfe Algorithm for Bilevel Optimization

In this paper, we study a class of bilevel optimization problems, also k...
03/14/2021

Transient growth of accelerated first-order methods for strongly convex optimization problems

Optimization algorithms are increasingly being used in applications with...
06/21/2022

A Single-Timescale Analysis For Stochastic Approximation With Multiple Coupled Sequences

Stochastic approximation (SA) with multiple coupled sequences has found ...
