A simpler approach to obtaining an O(1/t) convergence rate for the projected stochastic subgradient method

12/10/2012
by Simon Lacoste-Julien, et al.

In this note, we present a new averaging technique for the projected stochastic subgradient method. By using a weighted average that assigns weight t+1 to the iterate w_t at iteration t, we obtain an O(1/t) convergence rate with both a simple proof and a simple implementation. The new scheme is compared empirically to existing techniques and shows similar performance.
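The weighted average can be maintained online without storing past iterates. The sketch below applies the scheme to a toy strongly convex problem; the step-size schedule 2/(λ(t+1)), the oracle, the projection, and the target vector are illustrative assumptions, not taken from the note itself.

```python
import numpy as np

def projected_weighted_sgd(grad_oracle, project, dim, lam, num_iters, seed=0):
    """Projected stochastic subgradient method with (t+1)-weighted averaging.

    grad_oracle(w, rng) returns a stochastic subgradient at w,
    project(w) projects onto the feasible set, and lam is an assumed
    strong-convexity constant used in the step-size schedule.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    w_avg = np.zeros(dim)
    weight_sum = 0.0
    for t in range(num_iters):
        g = grad_oracle(w, rng)
        step = 2.0 / (lam * (t + 1))      # assumed schedule for the strongly convex case
        w = project(w - step * g)
        # iterate w_t receives weight t+1; update the running weighted mean in place
        weight_sum += t + 1
        w_avg += (t + 1) / weight_sum * (w - w_avg)
    return w_avg

# Toy problem (hypothetical): minimize E||w - x||^2 over the unit ball,
# where x is the target plus Gaussian noise; the objective is 2-strongly convex.
target = np.array([0.6, 0.8, 1.2])

def grad(w, rng):
    x = target + 0.1 * rng.standard_normal(target.shape)
    return 2.0 * (w - x)

def proj(w):
    n = np.linalg.norm(w)
    return w if n <= 1.0 else w / n

w_bar = projected_weighted_sgd(grad, proj, 3, lam=2.0, num_iters=5000)
```

Since the target lies outside the unit ball, the constrained minimizer is the target's radial projection onto the ball, and the weighted average `w_bar` should land close to it.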


Related research

- Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity (12/12/2017). We generalize the classic convergence rate theory for subgradient method...
- A note on estimating Bass model parameters (03/10/2022). Bass (1969) proposed a model (the Bass model) for the timing of adoption...
- An Online Riemannian PCA for Stochastic Canonical Correlation Analysis (06/08/2021). We present an efficient stochastic algorithm (RSG+) for canonical correl...
- Arbitrary Rates of Convergence for Projected and Extrinsic Means (10/24/2019). We study central limit theorems for the projected sample mean of indepen...
- Accelerating Frank-Wolfe via Averaging Step Directions (05/24/2022). The Frank-Wolfe method is a popular method in sparse constrained optimiz...
- Convergence Rate of Multiple-try Metropolis Independent Sampler (11/30/2021). The Multiple-try Metropolis (MTM) method is an interesting extension of ...
- A Fast Projected Fixed-Point Algorithm for Large Graph Matching (07/03/2012). We propose a fast approximate algorithm for large graph matching. A new ...
