Geometric descent method for convex composite minimization

12/29/2016
by Shixiang Chen, et al.

In this paper, we extend the geometric descent method recently proposed by Bubeck, Lee and Singh to tackle nonsmooth and strongly convex composite problems. We prove that our proposed algorithm, dubbed geometric proximal gradient method (GeoPG), converges with a linear rate (1-1/√(κ)) and thus achieves the optimal rate among first-order methods, where κ is the condition number of the problem. Numerical results on linear regression and logistic regression with elastic net regularization show that GeoPG compares favorably with Nesterov's accelerated proximal gradient method, especially when the problem is ill-conditioned.
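For a concrete picture of the problem class, the sketch below sets up the composite objective the abstract describes, a smooth least-squares loss plus a nonsmooth elastic net regularizer, and runs plain proximal gradient on it. This is a baseline illustration only, not the authors' GeoPG update; the names `A`, `b`, `lam1`, and `lam2` are illustrative assumptions, not notation from the paper.

```python
# Minimal sketch of the composite setting: minimize f(x) + h(x), where
# f(x) = 0.5*||Ax - b||^2 is smooth and h(x) = lam1*||x||_1 +
# (lam2/2)*||x||_2^2 is the nonsmooth elastic net term. This is plain
# proximal gradient (ISTA), NOT the GeoPG method of the paper.
import numpy as np

def prox_elastic_net(v, step, lam1, lam2):
    # Proximal operator of h: soft-threshold for the l1 part,
    # then shrink for the l2 part.
    return np.sign(v) * np.maximum(np.abs(v) - step * lam1, 0.0) / (1.0 + step * lam2)

def proximal_gradient(A, b, lam1, lam2, n_iters=500):
    # Baseline iteration x+ = prox_{t*h}(x - t*grad f(x)). On strongly
    # convex problems this converges linearly at roughly a (1 - 1/kappa)
    # rate, versus the optimal (1 - 1/sqrt(kappa)) the paper proves
    # for GeoPG.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = prox_elastic_net(x - step * grad, step, lam1, lam2)
    return x

# Example usage on synthetic data:
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
x_hat = proximal_gradient(A, b, lam1=0.1, lam2=0.1)
```

The gap between the two rates above is exactly why acceleration matters on ill-conditioned problems: when kappa is large, 1/sqrt(kappa) is much larger than 1/kappa, which matches the abstract's observation that GeoPG's advantage over simpler methods grows with the condition number.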


