Optimal Computational Trade-Off of Inexact Proximal Methods

10/18/2012
by Pierre Machart, et al.

In this paper, we investigate the trade-off between convergence rate and computational cost when minimizing a composite functional with proximal-gradient methods, which are popular optimization tools in machine learning. We consider the case where the proximity operator is computed via an iterative procedure, which provides only an approximation of the exact proximity operator; this yields algorithms with two nested loops. We show that the strategy that minimizes the computational cost to reach a solution of a desired accuracy in finite time is to set the number of inner iterations to a constant, which differs from the strategy suggested by a convergence-rate analysis. In the process, we also present a new procedure, SIP (Speedy Inexact Proximal-gradient algorithm), that is both computationally efficient and easy to implement. Our numerical experiments confirm the theoretical findings and suggest that SIP can be a very competitive alternative to the standard procedure.
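To make the nested-loop structure concrete, here is a minimal, hypothetical sketch in Python. It is not the paper's actual SIP implementation: the problem instance (TV-regularized least squares), the inner dual-ascent solver, and names such as inexact_prox_tv are illustrative assumptions. The proximity operator of the regularizer has no closed form, so it is approximated by a constant number K of inner iterations, mirroring the constant-inner-loop strategy the abstract describes.

import numpy as np

# Hypothetical sketch (not the paper's code): inexact proximal gradient for
#   minimize  f(x) + g(x),  f(x) = 0.5*||A x - b||^2,  g(x) = lam*||D x||_1,
# where prox of g has no closed form and is approximated by a *constant*
# number K of inner iterations.

def inexact_prox_tv(v, tau, lam, K, D):
    """Approximate prox_{tau*g}(v), g = lam*||D.||_1, with K projected-gradient
    steps on the dual (Chambolle-style). K is held constant on purpose."""
    p = np.zeros(D.shape[0])                      # dual variable
    for _ in range(K):
        u = v - tau * lam * (D.T @ p)             # primal point induced by p
        # 1/L step on the smooth dual; ||D||^2 <= 4 for the difference operator
        p = np.clip(p + (D @ u) / (4.0 * tau * lam), -1.0, 1.0)
    return v - tau * lam * (D.T @ p)

def inexact_proximal_gradient(A, b, lam, K, n_outer=300):
    n = A.shape[1]
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]      # discrete difference operator
    tau = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L, L = Lipschitz of grad f
    x = np.zeros(n)
    for _ in range(n_outer):                      # outer proximal-gradient loop
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = inexact_prox_tv(x - tau * grad, tau, lam, K, D)
    return x

# Usage: recover a piecewise-constant signal with a fixed inner budget K.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 30))
x_true = np.repeat([0.0, 1.0, -0.5], 10)
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = inexact_proximal_gradient(A, b, lam=0.1, K=5)
print(np.round(x_hat, 2))

Holding K constant across outer iterations matches the strategy the paper identifies as cost-optimal; the alternative suggested by a pure convergence-rate analysis would increase the inner accuracy (and hence K) as the outer loop progresses, at a higher total computational cost.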


