On the Proximal Gradient Algorithm with Alternated Inertia

01/17/2018
by Franck Iutzeler et al.

In this paper, we investigate the attractive properties of the proximal gradient algorithm with inertia. Notably, we show that using alternated inertia yields monotonically decreasing functional values, which is not the case for the usual accelerated proximal gradient methods. We also provide convergence rates for the algorithm with alternated inertia, based on local geometric properties of the objective function. The results are put into perspective by discussions of several extensions and by illustrations on common regularized problems.
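To make the idea concrete, here is a minimal sketch of a proximal gradient iteration with alternated inertia, applied to the lasso problem (least squares plus an l1 penalty). This is an illustrative reconstruction, not the authors' code: the extrapolation parameter `beta`, the problem data, and the choice to extrapolate on even iterations only are assumptions made for the example, with the inertial step skipped every other iteration as the abstract describes.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def alternated_inertial_prox_grad(A, b, lam, n_iter=200, beta=0.5):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with alternated inertia.

    The inertial (extrapolation) step is applied only on even
    iterations; odd iterations are plain proximal gradient steps.
    """
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L, L = ||A||_2^2
    x_prev = x = np.zeros(A.shape[1])
    for k in range(n_iter):
        # Alternated inertia: extrapolate every other iteration only.
        y = x + beta * (x - x_prev) if k % 2 == 0 else x
        # Forward (gradient) step on the smooth part, then prox on the l1 part.
        x_prev, x = x, soft_threshold(y - gamma * A.T @ (A @ y - b), gamma * lam)
    return x
```

With inertia applied at every iteration (as in FISTA-type schemes), the functional values can oscillate; the alternation above is what the paper shows restores monotone decrease.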


Related research

01/23/2018 · On the complexity of convex inertial proximal algorithms
The inertial proximal gradient algorithm is efficient for the composite ...

11/09/2022 · Regularized Rényi divergence minimization through Bregman proximal gradient algorithms
We study the variational inference problem of minimizing a regularized R...

10/11/2019 · General Proximal Incremental Aggregated Gradient Algorithms: Better and Novel Results under General Scheme
The incremental aggregated gradient algorithm is popular in network opti...

12/19/2017 · Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs
A regularized optimization problem over a large unstructured graph is st...

02/09/2021 · Proximal Gradient Descent-Ascent: Variable Convergence under KŁ Geometry
The gradient descent-ascent (GDA) algorithm has been widely applied to s...

11/12/2012 · Proximal Stochastic Dual Coordinate Ascent
We introduce a proximal version of the dual coordinate ascent method. We dem...

12/04/2018 · A probabilistic incremental proximal gradient method
In this paper, we propose a probabilistic optimization method, named pro...
