Computational complexity of Inexact Proximal Point Algorithm for Convex Optimization under Holderian Growth

08/10/2021
by Andrei Patrascu, et al.

Several decades ago the Proximal Point Algorithm (PPA) started to attract long-lasting interest from both the abstract operator theory and the numerical optimization communities. Even in modern applications, researchers still use proximal minimization theory to design scalable algorithms that overcome nonsmoothness. Remarkable works such as <cit.> established tight relations between the convergence behaviour of PPA and the regularity of the objective function. In this manuscript we derive the nonasymptotic iteration complexity of exact and inexact PPA for minimizing convex functions under γ-Holderian growth: O(log(1/ϵ)) for γ ∈ [1,2] and O(1/ϵ^(γ-2)) for γ > 2. In particular, we recover well-known results on PPA: finite convergence for sharp minima and linear convergence under quadratic growth, even in the presence of inexactness. However, without accounting for the concrete computational effort spent on each PPA iteration, any iteration complexity remains abstract and purely informative. Therefore, using an inner (proximal) gradient/subgradient subroutine to compute each inexact PPA iteration, we further establish novel computational complexity bounds for a restarted inexact PPA, which hold even when no information on the growth of the objective function is available. Numerical experiments confirm the practical performance and implementability of our framework.
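To make the setting concrete: γ-Holderian growth means f(x) - f* ≥ σ_f · dist(x, X*)^γ near the solution set, with γ = 1 corresponding to sharp minima and γ = 2 to quadratic growth. The sketch below is an illustrative inexact PPA in Python, not the paper's exact scheme: each outer step approximately evaluates prox_{μf}(x_k) by running an inner subgradient method on the regularized subproblem. All function names, step sizes, and iteration budgets here are assumptions made for illustration.

```python
# Minimal illustrative sketch of an inexact Proximal Point Algorithm (PPA).
# Each outer step approximately solves the proximal subproblem
#     min_x  f(x) + (1/(2*mu)) * ||x - x_k||^2
# with a plain subgradient method (the inner subroutine is a placeholder,
# not the paper's specific inner solver).
import numpy as np

def inexact_ppa(f, subgrad_f, x0, mu=1.0, outer_iters=50, inner_iters=200):
    """Inexact PPA: approximate prox steps via an inner subgradient method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        center = x.copy()
        z = x.copy()
        z_best, best_val = z.copy(), np.inf
        for t in range(1, inner_iters + 1):
            # Subgradient of the regularized subproblem at z.
            g = subgrad_f(z) + (z - center) / mu
            # Classical normalized, diminishing subgradient step.
            step = 1.0 / (np.sqrt(t) * (np.linalg.norm(g) + 1e-12))
            z = z - step * g
            val = f(z) + np.linalg.norm(z - center) ** 2 / (2 * mu)
            if val < best_val:
                best_val, z_best = val, z.copy()
        x = z_best  # inexact proximal point, used as the next outer iterate
    return x

# Example: f(x) = ||x||_1 has sharp minima (gamma = 1); exact PPA would
# converge to the origin in finitely many steps.
if __name__ == "__main__":
    f = lambda x: np.sum(np.abs(x))
    subgrad_f = lambda x: np.sign(x)
    print(inexact_ppa(f, subgrad_f, x0=np.array([3.0, -2.0, 0.5])))
```

A restarted variant in the spirit of the manuscript would rerun such an outer loop with progressively tighter inner accuracy instead of the fixed inner iteration budget used above.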

Related research

01/23/2018: On the complexity of convex inertial proximal algorithms
The inertial proximal gradient algorithm is efficient for the composite ...

08/22/2020: Fast Proximal Gradient Methods for Nonsmooth Convex Optimization for Tomographic Image Reconstruction
The Fast Proximal Gradient Method (FPGM) and the Monotone FPGM (MFPGM) f...

12/10/2021: Deep Q-Network with Proximal Iteration
We employ Proximal Iteration for value-function optimization in reinforc...

03/04/2022: Sharper Bounds for Proximal Gradient Algorithms with Errors
We analyse the convergence of the proximal gradient algorithm for convex...

03/15/2021: Lower Complexity Bounds of Finite-Sum Optimization Problems: The Results and Construction
The contribution of this paper includes two aspects. First, we study the...

06/17/2022: RECAPP: Crafting a More Efficient Catalyst for Convex Optimization
The accelerated proximal point algorithm (APPA), also known as "Catalyst...

09/03/2019: Complexity analysis of the Controlled Loosening-up (CLuP) algorithm
In our companion paper <cit.> we introduced a powerful mechanism that we...
