Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators

06/10/2020
by Mathieu Barré, et al.

Proximal operations are among the most common primitives appearing in both practical and theoretical (or high-level) optimization methods. This basic operation typically consists in solving an intermediary (hopefully simpler) optimization problem. In this work, we survey notions of inaccuracy that can be used when solving those intermediary optimization problems. Then, we show that worst-case guarantees for algorithms relying on such inexact proximal operations can be systematically obtained through a generic procedure based on semidefinite programming. This methodology is primarily based on the approach introduced by Drori and Teboulle (Mathematical Programming, 2014) and on convex interpolation results, and allows producing non-improvable worst-case analyses. In other words, for a given algorithm, the methodology generates both worst-case certificates (i.e., proofs) and problem instances on which those bounds are achieved. Relying on this methodology, we provide three new methods with conceptually simple proofs: (i) an optimized relatively inexact proximal point method, (ii) an extension of the hybrid proximal extragradient method of Monteiro and Svaiter (SIAM Journal on Optimization, 2013), and (iii) an inexact accelerated forward-backward splitting supporting backtracking line-search, where both (ii) and (iii) support possibly strongly convex objectives. Finally, we use the methodology to study a recent inexact variant of the Douglas-Rachford splitting due to Eckstein and Yao (Mathematical Programming, 2018). We showcase and compare the different variants of the accelerated inexact forward-backward method on a factorization problem and a total variation problem.
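
To make the notion of an inexact proximal operation concrete, the sketch below implements a generic relatively inexact proximal point iteration in Python: each subproblem prox_{λf}(x_k) is solved only approximately by inner gradient descent, and the approximate solution is accepted once a relative error criterion of the form ||λ∇f(x) + x − x_k|| ≤ σ||x − x_k|| holds. This is a minimal illustration of one classical relative-inexactness notion of the kind surveyed in the paper, not the paper's optimized method; the toy least-squares objective, step sizes, and tolerances below are assumptions made for the example.

```python
import numpy as np

# Minimal sketch of a relatively inexact proximal point method (an
# illustration of one classical inexactness notion, not the paper's
# optimized scheme). Each subproblem
#     prox_{lam*f}(x_bar) = argmin_x f(x) + ||x - x_bar||^2 / (2*lam)
# is solved by inner gradient descent until the residual
#     e = lam * grad_f(x) + (x - x_bar)
# satisfies the relative criterion ||e|| <= sigma * ||x - x_bar||.

def inexact_prox(grad_f, x_bar, lam, step, sigma=0.1, max_inner=1000):
    """Approximate prox_{lam*f}(x_bar) up to a relative error sigma."""
    x = x_bar.copy()
    for _ in range(max_inner):
        g = grad_f(x) + (x - x_bar) / lam   # gradient of the subproblem
        e = lam * g                         # proximal residual
        if np.linalg.norm(e) <= sigma * np.linalg.norm(x - x_bar) + 1e-12:
            break
        x = x - step * g                    # inner gradient step
    return x

def inexact_proximal_point(grad_f, x0, lam, step, sigma=0.1, iters=200):
    """Outer loop: repeatedly apply the inexact proximal operator."""
    x = x0.copy()
    for _ in range(iters):
        x = inexact_prox(grad_f, x, lam, step, sigma)
    return x

if __name__ == "__main__":
    # Toy convex objective f(x) = 0.5 * ||A x - b||^2 (chosen only for the demo).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    grad_f = lambda x: A.T @ (A @ x - b)

    lam = 1.0
    # Lipschitz constant of the subproblem gradient, used for a safe inner step size.
    L_sub = np.linalg.norm(A, 2) ** 2 + 1.0 / lam
    x_hat = inexact_proximal_point(grad_f, np.zeros(5), lam, step=1.0 / L_sub)

    x_star = np.linalg.lstsq(A, b, rcond=None)[0]
    print("distance to least-squares solution:", np.linalg.norm(x_hat - x_star))
```

Under this kind of relative criterion the outer iteration still approaches a minimizer of f; the contribution of the paper is to analyze such inexact schemes tightly and automatically via semidefinite programming, rather than the specific solver choices made in this toy example.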
