RECAPP: Crafting a More Efficient Catalyst for Convex Optimization

06/17/2022
by Yair Carmon, et al.

The accelerated proximal point algorithm (APPA), also known as "Catalyst", is a well-established reduction from convex optimization to approximate proximal point computation (i.e., regularized minimization). This reduction is conceptually elegant and yields strong convergence rate guarantees. However, these rates feature an extraneous logarithmic term arising from the need to compute each proximal point to high accuracy. In this work, we propose a novel Relaxed Error Criterion for Accelerated Proximal Point (RECAPP) that eliminates the need for high accuracy subproblem solutions. We apply RECAPP to two canonical problems: finite-sum and max-structured minimization. For finite-sum problems, we match the best known complexity, previously obtained by carefully-designed problem-specific algorithms. For minimizing max_y f(x,y) where f is convex in x and strongly-concave in y, we improve on the best known (Catalyst-based) bound by a logarithmic factor.
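To make the reduction concrete, below is a minimal NumPy sketch of a generic accelerated proximal point (Catalyst/APPA-style) outer loop, not the RECAPP method itself. The inner gradient-descent subproblem solver, the fixed momentum schedule beta_k = (k-1)/(k+2), the step sizes, the choice of regularization kappa, and the toy quadratic objective are all illustrative assumptions; the point is only to show the structure the abstract describes, where each outer step approximately solves a regularized minimization.

```python
import numpy as np

def approx_prox(grad_f, center, kappa, lr, x0, inner_steps=100):
    """Approximately solve the proximal subproblem
        min_x f(x) + (kappa/2) * ||x - center||^2
    with plain gradient descent (illustrative inner solver; the accuracy
    required of this step is exactly what RECAPP relaxes)."""
    x = x0.copy()
    for _ in range(inner_steps):
        g = grad_f(x) + kappa * (x - center)  # gradient of the regularized objective
        x = x - lr * g
    return x

def accelerated_proximal_point(grad_f, x0, kappa, lr, outer_iters=50):
    """Generic Catalyst/APPA-style outer loop (sketch):
    extrapolate, compute an approximate proximal point, repeat."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, outer_iters + 1):
        beta = (k - 1) / (k + 2)       # simple momentum schedule (assumed)
        y = x + beta * (x - x_prev)    # extrapolated proximal center
        x_prev = x
        x = approx_prox(grad_f, y, kappa, lr, x0=y)
    return x

if __name__ == "__main__":
    # Toy problem: f(x) = 0.5 * x^T A x - b^T x with A positive definite.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 10))
    A, b = M.T @ M, rng.standard_normal(10)
    grad_f = lambda x: A @ x - b
    kappa = 1.0
    L = np.linalg.eigvalsh(A).max()    # smoothness constant of f
    lr = 1.0 / (L + kappa)             # step size for the regularized subproblem
    x_hat = accelerated_proximal_point(grad_f, np.zeros(10), kappa, lr)
    print("distance to optimum:", np.linalg.norm(x_hat - np.linalg.solve(A, b)))
```

In this classical setup, the overall rate depends on how kappa trades outer iterations against inner-solver work, and the extra logarithmic factor mentioned in the abstract comes from requiring each subproblem to be solved to high accuracy; RECAPP's relaxed error criterion is designed to remove that requirement.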

