Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities

02/20/2020
by   Jelena Diakonikolas, et al.

We leverage the connections between nonexpansive maps, monotone Lipschitz operators, and proximal mappings to obtain near-optimal (i.e., optimal up to poly-log factors in terms of iteration complexity) and parameter-free methods for solving monotone inclusion problems. These results immediately translate into near-optimal guarantees for approximating strong solutions to variational inequality problems, approximating convex-concave min-max optimization problems, and minimizing the norm of the gradient in min-max optimization problems. Our analysis is based on a novel and simple potential-based proof of convergence of Halpern iteration, a classical iteration for finding fixed points of nonexpansive maps. Additionally, we provide a series of algorithmic reductions that highlight connections between different problem classes and lead to lower bounds that certify near-optimality of the studied methods.
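To make the central object concrete: Halpern iteration anchors each step toward the starting point, computing x_{k+1} = λ_k·x_0 + (1 − λ_k)·T(x_k) for a nonexpansive map T, with the classical schedule λ_k = 1/(k+2). The fixed-point residual ‖x_k − T(x_k)‖ is known to decay at an O(1/k) rate. A minimal sketch, using a 90-degree rotation as an illustrative nonexpansive map (this example map is our choice, not one from the paper):

```python
import math

def T(x):
    # An illustrative nonexpansive map: rotation by 90 degrees about
    # the origin. It is an isometry, and its unique fixed point is (0, 0).
    return (-x[1], x[0])

def halpern(x0, num_iters):
    # Halpern iteration: x_{k+1} = lam_k * x0 + (1 - lam_k) * T(x_k),
    # with the classical anchoring schedule lam_k = 1/(k+2).
    x = x0
    for k in range(num_iters):
        lam = 1.0 / (k + 2)
        tx = T(x)
        x = (lam * x0[0] + (1 - lam) * tx[0],
             lam * x0[1] + (1 - lam) * tx[1])
    return x

x0 = (1.0, 0.0)
x = halpern(x0, 2000)
tx = T(x)
# The fixed-point residual ||x - T(x)|| shrinks at the O(1/k) rate
# known for Halpern iteration with this schedule.
residual = math.hypot(x[0] - tx[0], x[1] - tx[1])
```

The anchoring weight λ_k decaying like 1/k is what distinguishes Halpern iteration from the plain fixed-point (Picard) iteration x_{k+1} = T(x_k), which need not converge for merely nonexpansive maps (e.g., it cycles forever for the rotation above).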


