Beyond the Golden Ratio for Variational Inequality Algorithms

12/28/2022
by Ahmet Alacaoglu, et al.

We improve the understanding of the golden ratio algorithm, which solves monotone variational inequalities (VI) and convex-concave min-max problems via the distinctive feature of adapting its step sizes to the local Lipschitz constants. Adaptive step sizes not only eliminate the need to tune hyperparameters, but they also remove the requirement of global Lipschitz continuity and can increase from one iteration to the next. We first establish the equivalence of this algorithm with popular VI methods such as reflected gradient, Popov, or optimistic gradient descent-ascent in the unconstrained case with constant step sizes. We then move to the constrained setting and introduce a new analysis that allows the use of larger step sizes, completing the bridge between the golden ratio algorithm and the existing algorithms in the literature. In doing so, we in fact eliminate the link between the golden ratio (1+√5)/2 and the algorithm. Moreover, we improve the adaptive version of the algorithm, first by removing the maximum step size hyperparameter (an artifact of the analysis) to improve the complexity bound, and second by adjusting it to nonmonotone problems with weak Minty solutions, obtaining superior empirical performance.
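For context, below is a minimal NumPy sketch of the adaptive golden ratio algorithm (aGRAAL) that this work builds on. It is not the paper's exact method: the step-size rule is a simplified version of the original adaptive variant (Malitsky, 2020), and `F`, `proj`, `la0`, `la_max`, and the example problem are illustrative placeholders. In particular, the sketch keeps the maximum step size cap `la_max`, which the improved adaptive version proposed in this paper removes.

```python
import numpy as np

def adaptive_graal(F, proj, x0, iters=1000, la0=1e-3, la_max=1e6):
    """Sketch of the adaptive golden ratio algorithm for a monotone VI:
    find x* in C such that <F(x*), x - x*> >= 0 for all x in C.

    F    : the VI operator (e.g., the gradient field of a min-max problem)
    proj : Euclidean projection onto the feasible set C
    """
    phi = (1 + np.sqrt(5)) / 2          # the golden ratio
    rho = 1 / phi + 1 / phi ** 2        # step-size growth factor
    x = np.asarray(x0, dtype=float).copy()
    x_bar = x.copy()
    la = la0
    Fx = F(x)
    for _ in range(iters):
        x_bar = ((phi - 1) * x + x_bar) / phi   # convex averaging step
        x_new = proj(x_bar - la * Fx)           # projected operator step
        Fx_new = F(x_new)
        # step size adapted to a local estimate of the Lipschitz constant
        num = np.linalg.norm(x_new - x) ** 2
        den = np.linalg.norm(Fx_new - Fx) ** 2 + 1e-16
        la = min(rho * la, phi * num / (4 * la * den), la_max)
        x, Fx = x_new, Fx_new
    return x

# Example: z = (x, y) for the bilinear problem min_x max_y x^T A y on [-1, 1]^5
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
F = lambda z: np.concatenate([A @ z[5:], -A.T @ z[:5]])
proj = lambda z: np.clip(z, -1.0, 1.0)
z_star = adaptive_graal(F, proj, np.ones(10))
```

Because the step size is recomputed from consecutive iterates, it can grow between iterations and no global Lipschitz constant of F needs to be known in advance, which is the feature the abstract highlights.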


