Potential Function-based Framework for Making the Gradients Small in Convex and Min-Max Optimization

01/28/2021
by Jelena Diakonikolas, et al.

Making the gradients small is a fundamental optimization problem that has so far eluded the unifying and simple convergence arguments available in first-order optimization for other convergence criteria, such as reducing the optimality gap. We introduce a novel potential function-based framework to study the convergence of standard methods for making the gradients small in smooth convex optimization and convex-concave min-max optimization. Our framework is intuitive and provides a lens for viewing algorithms that make the gradients small as being driven by a trade-off between reducing either the gradient norm or a certain notion of an optimality gap. On the lower bounds side, we discuss the tightness of the obtained convergence results for the convex setup and provide a new lower bound for minimizing the norm of cocoercive operators, which allows us to argue about the optimality of methods in the min-max setup.
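To make the potential-function viewpoint concrete, here is a minimal schematic sketch; the specific form of the potential and the coefficients $a_k$, $b_k$ are illustrative assumptions, not the paper's exact construction. For a smooth convex $f$ with minimizer $x^\star$, consider a potential that weights the optimality gap against the squared gradient norm:

$$\Phi_k \;=\; a_k\,\bigl(f(x_k) - f(x^\star)\bigr) \;+\; b_k\,\|\nabla f(x_k)\|^2, \qquad a_k,\, b_k \ge 0.$$

If a method's update guarantees $\Phi_{k+1} \le \Phi_k$, then, since both terms are nonnegative, telescoping yields $\|\nabla f(x_k)\|^2 \le \Phi_0 / b_k$. The rate at which the gradient norm shrinks is thus dictated by how fast $b_k$ can grow while the potential remains non-increasing, and growing $b_k$ faster tightens the constraint imposed by the optimality-gap term; this is the trade-off the framework makes explicit.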


