Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization

06/17/2022
by Simon S. Du, et al.

We consider the smooth convex-concave bilinearly-coupled saddle-point problem, min_𝐱max_𝐲 F(𝐱) + H(𝐱,𝐲) - G(𝐲), where one has access to stochastic first-order oracles for F, G as well as the bilinear coupling function H. Building upon standard stochastic extragradient analysis for variational inequalities, we present a stochastic accelerated gradient-extragradient (AG-EG) descent-ascent algorithm that combines extragradient and Nesterov's acceleration in general stochastic settings. This algorithm leverages scheduled restarting to admit a fine-grained nonasymptotic convergence rate that matches known lower bounds by both <cit.> and <cit.> in their corresponding settings, plus an additional statistical error term for bounded stochastic noise that is optimal up to a constant prefactor. This is the first result that achieves such a relatively mature characterization of optimality in saddle-point optimization.
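To make the setting concrete, here is a minimal sketch of a plain stochastic extragradient loop on a bilinearly-coupled toy instance with H(x, y) = x^T A y and the illustrative choices F(x) = (mu/2)||x||^2 and G(y) = (mu/2)||y||^2. This is standard extragradient only, not the paper's AG-EG method, which additionally layers Nesterov acceleration and scheduled restarting on top of such steps; the matrix A, the step size eta, and the noise level are assumptions made for the example.

```python
# Minimal sketch (assumed setup): stochastic extragradient for
#   min_x max_y  F(x) + x^T A y - G(y),  F(x) = (mu/2)||x||^2,  G(y) = (mu/2)||y||^2.
# Plain extragradient only; the paper's AG-EG method adds Nesterov-style
# acceleration and scheduled restarting, which are omitted here.
import numpy as np

rng = np.random.default_rng(0)
d, mu, noise_std, eta, T = 5, 1.0, 0.1, 0.1, 2000

A = rng.standard_normal((d, d))   # illustrative bilinear coupling matrix
x, y = np.ones(d), np.ones(d)

def grad_x(x, y):
    # Stochastic gradient in x of F(x) + x^T A y (noise models the first-order oracle).
    return mu * x + A @ y + noise_std * rng.standard_normal(d)

def grad_y(x, y):
    # Stochastic gradient in y of F(x) + x^T A y - G(y).
    return A.T @ x - mu * y + noise_std * rng.standard_normal(d)

for t in range(T):
    # Extrapolation (look-ahead) step.
    x_half = x - eta * grad_x(x, y)
    y_half = y + eta * grad_y(x, y)
    # Update step, using gradients evaluated at the look-ahead point.
    x = x - eta * grad_x(x_half, y_half)
    y = y + eta * grad_y(x_half, y_half)

# The saddle point of this toy instance is (0, 0); the iterates should hover
# near it within a radius set by the oracle noise.
print("||x||, ||y|| at the end:", np.linalg.norm(x), np.linalg.norm(y))
```

In the paper's algorithm, extragradient steps of this kind are combined with Nesterov acceleration and restarted on a schedule, which is what yields the fine-grained nonasymptotic rate and the optimal statistical error term described above; see the full text for the exact update rules.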


Related research

10/31/2022
Nesterov Meets Optimism: Rate-Optimal Optimistic-Gradient-Based Method for Stochastic Bilinearly-Coupled Minimax Optimization
We provide a novel first-order optimization algorithm for bilinearly-cou...

01/23/2019
A Universally Optimal Multistage Accelerated Stochastic Gradient Method
We study the problem of minimizing a strongly convex and smooth function...

11/12/2020
Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration
We show that standard extragradient methods (i.e. mirror prox and dual e...

10/08/2019
Variance-Reduced Decentralized Stochastic Optimization with Gradient Tracking – Part II: GT-SVRG
Decentralized stochastic optimization has recently benefited from gradie...

03/15/2021
DIPPA: An improved Method for Bilinear Saddle Point Problems
This paper studies bilinear saddle point problems min_x max_y g(x) + x^⊤A...

02/17/2016
Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
We consider the optimization of a quadratic objective function whose gra...
