DIPPA: An Improved Method for Bilinear Saddle Point Problems

03/15/2021 ∙ by Guangzeng Xie, et al.

This paper studies bilinear saddle point problems min_x max_y g(x) + x^⊤Ay - h(y), where the functions g, h are smooth and strongly convex. When gradient and proximal oracles for g and h are accessible, optimal algorithms have already been developed in the literature <cit.>. However, the proximal operator is not always easy to compute, especially in constrained zero-sum matrix games <cit.>. This work proposes a new algorithm that only requires access to the gradients of g and h. The algorithm achieves a complexity upper bound of 𝒪̃( ‖A‖_2/√(μ_x μ_y) + √(κ_x κ_y (κ_x + κ_y))), which has optimal dependency on the coupling condition number ‖A‖_2/√(μ_x μ_y) up to logarithmic factors.
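To make the problem class concrete, the sketch below sets up a toy instance of min_x max_y g(x) + x^⊤Ay - h(y) with quadratic g and h, and solves it using only gradient access via the standard extragradient method. This is an illustration of the gradient-only oracle model, not the DIPPA algorithm itself; the step size and problem dimensions are arbitrary choices.

```python
import numpy as np

# Toy bilinear saddle point problem:
#   min_x max_y  g(x) + x^T A y - h(y)
# with g(x) = (mu_x/2)||x||^2 and h(y) = (mu_y/2)||y||^2,
# so g and h are smooth and strongly convex and the unique
# saddle point is (0, 0). Solved with plain extragradient,
# which uses only the gradients of g and h (no proximal steps).
rng = np.random.default_rng(0)
n, m = 4, 3
A = rng.standard_normal((n, m))
mu_x, mu_y = 1.0, 1.0

def grad_x(x, y):
    # gradient of the objective with respect to x
    return mu_x * x + A @ y

def grad_y(x, y):
    # gradient of the objective with respect to y (ascent direction)
    return A.T @ x - mu_y * y

x = rng.standard_normal(n)
y = rng.standard_normal(m)
eta = 0.1  # step size chosen small enough for this toy instance
for _ in range(2000):
    # extragradient: a lookahead step, then a full step
    # using the gradients evaluated at the lookahead point
    xh = x - eta * grad_x(x, y)
    yh = y + eta * grad_y(x, y)
    x = x - eta * grad_x(xh, yh)
    y = y + eta * grad_y(xh, yh)

# distance to the saddle point (0, 0)
print(np.linalg.norm(x) + np.linalg.norm(y))
```

Extragradient converges linearly on strongly-convex–strongly-concave instances like this one, but its rate degrades with the coupling term ‖A‖_2/√(μ_x μ_y); improving that dependency is exactly what the complexity bound above addresses.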





