Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss

05/04/2021, by Yair Carmon et al.

We characterize the complexity of minimizing max_i∈[N] f_i(x) for convex, Lipschitz functions f_1,…, f_N. For non-smooth functions, existing methods require O(Nϵ^-2) queries to a first-order oracle to compute an ϵ-suboptimal point and Õ(Nϵ^-1) queries if the f_i are O(1/ϵ)-smooth. We develop methods with improved complexity bounds of Õ(Nϵ^-2/3 + ϵ^-8/3) in the non-smooth case and Õ(Nϵ^-2/3 + √(N)ϵ^-1) in the O(1/ϵ)-smooth case. Our methods consist of a recently proposed ball optimization oracle acceleration algorithm (which we refine) and a careful implementation of said oracle for the softmax function. We also prove an oracle complexity lower bound scaling as Ω(Nϵ^-2/3), showing that our dependence on N is optimal up to polylogarithmic factors.
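The softmax function mentioned in the abstract is the standard log-sum-exp smoothing of the maximum: softmax_μ(x) = μ log Σ_i exp(f_i(x)/μ) satisfies max_i f_i(x) ≤ softmax_μ(x) ≤ max_i f_i(x) + μ log N, so choosing μ ≈ ϵ/log N yields an O(1/ϵ)-smooth ϵ-accurate surrogate. A minimal sketch of this surrogate (the variable names and the linear-loss example are illustrative, not taken from the paper):

```python
import numpy as np

def softmax_value(fx, mu):
    """Log-sum-exp smoothing of max_i fx[i] with parameter mu > 0.

    Satisfies max(fx) <= softmax_value(fx, mu) <= max(fx) + mu * log(N),
    so mu = eps / log(N) gives an eps-accurate smooth surrogate of the max.
    """
    m = np.max(fx)  # subtract the max for numerical stability
    return m + mu * np.log(np.sum(np.exp((fx - m) / mu)))

# Illustrative example: N linear losses f_i(x) = a_i @ x + b_i
rng = np.random.default_rng(0)
N, d = 5, 3
A, b = rng.standard_normal((N, d)), rng.standard_normal(N)
x = np.zeros(d)
fx = A @ x + b
mu = 0.1
print(np.max(fx), softmax_value(fx, mu))  # smooth value upper-bounds the max
```

The shift by the maximum before exponentiating is the usual trick to avoid overflow when the ratios f_i(x)/μ are large, which is exactly the regime of interest since μ is small.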





