Optimal rates for first-order stochastic convex optimization under Tsybakov noise condition

07/12/2012
by Aaditya Ramdas et al.

We focus on the problem of minimizing a convex function f over a convex set S given T queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined only by the rate of growth of the function around its minimizer x^*_{f,S}, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if f grows at least as fast as ‖x - x^*_{f,S}‖^κ around its minimum, for some κ > 1, then the optimal rate of learning f(x^*_{f,S}) is Θ(T^{-κ/(2κ-2)}). The classic rates Θ(1/√T) for convex functions and Θ(1/T) for strongly convex functions are special cases of our result for κ → ∞ and κ = 2, and even faster rates are attained for κ < 2. We also derive tight bounds for the complexity of learning x^*_{f,S}, where the optimal rate is Θ(T^{-1/(2κ-2)}). Interestingly, these precise rates for convex optimization also characterize the complexity of active learning, and our results further strengthen the connections between the two fields, both of which rely on feedback-driven queries.
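As a quick check of the stated special cases, the exponent κ/(2κ-2) can be evaluated directly; this worked calculation is an illustration rather than material from the paper, and the choice κ = 3/2 below is only an example of a value with κ < 2:

\[
  \left.\frac{\kappa}{2\kappa-2}\right|_{\kappa=2} = 1
  \;\Rightarrow\; \Theta(T^{-1}),
  \qquad
  \lim_{\kappa\to\infty}\frac{\kappa}{2\kappa-2} = \frac{1}{2}
  \;\Rightarrow\; \Theta(T^{-1/2}) = \Theta(1/\sqrt{T}),
  \qquad
  \left.\frac{\kappa}{2\kappa-2}\right|_{\kappa=3/2} = \frac{3}{2}
  \;\Rightarrow\; \Theta(T^{-3/2}).
\]

The first two cases recover the familiar strongly convex and general convex rates, while the third shows how growth faster than quadratic (κ < 2) yields a rate faster than 1/T.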
