Stochastic Conditional Gradient++

02/19/2019
by   Hamed Hassani, et al.

In this paper, we develop Stochastic Continuous Greedy++ (SCG++), the first efficient variant of a conditional gradient method for maximizing a continuous submodular function subject to a convex constraint. Concretely, for a monotone and continuous DR-submodular function, SCG++ achieves a tight [(1-1/e)OPT - ϵ] solution while using O(1/ϵ^2) stochastic oracle queries and O(1/ϵ) calls to the linear optimization oracle. The best previously known algorithms either achieve a suboptimal [(1/2)OPT - ϵ] solution with O(1/ϵ^2) stochastic gradients or the tight [(1-1/e)OPT - ϵ] solution with a suboptimal O(1/ϵ^3) stochastic gradients. SCG++ is thus optimal in terms of both the approximation guarantee and the number of stochastic oracle queries. Our novel variance reduction method naturally extends to stochastic convex minimization. More precisely, we develop Stochastic Frank-Wolfe++ (SFW++), which achieves an ϵ-approximate optimum with only O(1/ϵ) calls to the linear optimization oracle while using O(1/ϵ^2) stochastic oracle queries in total. Therefore, SFW++ is the first efficient projection-free algorithm that achieves the optimal complexity of O(1/ϵ^2) stochastic oracle queries.
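To make the algorithmic template concrete, below is a minimal sketch of the stochastic continuous greedy loop that SCG++ builds on. The oracle names `stoch_grad` and `lmo`, the momentum weight `rho`, and the toy problem are illustrative assumptions; in particular, the simple momentum-averaged gradient estimate stands in for the paper's variance-reduction estimator and is not the SCG++ estimator itself.

```python
import numpy as np

def stochastic_continuous_greedy(stoch_grad, lmo, dim, num_iters=100):
    """Sketch of a stochastic continuous greedy loop (not the exact SCG++ method).

    stoch_grad(x): returns an unbiased stochastic gradient of f at x.
    lmo(g): solves max_{v in K} <g, v> over the convex constraint set K.
    """
    x = np.zeros(dim)          # start at the origin (assumed to lie in K)
    d = np.zeros(dim)          # running gradient estimate
    for t in range(1, num_iters + 1):
        rho = 2.0 / (t + 2)    # averaging weight (illustrative choice)
        d = (1 - rho) * d + rho * stoch_grad(x)  # smooth the noisy gradients
        v = lmo(d)             # one call to the linear optimization oracle
        x = x + v / num_iters  # step 1/T keeps the final iterate inside K
    return x

# Toy usage: maximize f(x) = sum(log(1 + x)), a monotone DR-submodular
# function, over the polytope {0 <= x <= 1, sum(x) <= 2}.
rng = np.random.default_rng(0)
grad = lambda x: 1.0 / (1.0 + x) + 0.1 * rng.standard_normal(x.shape)

def lmo(g):
    v = np.zeros_like(g)
    v[np.argsort(g)[-2:]] = 1.0  # top-2 coordinates saturate the budget
    return v

x_hat = stochastic_continuous_greedy(grad, lmo, dim=5)
```

The same loop structure underlies SFW++ for convex minimization, with the oracle direction chosen to minimize rather than maximize the linear objective; the paper's contribution is a gradient estimator whose variance decays fast enough to reach the stated O(1/ϵ^2) query complexity.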
