Improved Incremental First-Order Oracle Complexity of Variance Reduced Methods for Nonsmooth Convex Stochastic Composition Optimization

02/07/2018
by   Tianyi Lin, et al.

We consider the nonsmooth convex composition optimization problem in which the objective is a composition of two finite-sum functions, and we analyze stochastic compositional variance-reduced gradient (SCVRG) methods for it. SCVRG and its variants have recently drawn much attention given their edge over stochastic compositional gradient descent (SCGD), but the existing theoretical analysis exclusively assumes strong convexity of the objective, which excludes several important examples such as Lasso, logistic regression, principal component analysis and deep neural nets. In contrast, we prove non-asymptotic incremental first-order oracle (IFO) complexity bounds for SCVRG and its novel variants for nonsmooth convex composition optimization and show that they are provably faster than SCGD and gradient descent. More specifically, our method achieves a total IFO complexity of O((m+n)log(1/ϵ) + 1/ϵ^3), which improves on the O(1/ϵ^3.5) of SCGD and the O((m+n)/√ϵ) of accelerated gradient descent, respectively. Experiments on the sparse mean-variance optimization problem demonstrate that our method outperforms the competing methods.
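To make the problem class concrete, below is a minimal sketch of an SVRG-style proximal step for min_x f(g(x)) + r(x), where g(x) = (1/n)Σ_j g_j(x) and f(y) = (1/m)Σ_i f_i(y) are the two finite sums and r is a nonsmooth regularizer such as the Lasso ℓ1 penalty. The function handles (g_funcs, g_jacs, f_grads), the step size lr, and the epoch structure are illustrative assumptions, not the paper's exact SCVRG variant.

```python
import numpy as np

def prox_l1(z, step, lam=0.1):
    """Soft-thresholding: proximal operator of lam * ||x||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

def scvrg_sketch(g_funcs, g_jacs, f_grads, prox, x0,
                 epochs=10, inner_iters=100, lr=0.01):
    """Illustrative SVRG-style compositional method for
    min_x f(g(x)) + r(x), g(x) = (1/n) sum_j g_j(x),
    f(y) = (1/m) sum_i f_i(y). `g_funcs[j]`/`g_jacs[j]` return
    g_j(x) and its Jacobian; `f_grads[i]` returns the gradient
    of f_i. A sketch only, not the authors' exact algorithm."""
    n, m = len(g_funcs), len(f_grads)
    x = x0.copy()
    for _ in range(epochs):
        x_tilde = x.copy()
        # Snapshot pass: full inner value, full Jacobian, full gradient.
        g_tilde = np.mean([g(x_tilde) for g in g_funcs], axis=0)
        jac_tilde = np.mean([J(x_tilde) for J in g_jacs], axis=0)
        full_grad = jac_tilde.T @ np.mean(
            [df(g_tilde) for df in f_grads], axis=0)
        for _ in range(inner_iters):
            j, i = np.random.randint(n), np.random.randint(m)
            # Control-variate estimates of g(x) and its Jacobian.
            g_hat = g_funcs[j](x) - g_funcs[j](x_tilde) + g_tilde
            jac_hat = g_jacs[j](x) - g_jacs[j](x_tilde) + jac_tilde
            # Chain-rule gradient estimate with variance reduction.
            v = (jac_hat.T @ f_grads[i](g_hat)
                 - jac_tilde.T @ f_grads[i](g_tilde) + full_grad)
            # Proximal step handles the nonsmooth term (e.g. Lasso).
            x = prox(x - lr * v, lr)
    return x
```

The snapshot quantities act as control variates: each inner iteration costs O(1) IFO calls, while the per-epoch full pass costs O(m+n), which is where the (m+n) term in the complexity bound originates.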
