Katyusha Acceleration for Convex Finite-Sum Compositional Optimization

10/24/2019
by Yibo Xu, et al.

Structured problems arise in many applications, and solving them efficiently requires exploiting their structure. This paper focuses on convex problems with a two-level finite-sum compositional structure. Finite-sum problems appear as the sample average approximation of a stochastic optimization problem and also arise in machine learning with large training sets. A popular numerical approach for finite-sum problems is the stochastic gradient method (SGM). However, the additional compositional structure prevents easy access to an unbiased stochastic approximation of the gradient, so directly applying the SGM to a finite-sum compositional optimization problem (COP) is often inefficient. We design new algorithms for solving strongly-convex and convex two-level finite-sum COPs. Our design incorporates the Katyusha acceleration technique and adopts mini-batch sampling from both the outer-level and inner-level finite sums. We first analyze the algorithm for strongly-convex finite-sum COPs. As in a few existing works, we obtain a linear convergence rate in terms of the expected objective error, and from this rate we establish complexity results for producing an ε-solution. Our complexity results have the same dependence on the number of component functions as existing works, but, owing to the Katyusha acceleration, they have a better dependence on the condition number κ, improving the best-known κ^3 to κ^2.5. Finally, we analyze the algorithm for convex finite-sum COPs, which uses the algorithm for strongly-convex finite-sum COPs as a subroutine. Again, we obtain better complexity results than existing works in terms of the dependence on ε, improving the best-known ε^-3 to ε^-2.5.
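To make the bias issue in the abstract concrete, below is a minimal numerical sketch of a two-level finite-sum COP, F(x) = (1/n) Σ_i f_i((1/m) Σ_j g_j(x)). This is not the paper's algorithm; the specific choices g_j(x) = A_j x and f_i(y) = log(1 + exp(c_i·y)), and all data, are hypothetical and chosen only to show why sampling the inner sum makes the plain stochastic gradient biased.

```python
# Minimal sketch (assumed data and component functions, not from the paper):
# two-level finite-sum compositional problem
#     F(x) = (1/n) * sum_i f_i( (1/m) * sum_j g_j(x) ),
# with hypothetical choices g_j(x) = A_j @ x and f_i(y) = log(1 + exp(c_i @ y)).
import numpy as np

rng = np.random.default_rng(0)
n, m, d, p = 5, 7, 4, 3                 # outer terms, inner terms, dimensions

A = rng.normal(size=(m, p, d))          # inner linear maps g_j(x) = A_j @ x
c = rng.normal(size=(n, p))             # outer data defining f_i

def sig(t):
    return 1.0 / (1.0 + np.exp(-t))     # sigmoid = derivative of log(1 + e^t)

def grad_F(x):
    """Exact gradient via the chain rule: (Jg(x))^T * (1/n) sum_i grad f_i(g(x))."""
    G = A.mean(axis=0)                  # Jacobian of the inner average g
    y = G @ x                           # inner value g(x)
    outer = np.mean([sig(ci @ y) * ci for ci in c], axis=0)
    return G.T @ outer

def naive_stoch_grad(x):
    """Plain SGM estimator: sample i, j, j' and plug a one-sample estimate of
    g(x) into grad f_i.  Its expectation differs from grad_F(x) because
    grad f_i is nonlinear in its argument."""
    i, j, jp = rng.integers(n), rng.integers(m), rng.integers(m)
    y_hat = A[jp] @ x                   # unbiased for g(x), but random
    return A[j].T @ (sig(c[i] @ y_hat) * c[i])

x = rng.normal(size=d)
avg = np.mean([naive_stoch_grad(x) for _ in range(200_000)], axis=0)
print("bias ~", np.linalg.norm(avg - grad_F(x)))  # stays away from 0 as samples grow
```

Per the abstract, the paper's remedy is to control this bias with mini-batch sampling from both the outer-level and inner-level finite sums, combined with Katyusha-style acceleration; the sketch above only demonstrates the problem that design addresses.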


