ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization

02/15/2019
by Nhan H. Pham, et al.

In this paper, we propose a new stochastic algorithmic framework for solving stochastic composite nonconvex optimization problems that covers both finite-sum and expectation settings. Our algorithms rely on the SARAH estimator introduced in (Nguyen et al., 2017a) and consist of two steps: a proximal gradient step and an averaging step, which differ from existing nonconvex proximal-type algorithms. The algorithms require only a smoothness assumption on the nonconvex objective term. In the finite-sum case, we show that our algorithm achieves the optimal convergence rate by matching the lower-bound worst-case complexity, while in the expectation case it attains the best-known convergence rate under only standard smoothness and bounded-variance assumptions. One key ingredient of our algorithms is a new constant step-size that helps achieve the desired convergence rate. Our step-size is much larger than that of existing methods, including proximal SVRG schemes, in the single-sample case. We generalize our algorithm to mini-batches for both the inner and outer loops, and to adaptive step-sizes. We also specialize the algorithm to the non-composite case, where it covers and dominates existing state-of-the-art methods in terms of convergence rate. We test the proposed algorithms on two composite nonconvex optimization problems and on feedforward neural networks using several well-known datasets.
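To make the two-step structure concrete, below is a minimal Python sketch of one outer iteration of a ProxSARAH-style scheme in the finite-sum, single-sample setting: a SARAH gradient estimate, followed by a proximal gradient step and an averaging step. The choice of an l1 regularizer (with its closed-form soft-thresholding prox), constant step sizes eta and gamma, and the helper names grad_full, grad_i, and soft_threshold are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (illustrative choice of the composite term)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_sarah_epoch(x0, grad_full, grad_i, n, m, eta, gamma, lam, rng):
    """One outer iteration of a ProxSARAH-style scheme (sketch).

    x0         -- current iterate (numpy array)
    grad_full  -- callable returning the full gradient at a point
    grad_i     -- callable grad_i(i, x) returning the i-th component gradient
    n, m       -- number of components and inner-loop length
    eta, gamma -- proximal step size and averaging weight (constants here)
    lam        -- weight of the l1 regularizer used for illustration
    """
    x_prev = x0.copy()
    v = grad_full(x_prev)                       # snapshot gradient at the start of the epoch
    x_hat = soft_threshold(x_prev - eta * v, eta * lam)   # proximal gradient step
    x = (1.0 - gamma) * x_prev + gamma * x_hat            # averaging step
    for _ in range(m):
        i = rng.integers(n)                     # single-sample SARAH update
        v = grad_i(i, x) - grad_i(i, x_prev) + v
        x_prev = x
        x_hat = soft_threshold(x - eta * v, eta * lam)    # proximal gradient step
        x = (1.0 - gamma) * x + gamma * x_hat             # averaging step
    return x
```

A driver would call prox_sarah_epoch repeatedly, refreshing the snapshot gradient at each outer iteration; in the expectation setting the snapshot would instead be a large mini-batch estimate.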

Related research:

07/08/2019 - A Hybrid Stochastic Optimization Framework for Stochastic Composite Nonconvex Optimization
In this paper, we introduce a new approach to develop stochastic optimiz...

10/25/2018 - SpiderBoost: A Class of Faster Variance-reduced Algorithms for Nonconvex Optimization
There has been extensive research on developing stochastic variance redu...

02/13/2020 - Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Adaptivity is an important yet under-studied property in modern optimiza...

01/29/2019 - Stochastic Conditional Gradient Method for Composite Convex Minimization
In this paper, we propose the first practical algorithm to minimize stoc...

12/19/2022 - Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization
Many real-world problems not only have complicated nonconvex functional ...

10/18/2012 - Optimal Computational Trade-Off of Inexact Proximal Methods
In this paper, we investigate the trade-off between convergence rate and...

07/17/2022 - SPIRAL: A Superlinearly Convergent Incremental Proximal Algorithm for Nonconvex Finite Sum Minimization
We introduce SPIRAL, a SuPerlinearly convergent Incremental pRoximal ALg...
