
Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization
The problem of minimizing sum-of-nonconvex functions (i.e., convex functions that are averages of nonconvex ones) is becoming increasingly important in machine learning, and is the core machinery for PCA, SVD, regularized Newton's method, accelerated nonconvex optimization, and more. We show how to provably obtain an accelerated stochastic algorithm for minimizing sum-of-nonconvex functions, by adding one additional line to the well-known SVRG method. This line corresponds to momentum, and shows how to directly apply momentum to the finite-sum stochastic minimization of sum-of-nonconvex functions. As a side result, our method enjoys linear parallel speedup using mini-batch.
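To illustrate the idea, here is a minimal sketch of SVRG with one extra momentum (extrapolation) line between epochs, in the spirit of the abstract. This is an illustrative toy, not the paper's exact algorithm: the 1-D quadratic components, step size `eta`, and momentum weight `beta` are all made-up values chosen for demonstration. Each component may individually be nonconvex (negative curvature) while the average stays convex, which is the sum-of-nonconvex setting.

```python
import random

# Hedged sketch: SVRG on f(x) = (1/n) * sum_i f_i(x), where each
# f_i(x) = 0.5*a[i]*x**2 + b[i]*x is a 1-D quadratic. Some a[i] may
# be negative (nonconvex components) as long as the average is
# positive, so f itself is convex: the "sum-of-nonconvex" setting.
# The single momentum line between epochs mimics the "one additional
# line" described in the abstract; parameters are illustrative.

def svrg_with_momentum(a, b, x0=0.0, eta=0.1, beta=0.5,
                       epochs=100, seed=0):
    n = len(a)
    rng = random.Random(seed)
    grad_i = lambda i, x: a[i] * x + b[i]               # gradient of f_i
    full_grad = lambda x: sum(grad_i(i, x) for i in range(n)) / n

    x_prev = x_curr = x0
    for _ in range(epochs):
        # The "one additional line": momentum across epoch iterates.
        y = x_curr + beta * (x_curr - x_prev)
        snapshot, g_snap = y, full_grad(y)              # SVRG snapshot
        x = y
        for _ in range(n):                              # inner SVRG loop
            i = rng.randrange(n)
            # variance-reduced stochastic gradient estimate
            v = grad_i(i, x) - grad_i(i, snapshot) + g_snap
            x -= eta * v
        x_prev, x_curr = x_curr, x
    return x_curr

# Example: f_1 (with a = -1) is concave, but the average curvature
# mean(a) = 4/3 is positive, so f is convex with minimizer
# x* = -mean(b)/mean(a) = -0.625.
x_hat = svrg_with_momentum([3.0, -1.0, 2.0], [1.0, 2.0, -0.5])
```

The key property exploited here is that the variance-reduced gradient `v` is unbiased and its variance shrinks as the iterates approach the snapshot, which is what lets momentum be applied safely even though individual components are nonconvex.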