
Subgradient methods near active manifolds: saddle point avoidance, local convergence, and asymptotic normality
Nonsmooth optimization problems arising in practice tend to exhibit bene...

Stochastic optimization under time drift: iterate averaging, step decay, and high probability guarantees
We consider the problem of minimizing a convex function that is evolving...

Escaping strict saddle points of the Moreau envelope in nonsmooth optimization
Recent work has shown that stochastically perturbed gradient methods can...

Active strict saddles in nonsmooth optimization
We introduce a geometrically transparent strict saddle property for nons...

Robust stochastic optimization with the proximal point method
Standard results in stochastic convex optimization bound the number of s...

Stochastic algorithms with geometric step decay converge linearly on sharp functions
Stochastic (sub)gradient methods require step size schedule tuning to pe...

Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
The task of recovering a low-rank matrix from its noisy linear measureme...

Composite optimization for robust blind deconvolution
The blind deconvolution problem seeks to recover a pair of vectors from ...

Uniform Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
We investigate the stochastic optimization problem of minimizing populat...

Stochastic model-based minimization under high-order growth
Given a nonsmooth, nonconvex minimization problem, we consider algorithm...

Stochastic subgradient method converges on tame functions
This work considers the question: what convergence guarantees does the s...

Stochastic model-based minimization of weakly convex functions
We consider an algorithm that successively samples and minimizes stochas...

Stochastic subgradient method converges at the rate O(k^{-1/4}) on weakly convex functions
We prove that the projected stochastic subgradient method, applied to a ...

Catalyst Acceleration for Gradient-Based Non-Convex Optimization
We introduce a generic scheme to solve non-convex optimization problems u...

Variable projection without smoothness
The variable projection technique solves structured optimization problem...
Dmitriy Drusvyatskiy