Algorithms of Robust Stochastic Optimization Based on Mirror Descent Method

07/05/2019
by Anatoli Juditsky, et al.

We propose an approach to the construction of robust non-Euclidean iterative algorithms for convex composite stochastic optimization, based on truncation of the stochastic gradients. For such algorithms, we establish sub-Gaussian confidence bounds under weak assumptions on the tails of the noise distribution, in both the convex and the strongly convex settings. We also propose robust estimates of the accuracy of general stochastic algorithms.
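The core idea the abstract describes, truncating (clipping) stochastic gradients so that heavy-tailed noise cannot derail the iterates, can be illustrated with a minimal sketch. The Euclidean setup, the function names, and all parameter choices below are illustrative assumptions, not the paper's exact mirror-descent scheme:

```python
import numpy as np

def clipped_sgd(grad_oracle, x0, steps, lr, clip_level):
    """Euclidean stochastic gradient descent with gradient truncation.

    Illustrative sketch only: each stochastic gradient whose norm exceeds
    `clip_level` is rescaled onto the ball of that radius, which bounds the
    influence of heavy-tailed noise on any single step.  With a non-Euclidean
    prox-function this step would become a mirror-descent update.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = np.asarray(grad_oracle(x), dtype=float)
        norm = np.linalg.norm(g)
        if norm > clip_level:
            g = g * (clip_level / norm)  # truncate the stochastic gradient
        x = x - lr * g                   # plain (Euclidean) gradient step
    return x

# Hypothetical usage: minimize f(x) = ||x||^2 / 2 under heavy-tailed
# (Student-t, 2 degrees of freedom) gradient noise.
rng = np.random.default_rng(0)
oracle = lambda x: x + rng.standard_t(df=2, size=x.shape)
x_final = clipped_sgd(oracle, x0=np.ones(5), steps=2000, lr=0.01, clip_level=5.0)
```

Without truncation, occasional huge noise realizations of the t-distributed oracle would produce steps of arbitrary size; with it, every step is bounded by `lr * clip_level`, which is what makes sub-Gaussian confidence bounds possible under weak tail assumptions.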


Related research:

- Robust stochastic optimization with the proximal point method (07/31/2019): Standard results in stochastic convex optimization bound the number of s...
- Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping (05/21/2020): In this paper, we propose a new accelerated stochastic first-order metho...
- A Generic Acceleration Framework for Stochastic Composite Optimization (06/03/2019): In this paper, we introduce various mechanisms to obtain accelerated fir...
- Follow the Signs for Robust Stochastic Optimization (05/22/2017): Stochastic noise on gradients is now a common feature in machine learnin...
- Stochastic Optimization for Performative Prediction (06/12/2020): In performative prediction, the choice of a model influences the distrib...
- Emergent Jaw Predominance in Vocal Development through Stochastic Optimization (10/08/2020): Infant vocal babbling strongly relies on jaw oscillations, especially at...
- Robust Optimization for Tree-Structured Stochastic Network Design (12/01/2016): Stochastic network design is a general framework for optimizing network ...
