Nearly Optimal Robust Method for Convex Compositional Problems with Heavy-Tailed Noise

06/17/2020
by   Yan Yan, et al.

In this paper, we propose robust stochastic algorithms for solving convex compositional problems of the form f(E_ξ[g(·; ξ)]) + r(·) by establishing sub-Gaussian confidence bounds under weak assumptions about the tails of the noise distribution, i.e., heavy-tailed noise with bounded second-order moments. One could achieve this goal with an existing boosting strategy that turns a low-probability convergence result into a high-probability one. However, piecing together existing results for compositional problems in this way suffers from several drawbacks: (i) the boosting technique requires strong convexity of the objective; (ii) it requires a separate algorithm to handle a non-smooth r; and (iii) it incurs an additional polylogarithmic factor of the condition number. To address these issues, we directly develop a single-trial stochastic algorithm for minimizing strongly convex compositional objectives, which has a nearly optimal high-probability convergence guarantee matching the lower bound of stochastic strongly convex optimization up to a logarithmic factor. To the best of our knowledge, this is the first work to establish nearly optimal sub-Gaussian confidence bounds for compositional problems under heavy-tailed assumptions.
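To make the problem template concrete, the following sketch illustrates a compositional objective of the form f(E_ξ[g(·; ξ)]) + r(·), minimized with a stochastic compositional gradient method in the style of SCGD (Wang et al.), which tracks a running average of the noisy inner function and applies the proximal operator of r. This is an illustrative toy instance with an additive Student-t (heavy-tailed) noise model, not the algorithm proposed in the paper; the functions g, f, and r below, and all step-size choices, are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative, not from the paper):
#   minimize f(E[g(x; xi)]) + r(x)
# with g(x; xi) = A x - b + noise (noisy affine inner map),
# f(u) = 0.5 * ||u||^2 (smooth outer function),
# r(x) = lam * ||x||_1 (non-smooth regularizer, handled via its prox).
d, m, lam = 5, 3, 0.01
A = rng.normal(size=(m, d))
b = rng.normal(size=m)

def g_sample(x):
    # Stochastic inner function with heavy-tailed (Student-t) additive noise;
    # its second moment is bounded since df > 2.
    return A @ x - b + 0.1 * rng.standard_t(df=3, size=m)

def grad_f(u):
    return u  # gradient of f(u) = 0.5 * ||u||^2

def prox_l1(x, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

x = np.zeros(d)
y = g_sample(x)  # running estimate of the inner value E[g(x; xi)]
for k in range(1, 2001):
    eta, beta = 0.5 / k, 2.0 / (k + 1)        # decaying step sizes
    y = (1 - beta) * y + beta * g_sample(x)   # update inner-value estimate
    grad = A.T @ grad_f(y)                    # chain-rule gradient estimate
    x = prox_l1(x - eta * grad, eta * lam)    # proximal step for r

print(np.round(x, 3))
```

The running average `y` is what distinguishes compositional problems from ordinary stochastic optimization: a single sample of g inside f would give a biased gradient estimate, since f is nonlinear.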


