Nearly Optimal Robust Method for Convex Compositional Problems with Heavy-Tailed Noise

06/17/2020 ∙ by Yan Yan, et al.

In this paper, we propose robust stochastic algorithms for solving convex compositional problems of the form f(𝔼_ξ[g(·; ξ)]) + r(·) by establishing sub-Gaussian confidence bounds under weak assumptions on the tails of the noise distribution, i.e., heavy-tailed noise with bounded second-order moments. One could achieve this goal with an existing boosting strategy that amplifies a low-probability convergence result into a high-probability one. However, piecing together existing results for compositional problems in this way suffers from several drawbacks: (i) the boosting technique requires strong convexity of the objective; (ii) it requires a separate algorithm to handle a non-smooth r; and (iii) it incurs an additional polylogarithmic factor of the condition number. To address these issues, we directly develop a single-trial stochastic algorithm for minimizing strongly convex compositional objectives, which enjoys a nearly optimal high-probability convergence guarantee matching the lower bound of stochastic strongly convex optimization up to a logarithmic factor. To the best of our knowledge, this is the first work to establish nearly optimal sub-Gaussian confidence bounds for compositional problems under heavy-tailed noise assumptions.
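To make the compositional structure f(𝔼_ξ[g(·; ξ)]) concrete, here is a minimal sketch of a classical two-timescale stochastic compositional gradient (SCGD-style) update on a toy problem. This is not the paper's algorithm; the toy objective, step-size schedule, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Toy compositional problem: minimize f(E_xi[g(x; xi)]) with
#   g(x; xi) = x + xi,  xi ~ N(0, I)  =>  E_xi[g(x; xi)] = x
#   f(y) = 0.5 * ||y||^2 and r = 0, so the minimizer is x* = 0.
def g(x, xi):           # noisy inner map (one stochastic sample)
    return x + xi

def grad_f(y):          # gradient of the outer function f(y) = 0.5*||y||^2
    return y

x = rng.normal(size=d)            # iterate
y = g(x, rng.normal(size=d))      # running estimate of the inner expectation

for t in range(1, 5001):
    eta, beta = 1.0 / t, 2.0 / (t + 1)     # diminishing step sizes (assumed)
    xi = rng.normal(size=d)
    # Track E_xi[g(x; xi)] with an exponential moving average.
    y = (1 - beta) * y + beta * g(x, xi)
    # Chain rule: the Jacobian of g w.r.t. x is the identity here,
    # so the compositional gradient estimate is just grad_f(y).
    x = x - eta * grad_f(y)

print(np.linalg.norm(x))   # should shrink toward 0
```

The key point the sketch illustrates is that a plain stochastic gradient of f(g(x; ξ)) would be biased, because 𝔼[∇f(g(x; ξ))] ≠ ∇f(𝔼[g(x; ξ)]) for nonlinear f; maintaining the auxiliary estimate y of the inner expectation is what makes the update usable.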
