Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n)

03/22/2021
by Yegor Klochkov et al.

The sharpest known high-probability generalization bounds for uniformly stable algorithms (Feldman and Vondrák, 2018, 2019; Bousquet, Klochkov, and Zhivotovskiy, 2020) contain a generally inevitable sampling error term of order Θ(1/√n). When applied to excess risk bounds, this leads to suboptimal results in several standard stochastic convex optimization problems. We show that if the so-called Bernstein condition is satisfied, the Θ(1/√n) term can be avoided, and high-probability excess risk bounds of order up to O(1/n) are possible via uniform stability. Using this result, we prove a high-probability excess risk bound with rate O(log n/n) for strongly convex and Lipschitz losses, valid for any empirical risk minimization method. This resolves a question of Shalev-Shwartz, Shamir, Srebro, and Sridharan (2009). We also discuss how O(log n/n) high-probability excess risk bounds are possible for projected gradient descent in the case of strongly convex and Lipschitz losses, without the usual smoothness assumption.
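For context, one common formulation of the Bernstein condition mentioned in the abstract (the exponent-1 case; the paper may use a more general variant) relates the variance of the excess loss to the excess risk itself. Here ℓ is the loss, Z a sample, R the population risk, and f* its minimizer over the class F:

```latex
% Bernstein condition (a standard formulation; constant B > 0):
\mathbb{E}\left[\bigl(\ell(f, Z) - \ell(f^{*}, Z)\bigr)^{2}\right]
\;\le\; B \,\bigl(R(f) - R(f^{*})\bigr)
\qquad \text{for all } f \in \mathcal{F},
\quad \text{where } R(f) = \mathbb{E}\,\ell(f, Z).
```

Strongly convex losses satisfy this condition, which is why the abstract's O(log n/n) rates apply to that setting.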
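The projected gradient descent method discussed for strongly convex and Lipschitz losses can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the function name, the Euclidean-ball constraint set, and the classical step size η_t = 1/(μt) for a μ-strongly convex objective are assumptions for the example.

```python
import numpy as np

def projected_gradient_descent(grad, x0, radius, mu, n_steps):
    """Projected (sub)gradient descent for a mu-strongly convex objective
    that is Lipschitz on the Euclidean ball of the given radius.

    Uses the classical step size eta_t = 1 / (mu * t); no smoothness
    of the objective is assumed, only a (sub)gradient oracle `grad`.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(1, n_steps + 1):
        x = x - grad(x) / (mu * t)       # (sub)gradient step with eta_t = 1/(mu*t)
        norm = np.linalg.norm(x)
        if norm > radius:                # Euclidean projection back onto the ball
            x = x * (radius / norm)
    return x

# Usage: minimize the 1-strongly convex quadratic f(x) = 0.5 * ||x - c||^2
# over the unit ball; since ||c|| < 1, the constrained minimizer is c itself.
c = np.array([0.3, -0.2])
x_star = projected_gradient_descent(lambda x: x - c, np.zeros(2), 1.0, 1.0, 100)
```

The projection step is what keeps the iterates in the feasible set; for the Euclidean ball it is just a rescaling, which is why this constraint set is convenient for illustration.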


