Noisy Linear Convergence of Stochastic Gradient Descent for CV@R Statistical Learning under Polyak-Łojasiewicz Conditions

12/14/2020
by   Dionysios S. Kalogerias, et al.

Conditional Value-at-Risk (CV@R) is one of the most popular measures of risk, and it has recently been considered as a performance criterion in supervised statistical learning, as it is related to desirable operational features in modern applications, such as safety, fairness, distributional robustness, and prediction error stability. However, due to its variational definition, CV@R is commonly believed to result in difficult optimization problems, even for smooth and strongly convex loss functions. We disprove this statement by establishing noisy (i.e., fixed-accuracy) linear convergence of stochastic gradient descent for sequential CV@R learning, for a large class of not necessarily strongly convex (or even convex) loss functions satisfying a set-restricted Polyak-Łojasiewicz inequality. This class contains all smooth and strongly convex losses, confirming that classical problems, such as linear least squares regression, can be solved efficiently under the CV@R criterion, just as efficiently as their risk-neutral versions. Our results are illustrated numerically on such a risk-aware ridge regression task, also verifying their validity in practice.
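To make the variational definition concrete, the sketch below runs plain SGD jointly on the model parameters and the auxiliary CV@R level t, using the standard Rockafellar-Uryasev form of CV@R applied to a ridge regression loss. All names, step sizes, and hyperparameters here are illustrative assumptions; this is a minimal sketch of the general idea, not the paper's exact recursion or step-size schedule.

```python
import numpy as np

def cvar_sgd_ridge(X, y, alpha=0.5, lam=0.001, lr=0.01, epochs=100, seed=0):
    """Illustrative sketch (not the paper's exact algorithm): joint SGD on
    (theta, t) for the CV@R ridge objective
        min_{theta, t}  t + (1/alpha) * E[(loss_i(theta) - t)_+] + lam * ||theta||^2,
    i.e., the variational (Rockafellar-Uryasev) definition of CV@R at level alpha."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta, t = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            resid = X[i] @ theta - y[i]
            loss = resid ** 2                  # per-sample squared loss
            excess = float(loss > t)           # subgradient of (loss - t)_+
            # A sample contributes a parameter gradient, scaled by 1/alpha,
            # only when its loss exceeds the current level t.
            g_theta = (excess / alpha) * 2.0 * resid * X[i] + 2.0 * lam * theta
            g_t = 1.0 - excess / alpha         # gradient w.r.t. the level t
            theta -= lr * g_theta
            t -= lr * g_t
    return theta, t
```

At a stationary point, t approximates the (1 - alpha)-quantile (the V@R) of the loss distribution, while theta minimizes the average of the losses exceeding that level, which is exactly the CV@R criterion described in the abstract.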


Related research

- Statistical Learning with Conditional Value at Risk (02/14/2020). We propose a risk-averse statistical learning framework wherein the perf...
- Non-strongly-convex smooth stochastic approximation with convergence rate O(1/n) (06/10/2013). We consider the stochastic approximation problem where a convex function...
- Stochastic Compositional Gradient Descent: Algorithms for Minimizing Compositions of Expected-Value Functions (11/14/2014). Classical stochastic gradient methods are well suited for minimizing exp...
- Guaranteed Sufficient Decrease for Variance Reduced Stochastic Gradient Descent (03/20/2017). In this paper, we propose a novel sufficient decrease technique for vari...
- On the Stability and Convergence of Stochastic Gradient Descent with Momentum (09/12/2018). While momentum-based methods, in conjunction with the stochastic gradien...
- Biased Stochastic Gradient Descent for Conditional Stochastic Optimization (02/25/2020). Conditional Stochastic Optimization (CSO) covers a variety of applicatio...
- Guaranteed Sufficient Decrease for Stochastic Variance Reduced Gradient Optimization (02/26/2018). In this paper, we propose a novel sufficient decrease technique for stoc...
