Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses

06/12/2020
by Raef Bassily, et al.

Uniform stability is a notion of algorithmic stability that bounds the worst-case change in the model output by the algorithm when a single data point in the dataset is replaced. An influential work of Hardt et al. (2016) provides strong upper bounds on the uniform stability of the stochastic gradient descent (SGD) algorithm on sufficiently smooth convex losses. These results led to important progress in understanding the generalization properties of SGD and to several applications to differentially private convex optimization for smooth losses. Our work is the first to address uniform stability of SGD on nonsmooth convex losses. Specifically, we provide sharp upper and lower bounds for several forms of SGD and full-batch GD on arbitrary Lipschitz nonsmooth convex losses. Our lower bounds show that, in the nonsmooth case, (S)GD can be inherently less stable than in the smooth case. On the other hand, our upper bounds show that (S)GD is sufficiently stable for deriving new and useful bounds on generalization error. Most notably, we obtain the first dimension-independent generalization bounds for multi-pass SGD in the nonsmooth case. In addition, our bounds allow us to derive a new algorithm for differentially private nonsmooth stochastic convex optimization with optimal excess population risk. Our algorithm is simpler and more efficient than the best known algorithm for the nonsmooth case (Feldman et al., 2020).
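To make the stability notion concrete, the following is a minimal sketch (not the paper's construction) of how one might empirically probe uniform argument stability: run SGD with a shared random index sequence on two "neighboring" datasets that differ in a single point, using a nonsmooth Lipschitz loss such as the absolute loss, and measure how far the two iterates drift apart. All names, the toy data, and the step-size choice here are illustrative assumptions.

```python
import random

def subgradient_abs_loss(w, x, y):
    """A subgradient of the nonsmooth loss |w*x - y| with respect to w (1-D)."""
    r = w * x - y
    if r > 0:
        return x
    elif r < 0:
        return -x
    return 0.0  # at the kink, any element of [-|x|, |x|] is a valid subgradient

def run_sgd(data, steps, lr, seed):
    """Run SGD from w = 0 with a fixed sampling seed, so two runs on
    neighboring datasets follow the same index sequence."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x, y = data[rng.randrange(len(data))]
        w -= lr * subgradient_abs_loss(w, x, y)
    return w

# Neighboring datasets: identical except for one replaced example.
data = [(1.0, 1.0), (2.0, 0.5), (-1.0, 0.3), (0.5, -1.0)]
data_prime = list(data)
data_prime[0] = (1.0, -1.0)  # replace a single data point

w1 = run_sgd(data, steps=1000, lr=0.01, seed=0)
w2 = run_sgd(data_prime, steps=1000, lr=0.01, seed=0)
divergence = abs(w1 - w2)  # empirical proxy for argument stability
```

Such an experiment only witnesses instability on one pair of datasets; the paper's lower bounds require worst-case constructions, and its upper bounds control the supremum of this divergence over all neighboring datasets and loss functions in the class.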
