
Uniform Generalization Bound on Time and Inverse Temperature for Gradient Descent Algorithm and its Application to Analysis of Simulated Annealing

by   Keisuke Suzuki, et al.
NEC Global

In this paper, we propose a novel generalization bound for stochastic gradient Langevin dynamics (SGLD) in a non-convex setting that is uniform in the time and the inverse temperature. While previous works derive their generalization bounds via uniform stability, we use Rademacher complexity, which makes our bound independent of the time and the inverse temperature. Using Rademacher complexity, we can reduce the problem of deriving a generalization bound on the whole space to that of deriving one on a bounded region, and can therefore remove the effect of the time and the inverse temperature from our bound. As an application of our generalization bound, we also evaluate the effectiveness of simulated annealing in a non-convex setting. For the sample size n and the time s, we derive evaluations of orders √(n^-1 log(n+1)) and |(log)^4(s)|^-1, respectively, where (log)^4 denotes the four-fold composition of the logarithmic function.
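The algorithm analyzed above can be sketched in a few lines: SGLD performs a gradient step plus Gaussian noise whose scale shrinks as the inverse temperature grows, and simulated annealing corresponds to letting the inverse temperature increase over time. This is a minimal illustrative sketch, assuming a simple logarithmic schedule and a toy objective; the function names, step size, and schedule are assumptions for illustration, not the paper's exact choices.

```python
import numpy as np

def sgld_annealed(grad, theta0, n_steps=5000, eta=1e-3, seed=0):
    """SGLD with a slowly increasing inverse temperature (simulated
    annealing). The logarithmic schedule below is illustrative only."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for t in range(1, n_steps + 1):
        # inverse temperature grows logarithmically, so the injected
        # noise sqrt(2*eta/beta) shrinks slowly over time
        beta = np.log(t + np.e)
        noise = rng.standard_normal(theta.shape)
        theta = theta - eta * grad(theta) + np.sqrt(2.0 * eta / beta) * noise
    return theta

# toy non-convex objective: f(x) = (x^2 - 1)^2, with gradient 4x(x^2 - 1);
# its global minima are at x = +1 and x = -1
grad = lambda x: 4.0 * x * (x * x - 1.0)
x_final = sgld_annealed(grad, theta0=[3.0])
```

With a small step size and this schedule, the iterate settles near one of the two global minima while the residual noise keeps it fluctuating around it.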
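To see why the annealing evaluation of order |(log)^4(s)|^-1 decays so slowly in the time s, it helps to compute the four-fold composed logarithm directly; this small sketch (the helper name is ours) shows how tiny (log)^4(s) remains even for very large s.

```python
import math

def iterated_log(x, k=4):
    """k-fold composition of the natural logarithm, log(log(...log(x)...)).
    Each application must keep the argument inside the domain of log."""
    for _ in range(k):
        x = math.log(x)
    return x

# even at s = 10^9, (log)^4(s) is about 0.1, so the rate
# |(log)^4(s)|^-1 is still around 10 -- an extremely slow decay
val = iterated_log(10 ** 9)
```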




Generalization Bounds of SGLD for Non-convex Learning: Two Theoretical Viewpoints

Algorithm-dependent generalization error bounds are central to statistic...

Time-independent Generalization Bounds for SGLD in Non-convex Settings

We establish generalization error bounds for stochastic gradient Langevi...

Weak Convergence of Approximate Reflection Coupling and its Application to Non-convex Optimization

In this paper, we propose a weak approximation of the reflection couplin...

Robust Bi-Tempered Logistic Loss Based on Bregman Divergences

We introduce a temperature into the exponential function and replace the...

Scaling Nonparametric Bayesian Inference via Subsample-Annealing

We describe an adaptation of the simulated annealing algorithm to nonpar...

Generalization Bounds for Stochastic Gradient Descent via Localized ε-Covers

In this paper, we propose a new covering technique localized for the tra...

Convex Optimization with Nonconvex Oracles

In machine learning and optimization, one often wants to minimize a conv...