
Uniform Generalization Bound on Time and Inverse Temperature for Gradient Descent Algorithm and its Application to Analysis of Simulated Annealing

05/25/2022
by Keisuke Suzuki, et al. (NEC)

In this paper, we propose a novel generalization bound for stochastic gradient Langevin dynamics (SGLD) in a non-convex setting that is uniform in both the time and the inverse temperature. While previous works derive their generalization bounds via uniform stability, we use Rademacher complexity to make our bound independent of the time and the inverse temperature. Rademacher complexity lets us reduce the problem of deriving a generalization bound on the whole space to deriving one on a bounded region, which removes the effect of the time and the inverse temperature from the bound. As an application of our generalization bound, we also evaluate the effectiveness of simulated annealing in a non-convex setting. For sample size n and time s, we derive bounds of order √(n^{-1} log(n+1)) and |log^{∘4}(s)|^{-1}, respectively, where log^{∘4} denotes the four-fold composition of the logarithm, log∘log∘log∘log.
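For intuition about the objects in the abstract, the following is a minimal sketch of SGLD with a slowly increasing inverse temperature, i.e., the kind of annealed Langevin dynamics the bound concerns. It is not the paper's construction: the quartic test objective, the step size eta, and the logarithmic schedule for beta are illustrative assumptions only.

import numpy as np

def sgld_annealed(grad, x0, n_steps=10_000, eta=1e-3, beta0=1.0, rng=None):
    """SGLD with inverse temperature beta(s) growing slowly in the time s.

    Update: x <- x - eta * grad(x) + sqrt(2 * eta / beta(s)) * N(0, I).
    Simulated annealing corresponds to letting beta(s) -> infinity; the
    log schedule below is an illustrative choice, not the paper's.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for s in range(n_steps):
        beta = beta0 * np.log(s + np.e)       # slowly increasing inverse temperature
        noise = rng.standard_normal(x.shape)  # isotropic Gaussian noise
        x = x - eta * grad(x) + np.sqrt(2.0 * eta / beta) * noise
    return x

# Example: a simple non-convex objective f(x) = (x^2 - 1)^2 with two global minima.
grad = lambda x: 4.0 * x * (x**2 - 1.0)
x_final = sgld_annealed(grad, x0=np.array([3.0]))

As beta(s) grows, the injected noise shrinks and the dynamics behave more like plain gradient descent; the trade-off between this cooling and the time s is what the |log^{∘4}(s)|^{-1} rate quantifies.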


Related research

07/19/2017 · Generalization Bounds of SGLD for Non-convex Learning: Two Theoretical Viewpoints
Algorithm-dependent generalization error bounds are central to statistic...

11/25/2021 · Time-independent Generalization Bounds for SGLD in Non-convex Settings
We establish generalization error bounds for stochastic gradient Langevi...

05/24/2022 · Weak Convergence of Approximate Reflection Coupling and its Application to Non-convex Optimization
In this paper, we propose a weak approximation of the reflection couplin...

06/08/2019 · Robust Bi-Tempered Logistic Loss Based on Bregman Divergences
We introduce a temperature into the exponential function and replace the...

02/22/2014 · Scaling Nonparametric Bayesian Inference via Subsample-Annealing
We describe an adaptation of the simulated annealing algorithm to nonpar...

09/19/2022 · Generalization Bounds for Stochastic Gradient Descent via Localized ε-Covers
In this paper, we propose a new covering technique localized for the tra...

11/07/2017 · Convex Optimization with Nonconvex Oracles
In machine learning and optimization, one often wants to minimize a conv...