
Weak Convergence of Approximate Reflection Coupling and its Application to Non-convex Optimization

by Keisuke Suzuki, et al.

In this paper, we propose a weak approximation of the reflection coupling (RC) for stochastic differential equations (SDEs), and prove that it converges weakly to the desired coupling. In contrast to the RC, the proposed approximate reflection coupling (ARC) does not need to track the hitting time of the coupled processes to the diagonal set, and can be defined as the solution of an SDE on the whole time interval. Consequently, the ARC also works effectively for SDEs with different drift terms. As an application of the ARC, we derive an evaluation of the effectiveness of stochastic gradient descent in a non-convex setting. For sample size n, step size η, and batch size B, we obtain bounds that are uniform in time, of orders n^{-1}, η^{1/2}, and √((n − B)/(B(n − 1))), respectively.
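To fix intuition for the coupling the abstract builds on, the following is a minimal sketch (not the paper's ARC construction) of classical reflection coupling for two one-dimensional diffusions under Euler–Maruyama discretization: the second process is driven by the negated Brownian increment until the pair meets, after which the processes move together. The function name, drift, and parameters are illustrative assumptions, not from the paper.

```python
import numpy as np

def reflection_coupling_1d(b, x0, y0, eta=1e-3, n_steps=10_000, seed=0):
    """Simulate two 1-d diffusions dZ = b(Z) dt + dW under reflection
    coupling: Y receives the reflected (negated) Brownian increment
    until the pair crosses, which approximates the hitting time of the
    diagonal; afterwards the two processes are run as one."""
    rng = np.random.default_rng(seed)
    x, y = x0, y0
    coupled = False
    for _ in range(n_steps):
        dw = np.sqrt(eta) * rng.standard_normal()
        x_new = x + eta * b(x) + dw
        if coupled:
            y_new = x_new  # after coalescence the copies coincide
        else:
            y_new = y + eta * b(y) - dw  # reflected increment
            # declare coalescence once the ordering flips
            if (x - y) * (x_new - y_new) <= 0.0:
                y_new = x_new
                coupled = True
        x, y = x_new, y_new
    return x, y, coupled

# Example: a contracting drift toward the origin couples the copies quickly.
x, y, met = reflection_coupling_1d(lambda z: -z, 2.0, -2.0)
```

Note the step the ARC of the paper is designed to avoid: the explicit crossing check that stands in for the hitting time of the diagonal set, which is what makes the classical construction awkward when the two processes have different drifts.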




Related research:
- Geometric ergodicity of SGLD via reflection coupling
- Convergence of constant step stochastic gradient descent for non-smooth non-convex functions
- Universal Stagewise Learning for Non-Convex Problems with Convergence on Averaged Solutions
- Stochastic Non-convex Ordinal Embedding with Stabilized Barzilai-Borwein Step Size
- Linear Convergence of Adaptive Stochastic Gradient Descent