Accelerating Nonconvex Learning via Replica Exchange Langevin Diffusion

07/04/2020
by   Yi Chen, et al.

Langevin diffusion is a powerful method for nonconvex optimization, which enables escape from local minima by injecting noise into the gradient. In particular, the temperature parameter controlling the noise level gives rise to a tradeoff between “global exploration” and “local exploitation”, corresponding to high and low temperatures, respectively. To attain the advantages of both regimes, we propose to use replica exchange, which swaps the states of two Langevin diffusions running at different temperatures. We theoretically analyze the acceleration effect of replica exchange from two perspectives: (i) convergence in χ^2-divergence, and (ii) the large deviation principle. Such an acceleration effect allows us to approach the global minima faster. Furthermore, by discretizing the replica exchange Langevin diffusion, we obtain a discrete-time algorithm. For such an algorithm, we quantify its discretization error in theory and demonstrate its acceleration effect in practice.
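For concreteness, a minimal sketch of the discrete-time algorithm is given below in Python with NumPy. This is an illustration under assumptions, not the paper's implementation: the function and parameter names (f, grad_f, tau_low, tau_high, step) are made up; each replica takes an Euler-Maruyama Langevin step at its own temperature, and the two replicas swap states with a Metropolis-style acceptance probability.

import numpy as np

def replica_exchange_langevin(f, grad_f, x_low, x_high, step=1e-3,
                              tau_low=0.01, tau_high=1.0, n_steps=10_000,
                              rng=None):
    # Sketch of discretized replica exchange Langevin diffusion (assumed form).
    # Two replicas run Langevin updates at a low and a high temperature, and a
    # Metropolis-style swap lets high-temperature exploration transfer to the
    # low-temperature replica so it can escape local minima faster.
    rng = np.random.default_rng() if rng is None else rng
    trajectory = []
    for _ in range(n_steps):
        # Euler-Maruyama Langevin update: gradient step plus temperature-scaled noise.
        x_low = (x_low - step * grad_f(x_low)
                 + np.sqrt(2.0 * step * tau_low) * rng.standard_normal(np.shape(x_low)))
        x_high = (x_high - step * grad_f(x_high)
                  + np.sqrt(2.0 * step * tau_high) * rng.standard_normal(np.shape(x_high)))
        # Swap acceptance: min(1, exp((1/tau_low - 1/tau_high) * (f(x_low) - f(x_high)))).
        log_s = (1.0 / tau_low - 1.0 / tau_high) * (f(x_low) - f(x_high))
        if rng.random() < np.exp(min(0.0, log_s)):
            x_low, x_high = x_high, x_low
        trajectory.append(x_low)
    return np.asarray(trajectory)

# Example usage on a double-well objective f(x) = (x^2 - 1)^2, whose global
# minima are at x = +1 and x = -1.
f = lambda x: (x**2 - 1.0)**2
grad_f = lambda x: 4.0 * x * (x**2 - 1.0)
path = replica_exchange_langevin(f, grad_f, x_low=np.array(2.0), x_high=np.array(-2.0))

The low-temperature replica does most of the exploitation, while the high-temperature replica explores; the swap step is what produces the acceleration effect analyzed in the paper.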

Related research

10/21/2019 · On Distributed Stochastic Gradient Algorithms for Global Optimization
The paper considers the problem of network-based computation of global m...

09/29/2022 · On Quantum Speedups for Nonconvex Optimization via Quantum Tunneling Walks
Classical algorithms are often not effective for solving nonconvex optim...

08/29/2018 · Online ICA: Understanding Global Dynamics of Nonconvex Optimization via Diffusion Processes
Solving statistical learning problems often involves nonconvex optimizat...

10/02/2020 · Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction
Replica exchange stochastic gradient Langevin dynamics (reSGLD) has show...

06/29/2020 · Spectral Gap of Replica Exchange Langevin Diffusion on Mixture Distributions
Langevin diffusion (LD) is one of the main workhorses for sampling probl...

02/11/2021 · A Continuized View on Nesterov Acceleration
We introduce the "continuized" Nesterov acceleration, a close variant of...

03/16/2020 · Tuning Ranking in Co-occurrence Networks with General Biased Exchange-based Diffusion on Hyper-bag-graphs
Co-occurence networks can be adequately modeled by hyper-bag-graphs (hb-...
