On Distributed Stochastic Gradient Algorithms for Global Optimization

10/21/2019
by Brian Swenson, et al.

The paper considers the problem of network-based computation of global minima in smooth nonconvex optimization problems. It is known that distributed gradient-descent-type algorithms can converge to the set of global minima by injecting slowly decaying Gaussian noise in order to escape local minima. However, the technical assumptions under which this convergence is known to occur can be restrictive in practice. In particular, existing convergence results require the agents' local objective functions to satisfy a highly restrictive bounded-gradient-dissimilarity condition. The paper demonstrates convergence to the set of global minima while relaxing this key assumption.
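To make the algorithmic template referenced in the abstract concrete, below is a minimal sketch of distributed (consensus plus local gradient) descent with slowly decaying injected Gaussian noise. The function name distributed_annealed_gd, the mixing matrix W, and the specific step-size and noise schedules are illustrative assumptions for this sketch, not the paper's exact algorithm or its convergence conditions.

```python
import numpy as np

def distributed_annealed_gd(grads, W, x0, steps=10_000, step0=0.1, noise0=1.0):
    """Illustrative sketch: consensus step + local gradient steps + decaying Gaussian noise.

    grads -- list of per-agent gradient functions grad_i(x), one per node
    W     -- doubly stochastic mixing matrix (n x n) describing the network
    x0    -- (n, d) array of initial agent iterates

    The step-size and noise schedules below are placeholder choices; the
    convergence theory depends on carefully chosen, slowly decaying schedules.
    """
    x = x0.astype(float).copy()
    n, d = x.shape
    for t in range(1, steps + 1):
        alpha = step0 / t                          # decaying step size
        sigma = noise0 / np.sqrt(np.log(t + 2.0))  # slowly decaying noise scale
        x = W @ x                                  # network averaging (consensus) step
        for i in range(n):
            x[i] -= alpha * grads[i](x[i])             # local gradient step
            x[i] += alpha * sigma * np.random.randn(d) # injected Gaussian noise
    return x.mean(axis=0)


# Hypothetical two-agent example on a nonconvex (double-well) scalar objective.
if __name__ == "__main__":
    grads = [lambda z: 4 * z**3 - 4 * z + 0.5,   # agent 1: gradient of a tilted double-well
             lambda z: 4 * z**3 - 4 * z - 0.5]   # agent 2: gradient of the opposite tilt
    W = np.array([[0.5, 0.5], [0.5, 0.5]])       # fully connected two-node network
    x0 = np.array([[3.0], [-3.0]])
    print(distributed_annealed_gd(grads, W, x0))
```

The slowly decaying noise scale plays the role of an annealing temperature: early on the injected noise is large enough to let iterates escape local minima, while its decay allows the network to eventually settle near global minimizers.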


Related research

07/20/2019 · Distributed Global Optimization by Annealing
The paper considers a distributed algorithm for global minimization of a...

09/29/2022 · On Quantum Speedups for Nonconvex Optimization via Quantum Tunneling Walks
Classical algorithms are often not effective for solving nonconvex optim...

07/04/2020 · Accelerating Nonconvex Learning via Replica Exchange Langevin Diffusion
Langevin diffusion is a powerful method for nonconvex optimization, whic...

01/04/2007 · Statistical tools to assess the reliability of self-organizing maps
Results of neural network learning are always subject to some variabilit...

03/18/2019 · Annealing for Distributed Global Optimization
The paper proves convergence to global optima for a class of distributed...

06/16/2023 · Practical Sharpness-Aware Minimization Cannot Converge All the Way to Optima
Sharpness-Aware Minimization (SAM) is an optimizer that takes a descent ...

05/13/2018 · The Global Optimization Geometry of Shallow Linear Neural Networks
We examine the squared error loss landscape of shallow linear neural net...
