Derivative-Free Global Optimization Algorithms: Bayesian Method and Lipschitzian Approaches

04/19/2019
by Jiawei Zhang, et al.

In this paper, we provide an introduction to derivative-free optimization algorithms that can potentially be applied to train deep learning models. Existing deep learning model training is mostly based on the back-propagation algorithm, which updates the model variables layer by layer with the gradient descent algorithm or its variants. However, the objective functions of deep learning models are usually non-convex, and gradient descent algorithms based on the first-order derivative can easily get stuck in local optima. To address this problem, various local and global optimization algorithms have been proposed, which can greatly improve the training of deep learning models. Representative examples include the Bayesian methods, the Shubert-Piyavskii algorithm, DIRECT, LIPO, MCS, GA, SCE, DE, PSO, ES, CMA-ES, hill climbing, and simulated annealing. One group of these algorithms is introduced in this paper (the Bayesian method and the Lipschitzian approaches, i.e., the Shubert-Piyavskii algorithm, DIRECT, LIPO, and MCS); the remaining algorithms (the population-based optimization algorithms, e.g., GA, SCE, DE, PSO, ES, and CMA-ES, and the random search algorithms, e.g., hill climbing and simulated annealing) are introduced in detail in the follow-up paper [18].
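
As a concrete illustration of the Lipschitzian approaches covered in the paper, the following minimal Python sketch implements the LIPO acceptance rule for maximizing a function with a known Lipschitz constant k on an interval: a candidate point is evaluated only if the Lipschitz upper bound implied by past evaluations could still beat the best value seen so far. The function name lipo_maximize, the test function, and all parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def lipo_maximize(f, lo, hi, k, n_evals=60, seed=0):
    # Minimal LIPO sketch: maximize a k-Lipschitz f on [lo, hi].
    # k (the Lipschitz constant) is assumed to be known in advance.
    rng = np.random.default_rng(seed)
    xs = [rng.uniform(lo, hi)]       # first evaluation: uniform at random
    ys = [f(xs[0])]
    draws = 0
    while len(xs) < n_evals and draws < 100 * n_evals:
        draws += 1
        x = rng.uniform(lo, hi)      # candidate point
        # Lipschitz upper bound on f(x) implied by all evaluations so far
        ub = min(y + k * abs(x - xi) for xi, y in zip(xs, ys))
        # evaluate only if the candidate could still beat the current best
        if ub >= max(ys):
            xs.append(x)
            ys.append(f(x))
    best = int(np.argmax(ys))
    return xs[best], ys[best]

# A non-convex test function with several local maxima; gradient ascent
# from a poor start would stall in one of them, while LIPO keeps probing
# the whole interval. |f'(x)| <= 8, so k = 8 is a valid Lipschitz bound.
f = lambda x: np.sin(3 * x) + np.cos(5 * x)
x_best, y_best = lipo_maximize(f, 0.0, 2 * np.pi, k=8.0)
print(f"best x = {x_best:.3f}, f(x) = {y_best:.3f}")

The choice of k governs the exploration-exploitation trade-off: a larger k makes the upper bound looser, so more candidates are accepted and the search behaves closer to pure random search.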


Related research

04/19/2019 · Derivative-Free Global Optimization Algorithms: Population based Methods and Random Search Approaches
In this paper, we will provide an introduction to the derivative-free op...

03/11/2019 · Gradient Descent based Optimization Algorithms for Deep Learning Models Training
In this paper, we aim at providing an introduction to the gradient desce...

07/15/2021 · SA-GD: Improved Gradient Descent Learning Strategy with Simulated Annealing
Gradient descent algorithm is the most utilized method when optimizing m...

07/22/2014 · Global optimization using Lévy flights
This paper studies a class of enhanced diffusion processes in which rand...

02/08/2023 · Adaptive State-Dependent Diffusion for Derivative-Free Optimization
This paper develops and analyzes a stochastic derivative-free optimizati...

08/16/2018 · Experiential Robot Learning with Accelerated Neuroevolution
Derivative-based optimization techniques such as Stochastic Gradient Des...
