A Comparison of Optimization Algorithms for Deep Learning

07/28/2020
by Derya Soydaner et al.

In recent years, we have witnessed the rise of deep learning. Deep neural networks have proven successful in many areas. However, optimizing these networks has become more difficult as they grow deeper and datasets become larger. Therefore, more advanced optimization algorithms have been proposed over the years. In this study, widely used optimization algorithms for deep learning are examined in detail. To this end, these algorithms, known as adaptive gradient methods, are implemented for both supervised and unsupervised tasks. The behaviour of the algorithms during training and their results on four image datasets, namely MNIST, CIFAR-10, Kaggle Flowers and Labeled Faces in the Wild, are compared, highlighting their differences from basic optimization algorithms.
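
As an illustration only (not taken from the paper): a minimal sketch of the kind of comparison the abstract describes, training the same small network with a basic optimizer (SGD) and an adaptive gradient method (Adam) in PyTorch. The MLP architecture, learning rates and the synthetic MNIST-shaped data are assumptions made for this example, not the study's actual experimental setup.

```python
# Minimal sketch (assumed setup, not the paper's): compare a basic optimizer
# (SGD) against an adaptive gradient method (Adam) on the same small MLP.
# Synthetic data with MNIST-like dimensions (784 inputs, 10 classes) stands
# in for the real image datasets used in the study.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(1024, 784)          # stand-in for flattened 28x28 images
y = torch.randint(0, 10, (1024,))   # stand-in for class labels


def make_model():
    return nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))


def train(optimizer_name, epochs=5):
    model = make_model()
    if optimizer_name == "sgd":
        opt = torch.optim.SGD(model.parameters(), lr=0.01)    # basic method
    else:
        opt = torch.optim.Adam(model.parameters(), lr=0.001)  # adaptive method
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
        print(f"{optimizer_name:>4} epoch {epoch}: loss {loss.item():.4f}")


# Running both reveals how the training loss evolves under each optimizer.
for name in ("sgd", "adam"):
    train(name)
```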

Related research

AdaFamily: A family of Adam-like adaptive gradient methods (03/03/2022)
We propose AdaFamily, a novel method for training deep neural networks. ...

Optimizing Neural Network Weights using Nature-Inspired Algorithms (05/20/2021)
This study aims to optimize Deep Feedforward Neural Networks (DFNNs) tra...

Optimizing Deep Neural Networks with Multiple Search Neuroevolution (01/17/2019)
This paper presents an evolutionary metaheuristic called Multiple Search...

An Elementary Approach to Convergence Guarantees of Optimization Algorithms for Deep Networks (02/20/2020)
We present an approach to obtain convergence guarantees of optimization ...

In Search of Probeable Generalization Measures (10/23/2021)
Understanding the generalization behaviour of deep neural networks is a ...

Hyper-Parameter Optimization: A Review of Algorithms and Applications (03/12/2020)
Since deep neural networks were developed, they have made huge contribut...

The Implicit Bias for Adaptive Optimization Algorithms on Homogeneous Neural Networks (12/11/2020)
Despite their overwhelming capacity to overfit, deep neural networks tra...
