Effectiveness of Optimization Algorithms in Deep Image Classification

10/04/2021
by Zhaoyang Zhu, et al.

Adam is widely used to train neural networks, and many Adam variants with different features have been proposed. Recently, two new Adam-type optimizers, AdaBelief and Padam, were introduced to the community. We analyze these two optimizers and compare them with conventional baselines (Adam and SGD with momentum) in the setting of image classification. We evaluate the performance of these optimization algorithms on AlexNet and on simplified versions of VGGNet and ResNet using the EMNIST dataset. (The benchmark code is available at https://github.com/chuiyunjun/projectCSC413.)
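The comparison hinges on how AdaBelief and Padam modify Adam's second-moment estimate. Below is a minimal sketch (not the authors' benchmark code) of the two update rules in PyTorch; the function names, hyperparameter defaults, and the omission of bias correction are illustrative simplifications. AdaBelief tracks the variance of the deviation of the gradient from its exponential moving average (its "belief" in the gradient), while Padam raises an AMSGrad-style second moment to a partial power p in (0, 1/2].

```python
import torch

# "state" holds zero-initialized buffers, e.g.
# {"m": torch.zeros_like(param), "s": torch.zeros_like(param), ...}.

def adabelief_step(param, grad, state, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
    # AdaBelief: the second moment tracks (grad - m)^2, the deviation of the
    # gradient from its EMA, instead of grad^2 as in Adam.
    m, s = state["m"], state["s"]
    b1, b2 = betas
    m.mul_(b1).add_(grad, alpha=1 - b1)            # first moment, same as Adam
    diff = grad - m
    s.mul_(b2).addcmul_(diff, diff, value=1 - b2)  # "belief" in the observed gradient
    param.add_(-lr * m / (s.sqrt() + eps))         # bias correction omitted for brevity

def padam_step(param, grad, state, lr=0.1, betas=(0.9, 0.999),
               partial=0.125, eps=1e-8):
    # Padam: AMSGrad-style max-tracked second moment raised to a partial power;
    # partial = 1/2 recovers AMSGrad, and partial -> 0 approaches SGD + momentum.
    m, v, v_max = state["m"], state["v"], state["v_max"]
    b1, b2 = betas
    m.mul_(b1).add_(grad, alpha=1 - b1)
    v.mul_(b2).addcmul_(grad, grad, value=1 - b2)
    torch.maximum(v_max, v, out=v_max)             # keep second moment non-decreasing
    param.add_(-lr * m / ((v_max + eps) ** partial))
```

In practice, a ready-made AdaBelief implementation is available through the third-party adabelief-pytorch package, while Padam is usually run from its authors' reference code; the repository linked above presumably wraps such implementations behind a common training loop.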


