AdaFamily: A family of Adam-like adaptive gradient methods

03/03/2022
by Hannes Fassold, et al.

We propose AdaFamily, a novel method for training deep neural networks. It is a family of adaptive gradient methods that can be interpreted as a blend of the optimization algorithms Adam, AdaBelief, and AdaMomentum. Experiments on standard image classification datasets demonstrate that our proposed method outperforms these three algorithms.
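The abstract does not spell out the update rule, but the three named optimizers differ only in the term whose square drives the second-moment estimate: Adam squares the raw gradient, AdaBelief squares the gradient minus its momentum, and AdaMomentum squares the momentum itself. The sketch below shows one plausible way to parametrize such a blend with a single interpolation parameter; the function name `adafamily_step`, the parameter `lam`, and the specific blending formula are illustrative assumptions, not the paper's actual notation.

```python
import numpy as np

def adafamily_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                   lam=0.5, eps=1e-8):
    """One step of a hypothetical blended adaptive update (illustrative sketch).

    `lam` interpolates the second-moment term between Adam-like (lam=0),
    AdaBelief-like up to a constant factor (lam=0.5), and
    AdaMomentum-like (lam=1) behavior. This is an assumed parametrization,
    not the update rule from the paper.
    """
    # First moment: exponential moving average of gradients, as in Adam.
    m = beta1 * m + (1 - beta1) * grad
    # Blended residual whose square feeds the second moment:
    #   lam = 0   -> s = g          (Adam squares the gradient)
    #   lam = 0.5 -> s = (g - m)/2  (AdaBelief-like, up to scale)
    #   lam = 1   -> s = -m         (AdaMomentum squares the momentum)
    s = (1 - lam) * grad - lam * m
    v = beta2 * v + (1 - beta2) * s * s
    # Bias correction and parameter update, as in Adam.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Usage: minimize f(x) = (x - 3)^2 with the sketched update.
x = np.array([0.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 201):
    grad = 2 * (x - 3.0)
    x, m, v = adafamily_step(x, grad, m, v, t, lr=0.1, lam=0.5)
print(x)  # approaches 3.0
```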
