Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training

02/14/2023
by S. Gratton, et al.

A class of multi-level algorithms for unconstrained nonlinear optimization is presented which does not require the evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. The choice of avoiding the evaluation of the objective function is intended to make the algorithms of the class less sensitive to noise, while the multi-level feature aims at reducing their computational cost. The evaluation complexity of these algorithms is analyzed and their behaviour in the presence of noise is then illustrated in the context of training deep neural networks for supervised learning applications.
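The abstract notes that momentum-less AdaGrad is a particular (single-level) instance of the class: an objective-function-free method, since it uses only gradient evaluations and never the objective value itself. The sketch below illustrates that single-level instance; the test objective f(x) = ||x||^2, the step size eta, and the iteration budget are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of momentum-less AdaGrad, the single-level instance
# mentioned in the abstract. Assumptions: objective f(x) = ||x||^2,
# eta = 0.5, 200 iterations (all illustrative, not from the paper).
import numpy as np

def adagrad(grad, x0, eta=0.5, eps=1e-8, iters=200):
    """Per-coordinate gradient steps scaled by the square root of the
    accumulated squared gradients. The objective value is never
    evaluated -- only its gradient -- which is what makes the method
    objective-function-free."""
    x = np.asarray(x0, dtype=float)
    g_accum = np.zeros_like(x)
    for _ in range(iters):
        g = grad(x)
        g_accum += g ** 2                          # accumulate squared gradients
        x -= eta * g / (np.sqrt(g_accum) + eps)    # scaled coordinate-wise step
    return x

# Usage: minimize the convex quadratic f(x) = ||x||^2, whose gradient is 2x.
x_star = adagrad(lambda x: 2.0 * x, x0=[3.0, -2.0])
```

In a multi-level variant, the expensive fine-level gradient step would be replaced, when beneficial, by cheaper steps computed on a coarser model; the single-level loop above is the base case of that hierarchy.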

Related research

- 03/03/2022 · Parametric complexity analysis for a class of first-order Adagrad-like algorithms
- 05/23/2023 · A Block-Coordinate Approach of Multi-level Optimization with an Application to Physics-Informed Neural Networks
- 08/17/2016 · Mollifying Networks
- 05/19/2021 · Trilevel and Multilevel Optimization using Monotone Operator Theory
- 02/19/2012 · Classification by Ensembles of Neural Networks
- 04/06/2021 · The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case
- 01/17/2023 · Scaling Deep Networks with the Mesh Adaptive Direct Search algorithm
