A Novel Gradient Methodology with Economical Objective Function Evaluations for Data Science Applications

09/19/2023
by   Christian Varner, et al.

Gradient methods are seeing rapid methodological and theoretical development, driven by the challenges of optimization problems arising in data science. For data science applications in which objective function evaluations are expensive but gradient evaluations are inexpensive, gradient methods that never evaluate the objective function are being rejuvenated or actively developed. However, as we show, such gradient methods are all susceptible to catastrophic divergence under realistic conditions for data science applications. In light of this, gradient methods that do make use of objective function evaluations become more appealing, yet, as we also show, they can incur an exponential increase in objective evaluations between accepted iterates. As a result, existing gradient methods are poorly suited to the needs of optimization problems arising from data science. In this work, we address this gap by developing a generic methodology that uses objective function evaluations economically and in a problem-driven manner, preventing catastrophic divergence while avoiding an explosion in objective evaluations between accepted iterates. Our methodology admits specific procedures built from particular step size selection or search direction strategies, and we develop a novel step size selection methodology that is well-suited to data science applications. We show that a procedure resulting from our methodology is highly competitive with standard optimization methods on CUTEst test problems. We then show that a procedure resulting from our methodology is highly favorable relative to standard optimization methods on optimization problems arising in our target data science applications. Thus, we provide a novel gradient methodology that is better suited to optimization problems arising in data science.
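The abstract does not spell out the procedure itself, but the trade-off it describes, gradient steps that are cheap versus objective evaluations that are expensive and only occasionally worth paying for, can be illustrated with a minimal sketch. The Python code below is not the authors' method; it is a hypothetical "guarded" gradient descent in which the objective is evaluated only when a cheap trigger fires, and a divergence guard rejects the accumulated drift and shrinks the step if the objective has increased. All names and thresholds here (guarded_gradient_descent, trigger_dist, shrink) are assumptions introduced purely for illustration.

    # Illustrative sketch only (not the authors' procedure): gradient descent
    # that spends objective evaluations sparingly. The objective f is checked
    # only when the iterate has drifted far from the last trusted point, and a
    # step sequence is rejected if the objective went up. Hypothetical names.
    import numpy as np

    def guarded_gradient_descent(f, grad, x0, step=1e-1, max_iter=1000,
                                 trigger_dist=1.0, shrink=0.5, tol=1e-8):
        """Gradient descent with occasional objective checks.

        f            : objective function (expensive to evaluate)
        grad         : gradient of f (cheap to evaluate)
        trigger_dist : distance the iterate may travel before f is re-checked
        """
        x = np.asarray(x0, dtype=float)
        f_checked = f(x)            # one evaluation to anchor the guard
        x_checked = x.copy()
        n_f_evals = 1

        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) <= tol:
                break
            x_new = x - step * g

            # Cheap trigger: only pay for f once the iterate has drifted far
            # from the last point whose objective value we trust.
            if np.linalg.norm(x_new - x_checked) >= trigger_dist:
                f_new = f(x_new)
                n_f_evals += 1
                if f_new > f_checked:
                    # Divergence guard: discard the drift, shrink the step.
                    x_new = x_checked.copy()
                    step *= shrink
                else:
                    f_checked, x_checked = f_new, x_new.copy()
            x = x_new

        return x, n_f_evals

    if __name__ == "__main__":
        # Example on a simple quadratic: most iterations cost only a gradient.
        quad = lambda x: 0.5 * float(np.dot(x, x))
        x_star, n_evals = guarded_gradient_descent(quad, lambda x: x,
                                                   x0=np.ones(10))
        print(f"gradient norm {np.linalg.norm(x_star):.2e}, "
              f"{n_evals} objective evaluations")

Counting n_f_evals makes the economy explicit: on a smooth problem most iterations cost only a gradient, while the occasional objective check is what rules out the catastrophic divergence that purely objective-free gradient methods can suffer.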

