Globally optimal solutions to a class of fractional optimization problems based on proximity gradient algorithm

06/20/2023
by Yizun Lin et al.

We establish globally optimal solutions to a class of fractional optimization problems over a class of constraint sets with the following key characteristics: 1) the numerator and the denominator of the objective function are both convex, semi-algebraic, Lipschitz continuous, and differentiable with Lipschitz continuous gradients on the constraint set; 2) the constraint set is closed, convex, and semi-algebraic. Compared with Dinkelbach's approach, our method is novel in the following respects: 1) Dinkelbach's approach must solve a concave maximization problem in each iteration, whose solution is nontrivial to obtain, whereas ours performs only one proximity gradient operation per iteration; 2) Dinkelbach's approach requires at least one point at which the numerator is nonnegative in order to proceed, whereas ours does not, so it applies to a much wider class of situations; 3) Dinkelbach's approach requires a closed and bounded constraint set, whereas ours needs only closedness, not boundedness. Our approach is therefore viable for many more practical models, such as maximizing the Sharpe ratio (SR) or the information ratio in mathematical finance. Numerical experiments show that our approach attains the ground-truth solutions in two simple examples, and on real-world financial data it outperforms several existing approaches for SR maximization.
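The abstract does not give the update rule, but its description (one proximity gradient operation per iteration on a ratio of smooth convex functions over a closed convex set) is consistent with a projected-gradient step on f - c_k g, where c_k = f(x_k)/g(x_k) is refreshed at every iteration. Below is a minimal Python sketch of that scheme applied to SR maximization, written as minimizing sqrt(w' Sigma w) / (mu' w - rf) over the simplex; the function names, the fixed step size, the constraint set, and the synthetic data are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto the probability simplex {w : w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def fractional_prox_grad(f, grad_f, g, grad_g, x0, step, n_iter=1000):
    """Illustrative scheme: one projected-gradient step per iteration on
    f - c_k * g, with c_k = f(x_k) / g(x_k); assumes g stays positive."""
    x = x0
    for _ in range(n_iter):
        c = f(x) / g(x)
        x = project_simplex(x - step * (grad_f(x) - c * grad_g(x)))
    return x

# Toy Sharpe-ratio instance: minimizing sqrt(w' Sigma w) / (mu' w - rf)
# over the simplex is equivalent to maximizing the Sharpe ratio.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
Sigma = A @ A.T / n + 1e-3 * np.eye(n)  # synthetic covariance matrix
mu = 0.05 + 0.02 * rng.random(n)        # synthetic mean returns
rf = 0.01                               # risk-free rate

f = lambda w: np.sqrt(w @ Sigma @ w)
grad_f = lambda w: Sigma @ w / np.sqrt(w @ Sigma @ w)
g = lambda w: mu @ w - rf
grad_g = lambda w: mu

w = fractional_prox_grad(f, grad_f, g, grad_g, np.ones(n) / n, step=0.1)
print("weights:", np.round(w, 4), " Sharpe:", g(w) / f(w))
```

For a general closed convex constraint set, the corresponding projection would replace project_simplex. The semi-algebraic assumptions stated in the abstract are what underpin convergence to a global minimizer; this sketch does not verify them.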


Related research

- Stochastic Conditional Gradient Methods: From Convex Minimization to Submodular Maximization (04/24/2018): This paper considers stochastic optimization problems for a large class ...
- BISTA: a Bregmanian proximal gradient method without the global Lipschitz continuity assumption (04/19/2018): The problem of minimization of a separable convex objective function has...
- Complexity-Optimal and Curvature-Free First-Order Methods for Finding Stationary Points of Composite Optimization Problems (05/25/2022): This paper develops and analyzes an accelerated proximal descent method ...
- Contractivity of Runge-Kutta methods for convex gradient systems (09/22/2019): We consider the application of Runge-Kutta (RK) methods to gradient syst...
- Gradient-Free Methods for Saddle-Point Problem (05/12/2020): In the paper, we generalize the approach Gasnikov et. al, 2017, which al...
- Equivalent Lipschitz surrogates for zero-norm and rank optimization problems (04/30/2018): This paper proposes a mechanism to produce equivalent Lipschitz surrogat...
- Parallel Quasi-concave set optimization: A new frontier that scales without needing submodularity (08/19/2021): Classes of set functions along with a choice of ground set are a bedrock...
