
A Modified Nonlinear Conjugate Gradient Algorithm for Functions with Non-Lipschitz Gradient

by Bingjie Li, et al.
National University of Singapore
Zhejiang University

In this paper, we propose a modified nonlinear conjugate gradient (NCG) method for functions with a non-Lipschitz continuous gradient. First, we present a new formula for the conjugate coefficient β_k in NCG, yielding a search direction that provides an adequate decrease in the function value. We show that the resulting NCG algorithm is strongly convergent for continuously differentiable functions whose gradient need not be Lipschitz continuous. Second, we present a simple interpolation approach that automatically achieves shrinkage, generating a step length satisfying the standard Wolfe conditions at each step. Our framework considerably broadens the applicability of NCG while preserving the superior numerical performance of PRP-type methods.
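The abstract does not give the paper's modified formula for β_k or its interpolation-based line search, so the sketch below illustrates only the general PRP-type NCG template it builds on: the classical non-negative PRP+ coefficient stands in for the modified β_k, and a simple backtracking Armijo line search stands in for the Wolfe-condition interpolation scheme. All names and parameter values here are illustrative assumptions, not the paper's method.

```python
import numpy as np

def prp_ncg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of a PRP-type nonlinear conjugate gradient method.

    Assumptions: the classical PRP+ coefficient replaces the paper's
    modified beta_k, and backtracking with an Armijo decrease test
    replaces the Wolfe-condition interpolation described above.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:            # safeguard: restart along steepest descent
            d = -g
        alpha, c1 = 1.0, 1e-4     # backtrack until sufficient decrease holds
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ conjugate coefficient (truncated at zero)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize a convex quadratic with minimizer (3, -1)
f = lambda x: (x[0] - 3.0) ** 2 + 2.0 * (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 4.0 * (x[1] + 1.0)])
x_star = prp_ncg(f, grad, np.array([0.0, 0.0]))
```

The restart safeguard (resetting d to the steepest-descent direction whenever d is not a descent direction) is what keeps the backtracking loop well-defined; the paper's contribution is, in effect, a choice of β_k that makes such guarantees hold without a Lipschitz-continuous gradient.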



