BISTA: a Bregmanian proximal gradient method without the global Lipschitz continuity assumption

04/19/2018
by Daniel Reem, et al.

The problem of minimizing a separable convex objective function arises in many theoretical and real-world applications. One of the popular methods for solving this problem is the proximal gradient method (proximal forward-backward algorithm). A very common assumption in the use of this method is that the gradient of the smooth term in the objective function is globally Lipschitz continuous. However, this assumption is not always satisfied in practice, which limits the applicability of the method. In this paper we discuss, in a wide class of finite- and infinite-dimensional spaces, a new variant (BISTA) of the proximal gradient method that does not impose the above-mentioned global Lipschitz continuity assumption. A key ingredient of the method is that the iterative steps depend on a certain decomposition of the objective set into subsets. Moreover, a Bregman divergence is used in the proximal forward-backward operation. Under certain practical conditions, a non-asymptotic rate of convergence in the function values is established, as well as the weak convergence of the whole sequence to a minimizer. We also obtain a few auxiliary results of independent interest, among them a general and useful stability principle which, roughly speaking, says that given a uniformly continuous function defined on an arbitrary metric space, if we slightly change the set over which the optimal (extreme) values are computed, then these values vary only slightly. This principle suggests a general scheme for tackling a wide class of non-convex and non-smooth optimization problems.
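To make the forward-backward step concrete, here is a minimal, hypothetical Python sketch (not the authors' BISTA code). It implements the generic Bregman proximal gradient update x_{k+1} = argmin_x { g(x) + <grad f(x_k), x - x_k> + (1/t) D_h(x, x_k) }, specialized to the kernel h(x) = ||x||^2/2, for which D_h is the squared Euclidean distance and, with g = lam*||.||_1, the update reduces to the classical ISTA soft-thresholding step. The step size, problem data, and helper names below are assumptions of the sketch, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of one Bregman proximal gradient step
#   x_{k+1} = argmin_x  g(x) + <grad_f(x_k), x - x_k> + (1/t) * D_h(x, x_k),
# specialized to the kernel h(x) = ||x||^2 / 2, so that D_h is the squared
# Euclidean distance. With g(x) = lam * ||x||_1 the step has the closed
# form below (classical ISTA soft thresholding).

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bregman_prox_grad_step(x, grad_f, t, lam):
    # Forward (gradient) step followed by backward (proximal) step.
    return soft_threshold(x - t * grad_f(x), t * lam)

# Toy usage: minimize f(x) + g(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
t = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L for this quadratic f
x = np.zeros(5)
for _ in range(200):
    x = bregman_prox_grad_step(x, grad_f, t, lam)
print(x)
```

Note that this toy f happens to have a globally Lipschitz gradient; the point of the paper's method is to handle such steps when no global constant L exists, for instance by adapting the step size to the current subset of the decomposition of the objective set, and a non-Euclidean kernel h would replace the squared distance in the inner minimization by D_h.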


Related research

07/07/2017 · Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms
We propose a unifying algorithm for non-smooth non-convex optimization. ...

12/16/2019 · Leveraging Two Reference Functions in Block Bregman Proximal Gradient Descent for Non-convex and Non-Lipschitz Problems
In the applications of signal processing and data analytics, there is a ...

06/20/2023 · Globally optimal solutions to a class of fractional optimization problems based on proximity gradient algorithm
We establish globally optimal solutions to a class of fractional optimiz...

04/22/2019 · Provable Bregman-divergence based Methods for Nonconvex and Non-Lipschitz Problems
The (global) Lipschitz smoothness condition is crucial in establishing t...

05/29/2020 · Long term dynamics of the subgradient method for Lipschitz path differentiable functions
We consider the long-term dynamics of the vanishing stepsize subgradient...

10/08/2019 · Bregman Proximal Framework for Deep Linear Neural Networks
A typical assumption for the analysis of first order optimization method...

09/17/2018 · Projective Splitting with Forward Steps only Requires Continuity
A recent innovation in projective splitting algorithms for monotone oper...
