Gradient Flows and Accelerated Proximal Splitting Methods

08/02/2019
by Guilherme França, et al.

Proximal-based methods are well suited to nonsmooth optimization problems, with important applications in signal processing, control theory, statistics, and machine learning. There are essentially four basic types of proximal algorithms currently known: forward-backward splitting, forward-backward-forward (Tseng) splitting, Douglas-Rachford splitting, and the recent Davis-Yin three-operator splitting; the alternating direction method of multipliers (ADMM) is also closely related. In this paper, we show that all of these methods can be derived from the gradient flow by applying splitting methods for ordinary differential equations. Furthermore, applying a similar discretization scheme to a particular second-order differential equation yields accelerated variants of the respective algorithms, of either Nesterov or heavy-ball type; we treat both cases simultaneously. Many of the optimization algorithms we derive are new. For instance, we propose accelerated variants of Davis-Yin, as well as two extensions of ADMM together with their accelerated variants. Interestingly, we show that (accelerated) ADMM corresponds to a rebalanced splitting, a recently introduced technique designed to preserve the steady states of the differential equation. Overall, our results strengthen the connections between optimization and continuous dynamical systems and offer a more unified perspective on accelerated methods.
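To make the gradient-flow viewpoint concrete, the sketch below shows forward-backward splitting read as an operator-splitting discretization of the gradient flow x'(t) = -∇f(x) - ∂g(x) for a composite objective f + g: the smooth term f is stepped explicitly (forward Euler) while the nonsmooth term g is stepped implicitly through its proximal operator (backward Euler), with an optional Nesterov-type momentum step standing in for the accelerated variants discussed above. This is a minimal illustration, not the paper's exact scheme; the lasso test problem, the function names, and the momentum coefficient k/(k+3) are all assumptions made here for the example.

```python
import numpy as np

# Minimal sketch: forward-backward splitting as an operator-splitting
# discretization of the gradient flow x'(t) = -grad f(x) - dg(x),
# minimizing F(x) = f(x) + g(x) with f smooth and g nonsmooth.
# Illustrated on a hypothetical lasso problem:
#   f(x) = 0.5*||Ax - b||^2,  g(x) = lam*||x||_1.

def soft_threshold(v, tau):
    # Proximal operator of tau*||.||_1: the implicit (backward) step for g.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward(A, b, lam, step, iters=500, accelerated=False):
    x = np.zeros(A.shape[1])
    y = x.copy()  # extrapolated point used by the accelerated variant
    for k in range(iters):
        grad = A.T @ (A @ y - b)                              # forward step on f
        x_next = soft_threshold(y - step * grad, step * lam)  # backward step on g
        if accelerated:
            # Nesterov-type momentum (a heavy-ball variant would instead
            # add the momentum term inside the gradient step).
            y = x_next + (k / (k + 3.0)) * (x_next - x)
        else:
            y = x_next
        x = x_next
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2  # step <= 1/L with L = ||A^T A||_2
x_hat = forward_backward(A, b, lam=0.1, step=step, accelerated=True)
```

With accelerated=False the loop is plain proximal gradient descent (ISTA); setting it to True gives a FISTA-like iteration, loosely mirroring how the accelerated variants arise from discretizing a second-order equation rather than the first-order flow.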

Related research

10/02/2020 · Distributed Proximal Splitting Algorithms with Rates and Acceleration
We analyze several generic proximal splitting algorithms well suited for...

08/13/2018 · Relax, and Accelerate: A Continuous Perspective on ADMM
The acceleration technique first introduced by Nesterov for gradient des...

10/15/2021 · Halpern-Type Accelerated and Splitting Algorithms For Monotone Inclusions
In this paper, we develop a new type of accelerated algorithms to solve ...

08/23/2019 · Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
Many large-scale and distributed optimization problems can be brought in...

06/10/2020 · Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators
Proximal operations are among the most common primitives appearing in bo...

11/13/2019 · Superiorization vs. Accelerated Convex Optimization: The Superiorized/Regularized Least-Squares Case
In this paper we conduct a study of both superiorization and optimizatio...

06/29/2023 · A Low-Power Hardware-Friendly Optimisation Algorithm With Absolute Numerical Stability and Convergence Guarantees
We propose Dual-Feedback Generalized Proximal Gradient Descent (DFGPGD) ...
