Optimization of Graph Total Variation via Active-Set-based Combinatorial Reconditioning

02/27/2020
by Zhenzhang Ye, et al.

Structured convex optimization on weighted graphs finds numerous applications in machine learning and computer vision. In this work, we propose a novel adaptive preconditioning strategy for proximal algorithms on this problem class. Our preconditioner is driven by a sharp analysis of the local linear convergence rate, which depends on the "active set" at the current iterate. We show that a nested-forest decomposition of the inactive edges yields a guaranteed local linear convergence rate. Furthermore, we propose a practical greedy heuristic that realizes such nested decompositions, and we show in several numerical experiments that our reconditioning strategy, when applied to the proximal gradient or primal-dual hybrid gradient algorithm, achieves competitive performance. Our results suggest that local convergence analysis can serve as a guideline for selecting variable metrics in proximal algorithms.
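As a point of reference for the problem class, the sketch below sets up weighted graph total variation denoising and runs a diagonally preconditioned primal-dual hybrid gradient (PDHG) iteration, then reads off which edges are inactive at the final iterate. It is an illustrative stand-in only: the function name graph_tv_pdhg, the quadratic data term, the diagonal step sizes, and the inactive-edge criterion are assumptions made for this example; the paper's combinatorial reconditioning (rebuilding the metric from nested-forest decompositions of the inactive edges) is not reproduced here.

```python
# Minimal sketch (assumed model): diagonally preconditioned PDHG for
# graph total variation denoising,
#   min_x 0.5*||x - b||^2 + lam * sum_e w_e |x_i - x_j|.
# The paper's method would replace the simple diagonal metric below with a
# combinatorial preconditioner rebuilt from the current active set.
import numpy as np

def graph_tv_pdhg(edges, w, b, lam=1.0, iters=500):
    """edges: (m, 2) int array of endpoints, w: (m,) positive edge weights,
    b: (n,) observed node values."""
    n, m = len(b), len(edges)
    i_idx, j_idx = edges[:, 0], edges[:, 1]

    # Graph difference operator D and its transpose: (Dx)_e = x_i - x_j.
    def D(x):
        return x[i_idx] - x[j_idx]

    def Dt(y):
        out = np.zeros(n)
        np.add.at(out, i_idx, y)
        np.add.at(out, j_idx, -y)
        return out

    # Diagonal step sizes in the spirit of Pock & Chambolle (2011):
    # tau_i = 1/deg(i) (column sums of |D|), sigma_e = 1/2 (row sums of |D|).
    deg = np.zeros(n)
    np.add.at(deg, i_idx, 1.0)
    np.add.at(deg, j_idx, 1.0)
    tau = 1.0 / np.maximum(deg, 1.0)
    sigma = np.full(m, 0.5)

    x = b.copy()
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(iters):
        # Dual step: prox of the conjugate of the weighted l1 term is a
        # projection onto the box [-lam*w_e, lam*w_e].
        y = np.clip(y + sigma * D(x_bar), -lam * w, lam * w)
        # Primal step: closed-form prox of the quadratic data term.
        x_new = (x - tau * Dt(y) + tau * b) / (1.0 + tau)
        x_bar = 2.0 * x_new - x
        x = x_new

    # One common proxy for "inactive" edges at the current iterate: dual
    # variables strictly inside the box (terminology assumed for
    # illustration, not taken verbatim from the paper).
    inactive = np.abs(y) < lam * w - 1e-9
    return x, inactive

if __name__ == "__main__":
    # Tiny chain graph: 5 nodes, noisy step signal.
    edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4]])
    w = np.ones(4)
    b = np.array([0.1, -0.05, 0.02, 1.1, 0.95])
    x, inactive = graph_tv_pdhg(edges, w, b, lam=0.3)
    print(x, inactive)
```

In such a sketch, the set of inactive edges returned at the end is the kind of combinatorial information the paper's reconditioning strategy would exploit when choosing the variable metric for subsequent iterations.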



