An Optimization Framework with Flexible Inexact Inner Iterations for Nonconvex and Nonsmooth Programming

02/28/2017
by Yiyang Wang, et al.

In recent years, numerous vision and learning tasks have been (re)formulated as nonconvex and nonsmooth programs (NNPs). Although algorithms have been proposed for particular problems, designing fast and flexible optimization schemes with theoretical guarantees for general NNPs remains challenging. Practical experience shows that performing inexact inner iterations often benefits specific applications on a case-by-case basis, but the convergence behavior of such schemes is still unclear. Motivated by these observations, this paper designs a novel algorithmic framework, named the inexact proximal alternating direction method (IPAD), for solving general NNPs. We demonstrate that any numerical algorithm can be incorporated into IPAD to solve the subproblems, and that the convergence of the resulting hybrid schemes is consistently guaranteed by a series of simple error conditions. Beyond these theoretical guarantees, numerical experiments on both synthetic and real-world data further demonstrate the superiority and flexibility of the IPAD framework for practical use.
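The core idea described above — alternate over blocks of variables, solve each proximal subproblem only approximately, and stop the inner solver once a simple residual-based error condition holds — can be sketched as follows. This is an illustrative toy instance (a least-squares coupling with a nonconvex log-penalty), not the paper's exact IPAD algorithm; the function name, objective, and parameter choices here are assumptions made for the sketch.

```python
import numpy as np

def inexact_prox_alternating(A, b, lam=0.1, mu=1.0, iters=50,
                             inner_max=100, c=0.5, seed=0):
    """Sketch of inexact proximal alternating minimization for
        F(x, y) = 0.5||Ax - y||^2 + lam * sum(log(1 + x_i^2)) + 0.5||y - b||^2,
    a nonconvex, smooth toy problem (assumed for illustration).
    The x-subproblem is solved INEXACTLY by gradient steps, stopping once
    a simple error condition on the gradient residual is met."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = rng.standard_normal(n)
    y = A @ x

    def grad_x(x, y, xk):
        # Gradient of the proximal x-subproblem:
        #   0.5||Ax - y||^2 + lam*sum(log(1+x^2)) + (mu/2)||x - xk||^2
        return A.T @ (A @ x - y) + lam * 2 * x / (1 + x**2) + mu * (x - xk)

    # Crude Lipschitz bound for the subproblem gradient (step size 1/Lx).
    Lx = np.linalg.norm(A, 2) ** 2 + 2 * lam + mu

    for _ in range(iters):
        xk = x.copy()
        # Inexact inner loop: stop when the gradient residual is
        # dominated by the step length, i.e. ||g|| <= c * ||x - xk||.
        for _ in range(inner_max):
            g = grad_x(x, y, xk)
            if np.linalg.norm(g) <= c * np.linalg.norm(x - xk) + 1e-12:
                break
            x = x - g / Lx
        # The y-subproblem is quadratic, so its proximal update is exact:
        #   argmin_y 0.5||Ax - y||^2 + 0.5||y - b||^2 + (mu/2)||y - yk||^2
        y = (A @ x + b + mu * y) / (2 + mu)
    return x, y
```

The error condition `||g|| <= c * ||x - xk||` plays the role of the paper's "simple error conditions": it lets any inner solver run only as long as needed, while still tying the residual to the outer progress so the overall objective keeps decreasing.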

